May 27, 2024

British Ruling Pins Blame on Social Media for Teenager’s Suicide – The New York Times

The internet, according to the ruling, “affected her mental health in a negative way and contributed to her death in a more than minimal way.”
A photo album of Molly Russell, who died in November 2017. Credit: Jonathan Clifford for The New York Times

Adam Satariano, who covers technology issues from London, has talked to people involved in this case since 2020.
Sitting in the witness box of a small London courtroom this week, a Meta executive faced an uncomfortable question: Did her company contribute to the suicide of a 14-year-old named Molly Russell?
Videos and images of suicide, self-harm and depressive content that the teenager viewed in the months before she died in November 2017 appeared on a screen in the courtroom. The executive was read a post that Molly had liked or saved from Instagram, and heard how it was copied almost verbatim in a note filled with words of self-loathing later found by her parents.
“This is Instagram literally giving Molly ideas,” Oliver Sanders, a lawyer representing the family, said angrily during one moment of the exchange.
Leaning forward in the witness chair, the executive, Elizabeth Lagone, who leads the company’s health and well-being policy, responded: “I can’t speak to what was going on in Molly’s mind.”
The coroner overseeing the case, who in Britain is a judgelike figure with wide authority to investigate and officially determine a person’s cause of death, was far less circumspect. On Friday, he ruled that Instagram and other social media platforms had contributed to her death — perhaps the first time anywhere that internet companies have been legally blamed for a suicide.
“Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content,” said the coroner, Andrew Walker. Rather than officially classify her death a suicide, he said the internet “affected her mental health in a negative way and contributed to her death in a more than minimal way.”
The dispassionate and declarative judgment concluded a legal battle that pitted the Russell family against some of Silicon Valley’s largest companies. Delving into many parents’ worst fears about the influence of the internet and social media on their children, the case reverberated in Britain and beyond. A crowd of television cameras and photographers gathered outside the courtroom when the decision was announced.
Thousands of images, videos and other social media material from Molly’s accounts were revealed during the investigation, one of the largest public releases of its kind. That provided the sort of detail that researchers studying the mental health effects of social media have long complained that platforms like Meta, which owns Facebook and Instagram, withhold on privacy and ethical grounds.
Molly’s social media use included material so upsetting that one courtroom worker stepped out of the room to avoid viewing a series of Instagram videos depicting suicide. A child psychologist who was called as an expert witness said the material was so “disturbing” and “distressing” that it caused him to lose sleep for weeks.
Are you concerned for your teen? If you worry that your teen might be experiencing depression or suicidal thoughts, there are a few things you can do to help. Dr. Christine Moutier, the chief medical officer of the American Foundation for Suicide Prevention, suggests these steps:
Look for changes. Notice shifts in sleeping and eating habits in your teen, as well as any issues he or she might be having at school, such as slipping grades. Watch for angry outbursts, mood swings and a loss of interest in activities they used to love. Stay attuned to their social media posts as well.
Keep the lines of communication open. If you notice something unusual, start a conversation. But your child might not want to talk. In that case, offer to help them find a trusted person to share their struggles with instead.
Seek out professional support. A child who expresses suicidal thoughts may benefit from a mental health evaluation and treatment. You can start by speaking with your child’s pediatrician or a mental health professional.
In an emergency: If you have immediate concern for your child’s safety, do not leave him or her alone. Call a suicide prevention lifeline. Lock up any potentially lethal objects. Children who are actively trying to harm themselves should be taken to the closest emergency room.
Resources: If you’re worried about someone in your life and don’t know how to help, these resources can offer guidance:
1. The National Suicide Prevention Lifeline: Text or call 988
2. The Crisis Text Line: Text TALK to 741741
3. The American Foundation for Suicide Prevention
The companies face no financial or other penalty because of the decision. The case was a coroner inquest to determine a cause of death, not a criminal or civil trial. The family said it had pursued the case as a form of justice for Molly and to raise awareness about youth suicide and the dangers of social media.
But already, a draft law partly inspired by Molly’s death, to force social media companies to adopt new child safety protections or risk heavy fines, is winding its way through Britain’s Parliament. Instagram and Pinterest restricted access to some suicide and self-harm content. And lawyers representing American families that are suing TikTok and Meta for contributing to their children’s deaths are pointing to the outcome as a new precedent.
“This was David and Goliath,” said Beeban Kidron, a member of the House of Lords and founder of 5Rights, a nonprofit pushing for stricter online child-safety laws. “The Russell family have battled for five years to get the companies into an environment where under oath they had to account for their actions.”
Meta, which said during the inquest that it had never studied the effects of suicidal and depressive Instagram content on its youngest users, said in a statement afterward that its “thoughts are with the Russell family” and that it was “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers.”
The Russell family had an “almost boring” life in a north London suburb, Ian Russell, Molly’s father, said in a July interview before the inquest. Worried about their three daughters’ use of technology, he and his wife, Janet, attended e-safety classes at their school and tried to keep tabs on their social media accounts. Phones were banned at the dinner table.
Molly, like her two older sisters, got a basic phone at age 11, when many British children begin commuting to school independently. She received an iPhone as a 13th birthday present, not long after she had created an Instagram account with her parents’ permission.
Molly, who had enjoyed horseback riding and pop music, began spending more time in her room, but nothing raised alarms. Mr. Russell said she had rarely posted anything publicly on social media, but it was not uncommon to find her sitting on her bed watching Netflix on an iPod Touch while messaging with her friends on another device.
“She was a teenager; it almost would have been worrying if she didn’t,” Mr. Russell said. “How you split those things from normal behavior and maybe something of concern, I really don’t know that you can.”
In the days after Molly’s death, Mr. Russell said, the family struggled to understand what had gone wrong. She had been a bit downcast for parts of the past year, but had perked up of late. The family attributed the mood swings to normal adolescent behavior.
The night before she died, the family watched a reality-TV show together, and Molly asked Mr. Russell for help with a work-experience project. She was excited about tickets to see “Hamilton” and to play a lead part in an upcoming school play.
It was only when Mr. Russell sat down with the family computer that pieces began coming together. After he gained access to her Instagram account, he found a folder titled “Unimportant things” with dozens of troubling images and quotes. “Who would love a suicidal girl?” one said.
He gasped while reviewing Molly’s email inbox, where he found a note from Pinterest that arrived about two weeks after her death. “Depression Pins you may like,” it said.
In January 2019, Mr. Russell went public with Molly’s story. Outraged that his young daughter could view such bleak content so easily and convinced that it had played a role in her death, he sat for a TV interview with the BBC that resulted in front-page stories across British newsstands.
Mr. Russell, a television director, urged the coroner reviewing Molly’s case to go beyond what is often a formulaic process, and to explore the role of social media. Mr. Walker agreed after seeing a sample of Molly’s social media history.
That resulted in a yearslong effort to get access to Molly’s social media data. The family did not know her iPhone passcode, but the London police were able to bypass it to extract 30,000 pages of material. After a lengthy battle, Meta agreed to provide more than 16,000 pages from her Instagram, such a volume that it delayed the start of the inquest. Merry Varney, a lawyer with the Leigh Day law firm who worked on the case through a legal aid program, said it had taken more than 1,000 hours to review the content.
What they found was that Molly had lived something of a double life. While she was a regular teenager to family, friends and teachers, her existence online was much bleaker.
In the six months before Molly died, she shared, liked or saved 16,300 pieces of content on Instagram. About 2,100 of those posts, or about 12 per day, were related to suicide, self-harm and depression, according to data that Meta disclosed to her family. Many accounts she interacted with were dedicated to sharing only depressive and suicidal material, often using hashtags that linked to other explicit content.
Many posts glorified inner struggle, hiding emotional duress and telling others “I’m fine.” Molly went on binges of liking and saving graphic depictions of suicide and self-harm, once after 3 a.m., according to a timeline of her Instagram usage.
“It’s a ghetto of the online world that once you fall into it the algorithm means you can’t escape it and keeps recommending more content,” Mr. Russell said during testimony.
Molly did not talk about her struggles with family, but she sought solace from online influencers who posted regularly about sadness and suicide. From an anonymous Twitter account her family discovered later, Molly had reached out to at least one influencer about her despair — messages that never received a response.
Jud Hoffman, the head of community operations at Pinterest, said he “deeply regrets” that Molly had viewed explicit material that he would not want his own children to view. “I am sorry,” he said.
Meta acknowledged that Molly had seen some content that violated its policies, but defended its practices overall as a balance between free expression and safety. The company added new protections in 2019, after the family went public about Molly’s experience, including prohibiting graphic images of self-harm such as cutting, and providing links to resources for those looking at sad or depressive material.
Ms. Lagone, who has a background in public health and was hired by Meta in 2020, said that while she was sorry Molly had seen such distressing content, it was important to give people space to express sadness openly as “a cry for help.”
After the final decision in the case was announced on Friday, Mr. Russell was still stewing about a comment made by Ms. Lagone during her testimony that some of the material viewed by Molly had been safe.
“If this demented trail of life-sucking content was safe,” he said, “my daughter Molly would probably still be alive.”
If you are having thoughts of suicide, the following organizations can help.
In Britain, contact Samaritans at 116-123 or email **@sa********.org. Calls are free and confidential. You can also call Papyrus at +44 800 068 4141 (9 a.m. to midnight) or message Young Minds by texting YM to 85258.
In the United States, call or text 988 to reach the National Suicide Prevention Lifeline.