Court finds Meta, Pinterest culpable in suicide of U.K. teen Molly Russell


The 14-year-old had viewed over 2,000 posts related to suicide and self-harm in the months before her death.

One of Molly’s two older sisters, her father, and her mother outside the coroner’s court in North London on the first day of the inquest into her death.
Credit: Kirsty O’Connor/PA Images via Getty Images

Meta, Pinterest, and other social media platforms bear responsibility for the death of 14-year-old Molly Russell, according to the senior coroner at North London Coroner’s Court. The ruling concerning Russell, who died as a result of self-harm in November 2017, came on Friday, Sept. 30.

A British coroner is a figure with broad authority to investigate and determine a person’s cause of death. This was not a criminal or civil trial, and Pinterest and Meta do not face penalties as a result. Russell’s family pursued the case against the two tech giants to raise awareness of the dangers of social media content accessible to young people.

Russell created an Instagram account at age 12 with her parents’ permission and received an iPhone as a 13th birthday present. Before her death, Russell’s parents say, she had been acting like a normal teenager: listening to pop music, watching Netflix, and texting friends. She was excited to be in an upcoming school play. Some gloomier moments earlier in the year had not rung any alarm bells, and could be chalked up to normal adolescent mood swings.

But two weeks after Russell died, her father found an email from Pinterest with the subject line “Depression Pins you may like.” On Russell’s Instagram account he found a folder titled “Unimportant things” containing dozens of unsettling images and quotes, including one that read “Who would love a suicidal girl?”

Mr. Russell went public with his daughter’s story in January 2019 in an interview with the BBC. Meta eventually agreed to provide more than 16,000 pages from Molly Russell’s Instagram, which took the family’s legal team more than 1,000 hours to review before being presented in court. About 2,100 of those posts were related to suicide, self-harm, and depression, according to data that Meta disclosed to her family. Many of those posts used hashtags that linked to other explicit content and encouraged hiding emotional distress.

The New York Times reports that the material was so disturbing that a courtroom worker left the room to avoid viewing a set of Instagram videos depicting suicide. A child psychologist who served as an expert witness said that reviewing the material that Russell viewed was so “disturbing” and “distressing” that he lost sleep for weeks.

Elizabeth Lagone, Meta’s head of health and well-being policy, appeared in court. Lagone was read a post that Russell had seen on Instagram, then heard how she had copied its wording in a self-loathing note later found by her parents. “This is Instagram literally giving Molly ideas,” said Oliver Sanders, a lawyer representing the family.

Ms. Lagone said she regretted that Russell had seen such distressing content but that it was important for the platform to allow users to express unhappy feelings openly as “a cry for help.” Jud Hoffman, the head of community operations at Pinterest, said his platform was “not safe” during the time Molly was on it and that he “deeply regrets” and apologizes for the material Molly had viewed. For its part, Pinterest has invested significant resources in culling and suppressing harmful content since 2018.

A draft law called the “Online Safety Bill” is currently making its way through Britain’s Parliament and is partially inspired by Russell and her family. It calls for a “new duty of care” on the part of online platforms towards their users, and lays out fines for sites that fail to remove harmful content.

“Anyone who knew Molly was looking forward to the way that she would grow up, to the person she would become,” her father told the BBC. “She had so much to offer, and that’s gone…with the help of the internet and social media.”

If you’re feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can reach the 988 Suicide and Crisis Lifeline at 988; the Trans Lifeline at 877-565-8860; or the Trevor Project at 866-488-7386. Text “START” to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you would rather not use the phone, consider the 988 Suicide and Crisis Lifeline Chat at crisischat.org. Here is a list of international resources.


