Coroner Lists ‘Negative Effects Of Online Content’ As One Of The Causes Of A UK Teen’s Death
from the yikes dept
So… this is a thing that happened. Adam Satariano reports for the New York Times:
The coroner overseeing the case, who in Britain is a judgelike figure with wide authority to investigate and officially determine a person’s cause of death, was far less circumspect. On Friday, he ruled that Instagram and other social media platforms had contributed to her death — perhaps the first time anywhere that internet companies have been legally blamed for a suicide.
“Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content,” said the coroner, Andrew Walker. Rather than officially classify her death a suicide, he said the internet “affected her mental health in a negative way and contributed to her death in a more than minimal way.”
This was the conclusion delivered at the close of a UK inquest into the suicide of 14-year-old Molly Russell. Entered as evidence during the inquest was a stream of disturbing material pulled from the deceased teen’s accounts and mobile device, including videos, images, and posts related to self-harm and suicide, one of which Russell copied almost verbatim into her suicide note.
The content Russell apparently viewed in the weeks leading up to her suicide was horrific.
Molly’s social media use included material so upsetting that one courtroom worker stepped out of the room to avoid viewing a series of Instagram videos depicting suicide. A child psychologist who was called as an expert witness said the material was so “disturbing” and “distressing” that it caused him to lose sleep for weeks.
All of this led to Meta executives being cross-examined and asked to explain how a 14-year-old could so easily access this content. Elizabeth Langone, Meta’s head of health and well-being policies, had no explanation.
As has been noted here repeatedly, content moderation at scale is impossible to do well. What may appear to be easy access to disturbing content may say more about the user seeking it out than about the platform’s inability to curtail it. And what may appear to be callous disregard for users may be nothing more than a person slipping through the cracks of content moderation, finding the content that intrigues them despite platforms’ efforts to keep that content from surfacing unbidden in people’s feeds.
This declaration by the UK coroner is, unfortunately, largely performative. It doesn’t really say anything about the death other than what the coroner wants to say about it. And this coroner was pushed into pinning the death (at least partially) on social media by the 14-year-old’s father, a television director with the apparent power to sway the outcome of the inquest — a process largely assumed to be a factual, rather than speculative, recounting of a person’s death.
Mr. Russell, a television director, urged the coroner reviewing Molly’s case to go beyond what is often a formulaic process, and to explore the role of social media. Mr. Walker agreed after seeing a sample of Molly’s social media history.
That resulted in a yearslong effort to get access to Molly’s social media data. The family did not know her iPhone passcode, but the London police were able to bypass it to extract 30,000 pages of material. After a lengthy battle, Meta agreed to provide more than 16,000 pages from her Instagram, such a volume that it delayed the start of the inquest. Merry Varney, a lawyer with the Leigh Day law firm who worked on the case through a legal aid program, said it had taken more than 1,000 hours to review the content.
What they found was that Molly had lived something of a double life. While she was a regular teenager to family, friends and teachers, her existence online was much bleaker.
From what’s seen here (and detailed in the New York Times article), Molly’s parents didn’t take a good look at her social media use until after she died by suicide. This is not to blame the parents for not taking a closer look sooner, but to point out how ridiculous it is for a coroner to deliver this sort of declaration, especially at the prompting of a grieving parent looking to find someone to blame for his daughter’s suicide.
If this coroner wants to list contributing factors on the public record — especially when involved in litigation — they should at least be consistent. They could have listed “lack of parental oversight,” “peer pressure,” and “unaddressed psychological issues” as contributing factors. This report is showboating intended to portray social media services as harmful and direct attention away from the teen’s desire to access “harmful content.”
And, truly, the role of the coroner is to find the physical causes of death. We go to dangerous places quickly when we start saying that this or that thing clearly caused someone to die by suicide. We don’t know. We can’t know. Even someone trained in psychology (not often the case with coroners) can’t ever truly say what makes a person take their own life. There are likely many reasons, and they may all contribute in their own ways. But in the end, it’s the person who makes the decision, and only they know the real reasons.
As Mike has written in the past, officially putting “blame” on parties over a suicide creates very serious problems of its own. It gives those who are considering suicide the power to destroy someone else’s life as well, simply by saying they chose to end their life because of this or that person or company, whether or not there’s any truth to it.
I’m well aware social media services often value market growth and user activity over user health and safety, but performative inquests are not the way to alter platforms’ priorities. Instead, they provide a basis for bad-faith litigation that seeks to hold platforms directly responsible for the actions of their users.
This sort of litigation is already far too popular in the United States. Expect its popularity in the UK to rise quickly, especially given the absence of anything like First Amendment protections or Section 230 immunity there.
It’s understandable for parents to seek closure when their children die unexpectedly. But misusing a process that is supposed to be free of influence to create “official” declarations of contributory liability won’t make things better for social media users. All it will do is leave at-risk users with fewer options to connect with people who might be able to steer them away from self-harm.
Filed Under: blame, coroner report, intermediary liability, molly russell, suicide, uk
Companies: meta