Family Blames Instagram for Teen’s Eating Disorder in Lawsuit Citing Facebook Papers

Warning: This article discusses mental health and eating disorders that some readers may find distressing. If you or someone you know is experiencing thoughts of suicide or self-harm, please call 000 in an emergency or Lifeline on 13 11 14. You can also speak to the Butterfly Foundation on 1800 334 673.

A personal injury lawsuit filed in California federal court on Monday alleges Instagram’s parent company Meta purposely crafted products to addict young users, steering one 11-year-old girl down a years-long path of physical and psychological harm.

The case, brought by the Social Media Victims Law Centre on behalf of now-19-year-old Alexis Spence, asserts Instagram “consistently and knowingly” targeted its product at young children while ignoring internal warnings about the product’s worsening effects on users’ mental health.

“As a result of Alexis’ addiction to Instagram, she had to undergo professional counseling, in-patient programs, out-patient programs, participate in eating disorder programs and will likely require help in the form of a service dog for the rest of her life, as well as ongoing medical attention to ensure she does not digress,” the lawyers said.

Their suit hinges on and directly cites the Facebook Papers, the trove of internal files leaked last fall by Facebook whistleblower Frances Haugen. Among them were confidential reports and presentations portraying Instagram as a blight on the mental health of adolescents. The lawsuit is among the first to use the documents against Meta in actual court rather than the court of public opinion.

The suit is also the latest in a wave of new cases around the country hoping to find a way around the liability shield extended to website owners and operators under Section 230 of the Communications Decency Act. Passed in 1996, Section 230 is considered foundational to the internet as it exists today, enabling large tech companies and everyday users to moderate their own websites without fear of being buried in lawsuits over content posted by third parties.

“There’s a concerted effort across the country to re-frame lawsuits against internet services as attacking their software tools, not attacking the content that’s published using them,” Eric Goldman, a law professor at Santa Clara University, told Gizmodo by phone.

Meta declined to comment on the Spence case, citing active litigation, but a spokesperson pointed Gizmodo to a range of features they said are designed to help people struggling with body image issues.

Spence joined Instagram in the fifth grade, two years too young to join the app under its minimum age requirement, according to the suit. The complaint alleges her addiction to Instagram was the result of deliberate efforts by the company to design a product that is inherently addictive and has toothless support features.

Congress has already taken notice of the suit. Rep. Tom Malinowski, a New Jersey Democrat who co-authored legislation last year targeting social media algorithms that amplify content interfering with users’ civil rights, said the Spence case was just the latest to exhibit the “real-world harms caused by sophisticated algorithms designed to keep us glued to our screens — eating disorders, suicides, mass shootings, insurrections.”

Soon after she joined, according to the complaint, Instagram’s algorithm rapidly drove the then-11-year-old Spence toward an endless stream of problematic content: a deluge of underweight models and links to extreme dieting websites promoting “anorexia, negative body image and self-harm.” This spurred an eating disorder, lawyers said, preceding years of anxiety, depression, and suicidal ideation. Hospitalisation eventually became necessary.

The Facebook Papers, which Gizmodo began making public for the first time in April, revealed to the world that Meta knew how some of its users felt about their use of social media: badly. A leaked survey conducted by the company found that many teens blamed Instagram directly for anxiety and depressive episodes. These self-diagnoses were reported “unprompted,” the company said, and were “consistent across all groups.”

“Thirty-two per cent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” read one leaked presentation from March 2020.

Facebook responded to the leak at the time by downplaying the negative effects, which it described as “quite small” in scale. Executives noted that the leaked research was not considered scientific by any academic standard. The bulk of it was conducted by marketing experts, and the ill effects were based on subjective, retrospective analysis self-reported by users. At a congressional hearing last year, Meta CEO Mark Zuckerberg pointed to contrary research linking social media usage with mental-health benefits. Similar surveys conducted by the Pew Research Center in 2018, for example, found that teens were more likely to associate social media apps with positive emotions — though the sentiment was “far from unanimous.”

At the same time, documents showed that Meta — known simply as Facebook at the time — had endeavoured to ingratiate its brand with users years too young to actually use its services. Marketing research, never intended for public consumption, coldly portrayed children as young as 10 years old as a “valuable” and “untapped” resource pivotal to the company’s growth.

The immunity granted under Section 230 is not absolute, as a federal court’s recent ruling shows. About a year ago, the Ninth Circuit determined there was a sufficient distinction between claims over content published by a website — which would be covered under Section 230 — and allegations of negligence concerning the underlying software design of a particular social network. Spence’s case against Instagram appears to be another attempt to build on those efforts.

Goldman, a staunch supporter of Section 230 who believes efforts to amend it are largely politically motivated, was also critical of the Ninth Circuit ruling.

That case, which remains unsettled, centres on the “speed filter” feature removed by Snapchat last year. The filter was designed to allow users to capture how fast they were moving, but allegedly led some users to pursue bragging rights by driving recklessly at excessive speeds. Several deaths between 2015 and 2017 have been attributed to the filter, including those of three young men in Wisconsin who fatally crashed into a tree. The app clocked their car moving at 198 km per hour shortly before the crash.

The Ninth Circuit held that Section 230 was not an applicable defence so long as the plaintiffs remain focused on Snapchat’s software design, and not the content shared on its platform. Similarly, Spence’s lawyers appear eager to keep the focus on the addictive nature of Instagram’s product as opposed to the content she may have encountered.

“Again, you’re talking about the algorithm and the way that the complaint may be framed is really more about the overall service, that everything about the service was designed to encourage usage and that encouraged amount of usage is what caused the problem,” Goldman said. “It doesn’t mean they’ll win, but they may have found a way to get around Section 230.”

It’s difficult to separate a product that promotes content from the content it promotes, Goldman said, even if the courts find a legally significant distinction. “I understand at the big structural level, yes, they’re not trying to assert that they’re suing for third-party content, they’re suing based on the software design,” he said. “But to me those collapse together in every material way.”

Malinowski’s bill, the Protecting Americans from Dangerous Algorithms Act, was one of numerous bills introduced last year aimed at amending Section 230. Whether due to the sheer complexity of the issue, the potential ramifications for the global internet economy, or partisan disagreements over the approach, none of the bills took off.

“Large social media platforms should not enjoy blanket legal immunity for harmful content that they actively amplify, promote, or recommend to their users,” he said.

When users encounter content on Instagram related to self-harm or eating disorders, the app is supposed to flag “potentially triggering images” and blur them out, while also pointing users to hotline resources offered by groups such as the National Eating Disorders Association.

Additional safety features are also in place, Meta said, including age verification, restricted DMs between adults and teens, and default privacy settings for users under the age of 16.

If you or someone you know is experiencing thoughts of suicide or self-harm, please call 000 in an emergency or Lifeline on 13 11 14. You can also speak to the Butterfly Foundation on 1800 334 673.

