Warning: This article deals with the topic of mental health, and mentions suicide. It may be triggering for some. If you or someone you love is in need of support, Lifeline (13 11 14) is available 24/7, free of charge. If an emergency, call 000.
Snapchat and Meta “knowingly and purposely” created harmful, addictive products that led to a 17-year-old boy’s tragic suicide, a new lawsuit alleges.
The suit, shared with Gizmodo and filed on behalf of Wisconsin teen Christopher J. Dawley by the Social Media Victims Law Centre, seeks to hold the two companies accountable for contributing to what it describes as a “burgeoning mental health crisis” in children and teenagers in the U.S. The suit claims Dawley’s January 2015 death by suicide was caused, in part, by his addiction to the “unreasonably dangerous and defective social media products” created by Meta, the parent company of Instagram and Facebook, and Snap Inc., owner of Snapchat.
Snap did not immediately respond to Gizmodo’s request for comment. A Meta spokesperson said the company recognises social media can have both positive and negative outcomes, but argued that how people spend their time on a platform matters more than the sheer amount of time they spend online. The spokesperson wouldn’t comment directly on the lawsuit but pointed Gizmodo to a broad list of tools and resources Meta has put in place over the years to reduce the visibility of potentially harmful content and assist users experiencing mental health difficulties.
This week’s legal action marks the second wrongful death lawsuit filed against the two companies in less than six months and comes amid increased lawmaker scrutiny over the ways social media platforms can exacerbate teen depression and other emotional problems. A similar complaint was lodged against Meta and Snap back in January over the companies’ alleged role in an 11-year-old girl’s suicide. In that case (which was also filed by the SMVLC), the young girl, Selena Rodriguez, reportedly suffered from “extreme addiction to Instagram and Snapchat” for more than two years. An outpatient therapist referenced in that complaint had allegedly “never seen a patient as addicted to social media.”
In fact, internal Meta documents leaked last year by whistleblower Frances Haugen revealed Meta officials were aware of teen mental health-related harms caused by its platforms even as the company was taking steps to develop an Instagram for Kids service prioritising youth engagement.
“They [teens] often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves,” an Instagram research manager said, according to the documents. Following backlash from just about everyone, Instagram finally took its foot off the gas late last year and announced it would “pause” the initiative.
“Congressional testimony has shown that both Meta Platforms and Snapchat were aware of the addictive nature of their products and failed to protect minors in the name of more clicks and additional revenue,” SMVLC founder Matthew P. Bergman said in a statement. “We are calling on the parent companies of Facebook, Instagram and Snapchat to prioritise the health and wellness of its users by implementing safeguards to protect minors from the danger of cyberbullying and sexual exploitation that run rampant on their platforms.”
Questions pertaining to Meta and Snap’s specific products are central to both complaints. The SMVLC cites independent studies, as well as internal research from the companies themselves, which it believes draw connecting lines between use of the companies’ products and increased youth depression, amongst other potential harms. According to the suit, Meta and Snap’s social media products are uniquely harmful to mental health due to their design.
“It is technologically feasible to design social media products that substantially decrease both the incidence and magnitude of harm to ordinary consumers and minors,” the suit claims. Moreover, the SMVLC argues Meta and Snap didn’t provide adequate warnings to minors about potentially harmful effects from using their products.
Growing links between youth harm and social media
It doesn’t take an expert to know that too much time doom scrolling on social media can leave you feeling helpless. Still, a growing body of research in recent years has begun to quantify what many users already felt was anecdotally true. According to a 2019 article published in The Lancet, frequent social media use predicted lower levels of wellbeing in English youth between the ages of 13 and 16. Those findings were particularly pronounced amongst young girls. A more recent study from the University of Georgia suggested more time spent on social media may be linked to increased cyberbullying. Speaking to the allegedly addictive quality of social media platforms, a working paper shared with The Washington Post last year concluded that around 31% of social media use was the result of “self control problems.”
And while most social media platforms officially claim their services aren’t available to children under the age of 13, anyone with a younger family member will tell a different story. Around half of parents with children between the ages of 10 and 12 surveyed by the University of Michigan Health C.S. Mott Children’s Hospital claimed their kids had used social media in the past six months. It’s worth noting, though, that there is a vocal group of detractors who maintain the word “addiction” shouldn’t be applied to social media or digital services, since it may not result in the same physical effects as addiction to substances.
Lawmakers and regulators are stepping up scrutiny
Regardless of where you fall on whether social media causes addiction as it’s traditionally known, one thing’s clear: regulators and lawmakers have taken notice. Late last year, the U.S. Surgeon General issued an advisory expressing an “urgent need to address the nation’s youth mental health crisis.” Though the advisory doesn’t claim social media is the sole driver of worsening youth mental health, it does call for more research on the relationship between technology and youth mental health and encourages tech companies to increase transparency around their algorithms.
Other legislative efforts don’t beat around the bush when it comes to the addiction question. In February, a bipartisan pair of senators proposed new legislation, dubbed the Social Media NUDGE Act, that would require the National Academies of Sciences, Engineering, and Medicine to study and implement interventions that could “reduce the harm of algorithmic amplification and social media addiction.” Then, at the state level, a bipartisan pair of lawmakers proposed a bill called the Social Media Platform Duty to Children Act, which would let individual parents sue platforms for allegedly addicting their children. If passed, the California bill would let parents sue companies regardless of whether or not a company intentionally designed its service to be addictive.
If you or someone you love is in need of support, Lifeline (13 11 14) is available 24/7, free of charge. If an emergency, call 000.