Meta Platforms Inc. and Snap Inc. are to blame for the suicide of an 11-year-old who was addicted to Instagram and Snapchat, the girl’s mother alleged in a lawsuit.
The woman claims her daughter Selena Rodriguez struggled for two years with an “extreme addiction” to Meta’s photo-sharing platform and Snap’s messaging app before taking her life last year.
The complaint in San Francisco federal court isn’t the first lawsuit to blame a youth’s suicide on social media, but it comes at a sensitive time for platforms that engage millions of young people worldwide.
In November, a group of U.S. state attorneys general announced an investigation of Instagram over its efforts to draw children and young adults, taking aim at the risks the social network may pose to their mental health and well-being. The states’ probe was launched after a former Facebook employee turned whistle-blower testified in Congress that the company knew about, but didn’t disclose, harmful impacts of its services like Instagram.
The backlash against social media isn’t limited to the U.S. The father of a 14-year-old in the U.K. touched off a firestorm when he blamed her 2017 suicide partly on Instagram. The company told the BBC that it doesn’t allow content that promotes self-harm.
“We are devastated to hear of Selena’s passing and our hearts go out to her family,” a Snap spokesperson said Friday in an emailed statement. “While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community.”
Meta and Snap knew or should have known that “their social media products were harmful to a significant percentage of their minor users,” according to Thursday’s lawsuit. “In other words, defendants intentionally created an attractive nuisance to young children, but failed to provide adequate safeguards from the harmful effects they knew were occurring on their wholly owned and controlled digital premises.”
Meta representatives didn’t respond to an email seeking comment.
A Meta spokesperson said in November that allegations the company puts profit over safety are false and that “we continue to build new features to help people who might be dealing with negative social comparisons or body image issues.”
Snap said in May it was suspending projects with two app makers “out of an abundance of caution for the safety of the Snapchat community” in light of a wrongful-death and class-action suit filed in California that accused the companies of failing to enforce their own policies against cyber-bullying.
Tammy Rodriguez, who lives in Connecticut, said when she tried to limit her daughter’s access to the platforms, the girl ran away from home. She took her daughter to a therapist who said “she had never seen a patient as addicted to social media as Selena,” according to the suit.
The lawsuit levels its harshest criticism at Snapchat, saying the platform rewards users in “excessive and dangerous ways” for engagement. The mother alleges claims of product defect, negligence and violations of California’s consumer protection law. One of the lawyers on the case is from Social Media Victims Law Center, a Seattle-based legal advocacy group.
“Snapchat helps people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it hard for strangers to contact young people,” the Snap spokesperson said. “We work closely with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe.”
Social media companies have been largely successful in fending off lawsuits blaming them for personal injuries, thanks to Section 230 of the Communications Decency Act, a 1996 federal law that shields internet platforms from liability for what users post online.
The case is Rodriguez v. Meta Platforms Inc. f/k/a Facebook Inc., 3:22-cv-00401, U.S. District Court, Northern District of California (San Francisco).
–With assistance from Naomi Nix.