Meta is being sued by the attorneys general of 33 US states over allegations that it intentionally created and launched features on its Facebook and Instagram social media platforms that “purposefully addict children and teens.”
“Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame,” New York state attorney general Letitia James said in a statement. “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem. Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable.”
The lawsuit was heavily redacted when it was originally filed in October, but a new “less-redacted” version shared by the state of California reveals some numbers that don’t look good for Meta. It alleges that in 2021, for instance, Meta received more than 402,000 reports of under-13 users on Instagram through the platform’s reporting process, but acted on fewer than 164,000 of them.
It also allegedly made active efforts to avoid acting on complaints about underage users. One internal email chain from 2018 referenced in the lawsuit talks about “coaching” parents in order to convince them to allow their children to remain on the platform. Another included a discussion of Meta’s failure to delete a 12-year-old girl’s four accounts despite complaints from the girl’s mother, which the lawsuit says were ignored because employees “couldn’t tell for sure the user was underage.”
The lawsuit says Meta’s business model is “based on maximizing the time that young users spend on its social media platforms,” and to that end it designed and deployed “psychologically manipulative” features designed to exploit them. At the same time, it promoted those features as specifically not being manipulative, and “routinely published profoundly misleading reports purporting to show impressively low rates of negative and harmful experiences” amongst its users.
It also allegedly “continued to conceal and downplay” research indicating a range of negative outcomes associated with the use of social media, including its own internal studies, which “reveal that Meta has known for years about the serious harms associated with young users’ time spent on its social media platforms.”
Naturally, allegations of widespread violations of the Children’s Online Privacy Protection Act (COPPA)—the one that cost Epic a whopping half-billion dollars in 2022—are also in the mix: “Meta has marketed and directed its Social Media Platforms to children under the age of 13 and has actual knowledge that those children use its Platforms. But Meta has refused to obtain (or even to attempt to obtain) the consent of those children’s parents prior to collecting and monetizing their personal data.”
In June, Meta posted an announcement about new parental supervision tools available in Messenger, which it expanded to Facebook, Instagram, and Horizon Worlds in November. “These tools allow parents to see how their teen uses Messenger, from how much time they’re spending on messaging to providing information about their teen’s message settings,” Meta said. These tools do not allow parents to read their teen’s messages.
“Over the next year, we’ll add more features to Parental Supervision on Messenger so parents can help their teens better manage their time and interactions, while still balancing their privacy as these tools function in both unencrypted and end-to-end encrypted chats.”
“Meta lied about Instagram’s addictiveness. Meta lied about Instagram’s negative impact on young people’s mental health. Meta lied, and our children paid the price. I won’t stand for it. Not as a father, and not as California’s Attorney General.” — California Attorney General Rob Bonta on Twitter, October 24, 2023
One of the features touted in that announcement is the “Take a Break” tool, rolled out to Instagram in 2021, which enables teen users to set themselves a reminder to stop scrolling and go do something else. But the lawsuit dismisses it out of hand, because “instead of being able to set it and forget it, young users who make what can be a difficult choice to limit their daily use or take a break must make this difficult decision over and over again. Meta’s design choices make the proverbial wagon that much easier for young users to fall off.”
“Meta knows that what it is doing is bad for kids — period,” California Attorney General Rob Bonta said. “Thanks to our unredacted federal complaint, it is now there in black and white, and it is damning. We will continue to vigorously prosecute this matter.”
The matter has yet to be tried before a court, but the lawsuit certainly does seem comprehensively assembled. The states involved are seeking a permanent injunction barring Meta from “engaging in the acts and practices” alleged in the complaint across all of its social media platforms, as well as per-state fines, civil penalties, and legal costs, which vary based on each state’s individual laws. All told, it could add up to a lot.
I’ve reached out to Meta for comment on the lawsuit and will update if I receive a reply.