In a complaint released on Tuesday, the attorneys general of 33 states, including California and New York, said Meta, the company behind Facebook and Instagram, repeatedly misled the public about the dangers of its platforms and knowingly drew children and teenagers into addictive, compulsive use of social media.
The lawsuit, filed in federal court in Oakland, California, alleges that Meta has deployed “powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens.” “Its motive is profit,” the complaint says.
Businesses have long targeted children, hoping to win them over while they are most open to influence and to build brand loyalty early.
For Meta, hooking young users could attract more advertisers betting that those users will keep buying their products into adulthood.
But the states said that research has linked children’s use of Meta’s social media platforms to “depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes.”
Meta said it was “disappointed” by the case.
“Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said.
On Tuesday, similar cases were filed against Meta by eight more U.S. states and Washington, D.C. This brings the total number of governments taking action against the Menlo Park, California-based company to 42.
Meta shares fell 0.6% on the Nasdaq.
TikTok and YouTube are already being sued
The cases are the latest in a string of lawsuits brought against social media companies on behalf of children and teens.
Meta, ByteDance’s TikTok, and Google’s YouTube already face hundreds of lawsuits from children and school districts claiming that social media is addictive.
Meta’s CEO, Mark Zuckerberg, has previously defended his company’s handling of content that critics call harmful.
“At the heart of these accusations is the idea that we care more about making money than keeping people safe and healthy. That’s not true at all,” he wrote on his Facebook page in October 2021.
The lawsuits filed on Tuesday seek civil penalties of $1,000 to $50,000 per violation of various state laws — amounts that could add up quickly given how many children and teens use Instagram.
Meta drew intense scrutiny in 2021 when a former employee leaked internal documents showing the company knew Instagram, which began as a photo-sharing app, was addictive and worsened body image problems for some teenage girls.
The 33 states behind the complaint said Meta has sought to maximize the time young people spend on its platforms, despite knowing they are especially susceptible to seeking approval from other users in the form of “likes” on their posts.
“Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” said Rob Bonta, the attorney general of California, where Meta is headquartered.
“THREATS WE CAN’T IGNORE”
The states also said Meta violated a law barring companies from collecting data on children under 13, and deceptively denied that its social media platforms were harmful.
“Meta did not disclose that its algorithms were designed to capitalise on young users’ dopamine responses and create an addictive cycle of engagement,” the complaint stated.
Dopamine is a neurotransmitter associated with feelings of pleasure.
According to the complaint, Meta denied responsibility for, and distanced itself from, the suicide last year of a 14-year-old girl in the UK who had viewed content about suicide and self-harm on Instagram.
A Meta executive testified that the content was “safe” for children, but the coroner disagreed, finding that the girl had likely binged on harmful content that normalized her depression before she killed herself.
The states also said Meta was seeking to extend its harmful practices into virtual reality through its Horizon Worlds platform, as well as its WhatsApp and Messenger apps.
The lawsuits aim to fill a gap left by the U.S. Congress, which has failed to pass new online safety protections for children despite years of discussion.
Philip Weiser, the attorney general of Colorado, said the whistleblower’s leak showed that Meta knew Facebook and Instagram were hurting children.
“It is very clear that decisions made by social media platforms, like Meta, are part of what is driving mental health harms, physical health harms, and threats that we can’t ignore,” he stated.