Why the Case Matters
It’s unusual for so many states to join together to sue a tech giant over consumer harms. The coordination shows that states are prioritizing children’s online safety and pooling legal resources to take on Meta, much as they did in earlier cases against Big Tobacco and Big Pharma companies.
“Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us,” Phil Weiser, Colorado’s attorney general, said in a statement.
Lawmakers around the globe have been trying to rein in platforms like Instagram and TikTok on behalf of children. Over the past few years, Britain, followed by states like California and Utah, passed laws to require social media platforms to boost privacy and safety protections for minors online. The Utah law, among other things, would require social media apps to turn off notifications by default for minors overnight to reduce interruptions to children’s sleep.
Regulators have also tried to hold social media companies accountable for possible harms to young people. Last year, a coroner in Britain ruled that Instagram had contributed to the death of a teenager who took her own life after seeing thousands of images of self-harm on the platform.
Laws to protect the safety of children online in the United States, however, have stalled in Congress as tech companies lobby against them.
“We’ve been warning about Meta’s manipulation and harming of young people from its start, and sadly it has taken years to hold it and other companies like Google accountable,” said Jeffrey Chester, the executive director of the Center for Digital Democracy, a consumer advocacy group. “Hopefully justice will be served, but this is why it’s so crucial to have regulations.”
How the Investigation Started
States began investigating Instagram’s potentially harmful effects on young people several years ago as public concerns over cyberbullying and teen mental health mounted.
In early 2021, Facebook announced that it was planning to develop “Instagram Kids,” a version of its popular app that would be aimed at users younger than 13. The news prompted a backlash among concerned lawmakers and children’s groups.
Soon after, a group of attorneys general from more than 40 states wrote a letter to Mark Zuckerberg, the company’s chief executive. In it, they said that Facebook had “historically failed to protect the welfare of children on its platforms” and urged the company to abandon its plans for Instagram Kids.
Concerns among the attorneys general intensified in September 2021 after Frances Haugen, a former Facebook employee, leaked company research indicating that the company knew its platforms posed mental health risks to young people. Facebook then announced it was pausing the development of Instagram Kids.
That November, a bipartisan group of attorneys general, including Colorado, Massachusetts and New Hampshire, announced a joint investigation into Instagram’s impact — and potential harmful effects — on young people.
Remedies
Under local and state consumer protection laws, the attorneys general are seeking financial penalties from Meta. The District of Columbia and the states are also asking the court for injunctive relief to force the company to stop using certain tech features that the states contend have harmed young users.
What Happens Next
Meta is expected to fight to dismiss the case. Mr. Weiser, the Colorado attorney general, said in a news conference that he filed the lawsuit because he wasn’t able to reach a settlement with the company. He noted that Meta had filed a motion to dismiss a separate lawsuit brought by consumers, which makes similar allegations of harm to children and teenagers.
Separately, a group of attorneys general from more than 40 states is pursuing an investigation into user engagement practices at TikTok and their possible harmful effects on young people. That investigation, which was announced in 2022, is ongoing.