Meta sued by 42 attorneys general for addictive features targeting kids
- A bipartisan group of 42 attorneys general is suing Meta over addictive features aimed at kids and teens.
- The lawsuits represent the broad bipartisan interest in protecting kids and teens from online harm.
- Meta designed its Facebook and Instagram products to keep young users on them for longer and repeatedly coming back, the AGs allege.
A bipartisan group of 42 attorneys general is suing Meta over addictive features aimed at kids and teens, the AGs announced Tuesday. The support from so many state AGs of different political backgrounds signals a significant legal challenge to Meta’s business.
Meta is now facing multiple lawsuits on this issue in several districts. AGs from 33 states filed a federal suit against Meta in the Northern District of California, while nine additional AGs are filing in their own states, according to a press release from New York Attorney General Letitia James’ office.
The lawsuits are another demonstration of the bipartisan priority state law enforcers have placed on protecting kids and teens from online harm.
It’s also not the first time a broad coalition of state AGs has teamed up to go after Meta. In 2020, 48 states and territories sued the company on antitrust grounds, alongside a separate complaint from the Federal Trade Commission.
Meta designed its Facebook and Instagram products to keep young users on them for longer and repeatedly coming back, the AGs allege. According to the federal complaint, Meta did this via the design of its algorithms, copious alerts, notifications and so-called infinite scroll through platform feeds. The company also includes features that the AGs allege negatively impact teens’ mental health through social comparison or promoting body dysmorphia, such as “likes” or photo filters.
The federal suit also accuses Meta of violating the Children’s Online Privacy Protection Act (COPPA) by collecting personal data on users under 13 without parental consent.
The states are seeking an end to what they see as Meta’s harmful practices, as well as penalties and restitution.
Meta was well aware of the negative effects its design could have on its young users, the AGs allege.
“While Meta has publicly denied and downplayed these harmful effects, it cannot credibly plead ignorance,” James’ office wrote in a press release. “Meta’s own internal research documents show its awareness that its products harm young users. Indeed, internal studies that Meta commissioned – and kept private until they were leaked by a whistleblower and publicly reported – reveal that Meta has known for years about these serious harms associated with young users’ time spent on its platforms.”
Former Facebook employee Frances Haugen caused an uproar among lawmakers and parents in 2021 after leaking internal documents from the company that revealed internal research on its products. One set of documents about Instagram’s impact on teens found that “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” The Wall Street Journal reported before Haugen made her identity known. Following the report, Instagram said it was working on ways to pull users away from dwelling on negative topics.
Several of the practices the AGs focus on in Meta’s case are similar to those employed by other social media businesses, such as designing algorithms to keep users engaged.
“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families,” Meta spokesperson Andy Stone said in a statement. “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
Among the states that filed the federal suit are California, Colorado, Louisiana, Nebraska, New York, South Carolina, Washington and Wisconsin.
It’s a rare issue that can bring 41 states together for a bipartisan fight.
Last week, state attorneys general across the political spectrum joined forces in suing Facebook parent company Meta for allegedly using features on Instagram and other platforms that hook young users, while denying or downplaying the risks to their mental health.
This comes two years after states began investigating Meta following revelations that the company’s internal research found Instagram was having a negative effect on some teen users’ mental health. Since then, health professionals, including Surgeon General Dr. Vivek Murthy and the American Psychological Association, have urged tech companies to make their products safer for young people.
But there hasn’t yet been significant change in the industry. Most companies haven’t been willing to overhaul their platforms to curb addictive features or harmful content for users under 18 years old, such as setting time limits on their apps or changing algorithms that steer kids into “rabbit holes” to keep them online. Nor have federal lawmakers been able to enact comprehensive product safety regulations because legislation has stalled in Congress or been blocked by courts.
In the absence of policy changes, lawsuits are the next logical step in prodding technology companies to ensure their products are safe for young people or be held accountable. Some have compared the states’ legal strategy to lawsuits against Big Tobacco and opioid manufacturers that revealed how the companies lied about the harm caused by their products, and forced them to change their business practices.
Meta is the first target because of the 2021 revelations, but the state attorneys general said this is an industrywide investigation. They have also begun looking into TikTok.
The federal complaint alleges Meta used harmful and “psychologically manipulative product features,” such as “likes,” infinite scroll and constant alerts, to hook young people on Instagram and Facebook and keep them engaged for as much time as possible in order to boost profits. Despite knowing that young users’ brains are particularly vulnerable to manipulation by such features and internal studies warning that kids were being harmed, Meta allegedly concealed, denied and downplayed the harms.
The lawsuit, which was filed jointly by 33 states, including California, also accused Meta of violating the Children’s Online Privacy Protection Act, a federal law that protects the digital privacy of children under 13 years old. Eight states and the District of Columbia filed separate lawsuits in state or federal courts, many alleging that Meta violated state consumer protection laws.
Meta said in a statement that it has already rolled out 30 tools to support teens on its apps since 2021, including reminders on Instagram for teens to take a break and sharing expert resources if kids search for posts on suicide or eating disorders. That’s a good start. The company lamented that the states chose to sue rather than work with tech firms “across the industry to create clear, age-appropriate standards.”
Indeed, there is a need for comprehensive safety standards across social media platforms. But a tech lobbying group of which Meta is a member has sued to stop an effort by California, which passed a first-in-the-nation law last year requiring age-appropriate design and child privacy protection. The law was recently put on hold by a federal judge citing First Amendment concerns. California Attorney General Rob Bonta has filed an appeal.
This is complex legal and regulatory terrain, and the states’ lawsuits are not a sure bet given existing laws that protect online platform companies from being held liable for content posted by users on their sites. Nor will any of these cases be resolved quickly. That’s OK. This is an essential fight for the future.
Written by the Los Angeles Times editorial board. Distributed by Tribune Content Agency, LLC.