The case against Meta entered a new stage this Tuesday. The attorneys general of 41 states sued Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger, for developing products deliberately designed to hook children, even though the company has maintained that its social networks are safe for minors. This lawsuit joins a cascade of litigation, including some 200 suits grouped into a class action filed in April by individuals and educational institutions across the country against several social networks (Facebook and Instagram, owned by Meta, but also Snapchat, TikTok and YouTube) for harming the mental health of young people.
The coordinated action of these 41 states, 33 of which joined a single joint lawsuit, lays out in more than 200 pages the reasons why Instagram (and, to a lesser extent, Facebook) is a harmful product for young people. In the text, the attorneys general explain that they are filing the lawsuit “to defend the public interest and to avoid and prevent adverse effects of corporate practices in their states.”
“Meta has exploited the pain of children by intentionally equipping its platforms with features that manipulate them and make them dependent on those platforms while undermining their self-esteem,” New York Attorney General Letitia James said Tuesday at a press appearance after the joint lawsuit was filed. “Social networks, including Meta, have contributed to a mental health crisis among young people and must be held accountable,” she said. According to the Centers for Disease Control and Prevention, nearly a third of American teenage girls had suicidal thoughts in 2021, a 60% increase over the previous decade. “Just as tobacco and e-cigarette companies have done in the past, Meta has chosen to maximize its profits at the expense of public health, particularly harming the youngest,” Colorado Attorney General Phil Weiser said in a statement.
A Meta spokesperson lamented that “this path” (the legal one) was chosen instead of “working productively with the industry to create usage standards for apps used by minors.” There is also resentment within the company that the lawsuit filed this week targets Meta alone and leaves out the rest of the social networks.
The blow that Meta suffered this week did not come without warning. The origin of this process dates back to early 2021, when the company announced that it planned to create Instagram Kids, a version of its popular social network aimed at children under 13. The announcement caused a stir in the USA. Several civil associations protested publicly and a group of 44 state attorneys general sent an open letter to Meta CEO Mark Zuckerberg urging him to reconsider the idea.
In September of the same year, Frances Haugen’s leaks vindicated those who had been suspicious of Meta. The former Facebook employee handed internal documents to The Wall Street Journal showing that the company’s executives knew about the harmful effects Instagram was having on young people, particularly adolescent girls. Despite its own reports indicating that Instagram fostered eating disorders and even drove some users to suicide, the social network’s leaders did nothing to reverse the situation.
The weight of these revelations led three states to launch a formal investigation in November into Instagram’s potential negative impact on young people. Haugen’s disclosures also prompted dozens of parents to file their own lawsuits against Meta, claiming that it had harmed the health or even the physical integrity of their children. This process culminated in the class action lawsuit filed in April of this year, which several educational institutions also joined. “Our case is not focused solely on the platforms’ content: we are pointing to the design of the social networks themselves, which are built to be addictive,” Joseph VanZandt, the lawyer coordinating the class action, told EL PAÍS. A hearing is scheduled for this Friday in the Northern District of California that could prove crucial in that parallel process.
In 2022, amid the rise of TikTok, the fastest-growing social network among young people, a group of attorneys general from more than 40 states launched a separate investigation into the Chinese-owned platform’s possible harmful effects on young people. They have not yet presented their conclusions.
In recent years, countries such as the United Kingdom and states such as California and Utah have passed regulations requiring greater privacy and safety for children on social networks; in Utah’s case, nighttime use is restricted. In London last year, Instagram and Pinterest were blamed for the death of a teenager who took her own life after prolonged exposure to those platforms.
Change of strategy
The lawsuits filed by the 41 states against Meta mark a change of strategy compared with the class actions brought by educational institutions and individuals. “While the latter emphasized that social networks have a negative impact on the mental health of young people, the lawsuit filed this week invokes local commercial and consumer laws as well as federal laws protecting the privacy and personal data of minors,” notes Rodrigo Cetina, professor of law at the Barcelona School of Management, Pompeu Fabra University’s business school, and an expert in American law.
Although the document describes how these products (the social networks) can manipulate people and are addictive by design, it argues that this is incompatible with commercial law. “The heavy fines imposed in the United States for privacy violations rest less on privacy as a fundamental right and more on violations of consumer protection law,” explains Cetina.
The change of course is no coincidence. In May of this year, when the educational institutions’ class action had already been filed, two important rulings were handed down that freed the platforms from responsibility for the content they distribute. That prompted the state attorneys general to shift the focus from something intangible, such as the effects a social network can have on the mind, to something more tangible: the fact that the company deceived children and parents by telling them that the platforms were harmless and that their data was protected.
Likewise, the focus has been placed on a single company, Meta, presumably because that is where the incriminating evidence or information is most solid, thanks in large part to Frances Haugen’s contributions. Cetina ventures a second compelling reason: “The redacted portions of the filing, which are confidential, suggest that the plaintiffs had the help of a protected witness with inside information that could compromise Meta.” In other words, there could be a new Haugen.
Problematic aspects
The lawsuit singles out several practices that, it argues, make Instagram and Facebook illegal and dangerous for children. These are the most important:
Consumer deception. The central argument rests on deception of the consumer, a “disregard for the well-being” and the “physical and mental health of minor users,” and the “knowing and intentional violation of consumer protection and minors’ privacy laws.” Meta, the lawsuit says, misled the public about the significant dangers of its social networks and chose to ignore the harm they cause to the mental and physical health of young people.
Business model. Meta has developed a business model for Facebook and Instagram aimed at maximizing the time young people spend on these services and the attention they devote to the platforms. It “developed and distributed a product with harmful features that psychologically manipulate minors in order to prolong and make compulsive their use of the platforms, while publicly stating that these features are safe and suitable for minors.”
Causes addiction. To serve this business model, Meta has implemented features including “infinite scrolling, ephemeral content, autoplay, the quantification and display of likes, and intrusive alerts, used unfairly and/or unconscionably to capture additional time and attention from young users whose developing brains were not equipped to resist these manipulative tactics,” the lawsuit says. “Meta unfairly and/or unconscionably exploited the psychological vulnerabilities of young users and cultivated a sense of ‘fear of missing out’ to push young users to spend more time on its social media platforms than they otherwise would.”
Slot machine effect. “Meta algorithmically served content to young users on ‘variable reinforcement schedules,’ thereby manipulating dopamine releases in young users and unfairly and/or unconscionably inducing them to use its products repeatedly, like a gambler at a slot machine,” the filing says.
Persistence. Despite its own research, analyses by independent experts and public data, “Meta refuses to abandon the harmful features of its services and has made efforts to distort, conceal and minimize the impact of its products on the mental and physical health of young people.”
Personal information. Another key element of the lawsuit concerns privacy. “Meta collected personal information from Instagram and Facebook users under the age of 13 without first obtaining verifiable parental consent,” in violation of U.S. regulations. This collection was carried out “illegally and without the consent of parents.” Meta “refuses to restrict the collection and use of personal information despite the law prohibiting it” and “has taken no action to obtain parental consent for the collection and monetization of minors’ personal information.”
Model extension. Finally, the company is accused of “expanding the use of these illegal and harmful practices to other products and platforms.” WhatsApp, Messenger and the Metaverse are mentioned.