According to internal company documents released late Wednesday, Meta estimates that about 100,000 children who use Facebook and Instagram are exposed to sexual harassment online every day, including “images of adult genitals.”
The unsealed complaint contains several allegations against the company based on information the New Mexico Attorney General's Office obtained from presentations by Meta employees and communications between employees. The documents describe a 2020 incident in which the 12-year-old daughter of an Apple executive was solicited through IG Direct, Instagram's messaging product.
“Something like this upsets Apple so much that it is threatening to remove us from the App Store,” a Meta employee said, according to the documents. Late last year, a senior Meta employee testified before the US Congress about how his own daughter had been solicited via Instagram; his efforts to fix the problem were ignored, he said.
The filing is the latest in a lawsuit filed Dec. 5 by the New Mexico Attorney General's Office that alleges Meta's social networks have become marketplaces for child abuse. Raúl Torrez, the state's attorney general, has accused Meta of allowing adults to find, message and groom children. The company has denied the lawsuit's claims, saying it “misrepresents our work through selective citations and carefully selected documents.”
Meta issued a statement in response to Wednesday's filing: “We want teens to have safe, age-appropriate online experiences, and we have over 30 tools to support them and their parents. We've spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”
The lawsuit also referenced a 2021 internal child safety presentation. According to the lawsuit, one slide states that Meta “underinvests in minor sexualization on IG, particularly in relation to sexualized comments on content posted by minors. Not only is this a horrific experience for creators and bystanders, it is also an opportunity for bad actors to identify and connect with one another.”
The complaint also highlights Meta employees' concerns about child safety. In an internal Meta chat in July 2020, an employee asked, “What specifically are we doing about child grooming (something I just heard happens a lot on TikTok)?” According to the complaint, the response was: “Somewhere between zero and negligible.”
Meta's statement also said the company has “taken significant steps to prevent youth from experiencing unwanted contact, particularly from adults.”
The New Mexico lawsuit follows a Guardian investigation in April that revealed Meta was failing to report or detect the use of its platforms for child trafficking. The investigation also revealed how Messenger, Facebook's private messaging service, is used as a communication platform for human traffickers to buy and sell children.
Meta employees discussed using Messenger “to coordinate human trafficking activities” and noted that “every phase of human exploitation (recruitment, coordination, exploitation) is represented on our platform,” according to documents included in the lawsuit.
However, an internal email from 2017 describes executives' resistance to scanning Facebook Messenger for “harmful content” because it would put the service “at a competitive disadvantage compared to other apps that may offer more privacy,” according to the lawsuit.
In December, Meta received widespread criticism for introducing end-to-end encryption for messages sent on Facebook and Messenger. Encryption hides the contents of a message from everyone except the sender and intended recipient by converting text and images into unreadable ciphertext that is decrypted only upon receipt. Child safety experts, policymakers and law enforcement have argued that encryption hinders efforts to rescue child trafficking victims and prosecute offenders. Privacy advocates, however, praised the decision for protecting users from surveillance by governments and law enforcement agencies.
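For readers unfamiliar with the mechanics, the minimal Python sketch below illustrates the principle, using the symmetric Fernet scheme from the open-source cryptography package. It is a simplified example only, not Meta's implementation; end-to-end messaging systems use a more elaborate per-conversation key exchange so that the provider's servers never hold the keys.

```python
# Minimal illustration of encrypted messaging: without the key,
# the bytes that travel over the network are unreadable.
# Simplification: Fernet is symmetric (one shared key); real
# end-to-end protocols negotiate keys so the server never sees them.
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # held only by sender and recipient
cipher = Fernet(key)

token = cipher.encrypt(b"hello")  # what an intermediary would observe
print(token)                      # opaque ciphertext, e.g. b'gAAAAAB...'
print(cipher.decrypt(token))      # only a key holder recovers b'hello'
```

Anyone intercepting the token in transit sees only an opaque string; without the key, recovering the message is computationally infeasible, which is why the same property that shields users from surveillance also blocks platform-side scanning.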