Mark Zuckerberg says criminal behavior on Facebook is inevitable


Harms to children, such as sexual exploitation and damage to mental health, are inevitable on Meta platforms, the company’s CEO, Mark Zuckerberg, and Instagram head Adam Mosseri said in recorded statements played at a trial in New Mexico on Tuesday and Wednesday.

“I just think that if you’re serving billions of people, the unfortunate reality is that a very small percentage of them are going to be criminals, and we should work as hard as we can to stop that activity,” Zuckerberg said. “I don’t think the standard for our platforms is to assume they will ever be perfect.”

Meta’s apps, including Facebook, Instagram and WhatsApp, are among the most popular in the world, together reaching more than 3 billion people each month.

The lawsuit pits the social media giant against New Mexico’s attorney general, Raúl Torrez, who alleges that Meta’s platforms put profits and user engagement ahead of child safety and has accused the company of knowingly allowing predators to use Facebook and Instagram to exploit children. Meta disputes the allegations, citing changes it has introduced, including Teen Accounts with default protections, which debuted in 2024. The trial, which began in early February, is expected to last about seven weeks.

“We have strong, long-standing rules against child exploitation and have invested billions to combat it, both through proactive detection technology and security features designed to prevent harm,” a Meta spokesperson said.

“We provide industry-leading transparency, regularly sharing data on how much violating content we remove and how much we miss. No system can be perfect, and we have never claimed to be.”

Jurors were shown taped statements by Zuckerberg and Mosseri filmed between March and July of last year. The jury also heard that family members of Meta employees had experienced sexual solicitation on Instagram.

Lawyers for the state also presented evidence that the company estimated in 2020 that 500,000 children received sexually inappropriate communications on Instagram each day, including grooming, in which adults try to build relationships with minors for sexual purposes.

In a statement, a Meta spokesperson said the detection technology the company used at the time was overly broad and cautious, so the count included interactions that were not in fact inappropriate.

The company identified its “People You May Know” algorithm, which suggests accounts for users to connect with, as the main driver of these interactions; the tool was how offenders found victims in 79% of the cases identified in 2018. At that time, around 30% of adults whose accounts had been disabled for targeting children had returned to the platform and resumed that behavior, the court heard.

Jurors heard that Zuckerberg authorized end-to-end encryption for Facebook Messenger in 2023 despite warnings from the child safety groups Thorn and the National Center for Missing and Exploited Children (NCMEC) that the move could pose risks to children. In a recorded statement played at the trial, he said the privacy that encryption offers users was the more pressing issue. End-to-end encryption prevents anyone other than the sender and the intended recipient from seeing messages by converting text and images into unreadable ciphertext that is decrypted only on the recipient’s device, so the content cannot be read on Meta’s servers.
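As a rough illustration of that principle, the sketch below uses the open-source PyNaCl library (an assumption chosen for illustration; Messenger’s actual Signal-style protocol is far more involved). It shows how a message encrypted on the sender’s device can be read only by the recipient, while any server relaying it handles nothing but ciphertext.

```python
# A minimal sketch of public-key, end-to-end encryption using PyNaCl.
# Illustrative only; this is not Meta's actual Messenger protocol.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
sender_private = PrivateKey.generate()
recipient_private = PrivateKey.generate()

# The sender encrypts with their own private key and the recipient's public key.
sender_box = Box(sender_private, recipient_private.public_key)
ciphertext = sender_box.encrypt(b"see you at 6")

# A relaying server only ever sees this unreadable ciphertext.
print(ciphertext.hex())

# Only the recipient can decrypt, using their private key and the sender's public key.
recipient_box = Box(recipient_private, sender_private.public_key)
assert recipient_box.decrypt(ciphertext) == b"see you at 6"
```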

A company spokesperson added that Meta can still review and take action on encrypted messages if they are reported by a user.

Child safety groups and authorities have warned that encrypting Messenger allows predators to share images of child sexual abuse without detection. Early in the trial, a law enforcement official testified that reports of child sexual abuse material from the platform decreased after encryption was introduced.

“I think what people want is end-to-end encrypted messaging services,” Zuckerberg said in a recorded statement in March 2025. “They really care about privacy.”

Mosseri said in his recorded statement that the company has “developed technology that allows us to find accounts that have exhibited potentially suspicious behavior, for example, an adult account that may have been blocked by another youth, and prevent those accounts from interacting with youth accounts.”

“We use a variety of signals to identify adults who have displayed potentially suspicious behavior and avoid recommending these accounts to teens through Facebook’s ‘People You May Know’ and Instagram’s ‘Accounts You Should Follow’ features,” a Meta spokesperson said.

“In 2025, we used these signals to identify more than 265 million Facebook accounts and more than 135 million Instagram accounts that had displayed potentially suspicious behavior, and proactively prevent them from finding, following or interacting with teenagers.”

An internal presentation discussed at trial acknowledged that Instagram’s safety and well-being team did not always prevent teen accounts from being recommended to potential offenders, and vice versa. An internal audit in December 2022 found that Meta was still recommending minors’ accounts to some adults.

In September 2024, Meta introduced Teen Accounts, which automatically place users under 18 on stricter settings on Instagram, Facebook, and Messenger, including making profiles private by default and limiting who can message them. Researchers have identified gaps in those protections, including exposure to harmful videos through hashtags or recommendations and cases where safety features did not work as intended.

“I certainly want to address any issue that is even remotely as serious as something like sexual solicitation … Any negative action that occurs offline, and to some extent, occurs online as well,” Mosseri said. “We are connecting billions of people. That will mean good and bad things will happen.”
