Meta Ordered to Pay $375 Million for Putting Children in Danger

A court in the US state of New Mexico has ordered Meta to pay $375 million after a jury found the company put children at risk and lied to the public about it. Meta owns Facebook, Instagram and WhatsApp.
This is a historic moment: it is the first time a US state has successfully sued Meta over child safety.

What the Court Decided

The jury found that Meta broke New Mexico's consumer protection law by misleading people about how safe its platforms are for young users.

Over a seven-week trial, jurors saw internal Meta documents and heard from former company employees. The evidence showed that Meta knew child predators were using its platforms and did not stop them.

The Whistleblower Who Spoke Out

One of the most powerful voices in the court belonged to Arturo Béjar. He was a top engineer at Meta until he left the company in 2021. Later, he became a whistleblower.

He told the court about tests he did on Instagram. Those tests showed that underage users were being shown sexual content.

He also shared something very personal. His own young daughter was approached by a stranger on Instagram who tried to engage her in sexual activity.

The Numbers Were Shocking

Lawyers for the state presented internal research from Meta to the jury. The research found that at one point, 16% of all Instagram users reported seeing unwanted nudity or sexual content in just one week.

That means about one out of every six users saw this kind of content, and many of those users were children.

How Was the Fine Calculated?

The jury found that Meta broke New Mexico's Unfair Practices Act thousands of times. Each violation can result in a penalty of up to $5,000. When all the violations were added together, the total came to $375 million; at the $5,000 maximum, that figure corresponds to 75,000 violations.

What Meta Says

Meta disagrees with the jury's decision and plans to appeal the order. A Meta spokesperson said: "We work very hard to keep people safe on our platforms and remain confident in our record of protecting teens online."

The company also pointed to steps it has taken in recent years. These include launching Teen Accounts on Instagram in 2024 and adding a new feature last month that alerts parents if their child searches for content related to self-harm.

The Bigger Problem

This case is only one part of a much larger fight. Meta is also facing a separate trial in Los Angeles. In that case, a young woman claims she became addicted to Instagram as a child because the app was deliberately designed to keep her hooked.

Across the United States, thousands of similar lawsuits are moving through the court system.

New Mexico first sued Meta in 2023. The state claimed the company used its recommendation algorithms (the technology that decides what content a user sees) to push young people toward sexually explicit material, child abuse content, and even sex trafficking.

What the Attorney General Said

New Mexico Attorney General Raul Torrez spoke strongly after the verdict.

He said that Meta’s leaders knew their products harmed children. They ignored warnings from their own workers and lied to the public about what they knew.

He added that today, the jury joined families, teachers, and child safety experts in saying enough is enough.

With Meta's appeal pending, the case is expected to have major consequences for how tech companies handle child safety across the country.
