
Meta adds parental controls for AI-teen interactions


2025-10-18 05:29 Last Updated At:05:40

Meta is adding parental controls for kids' interactions with artificial intelligence chatbots — including the ability to turn off one-on-one chats with AI characters altogether — beginning early next year.

But parents won't be able to turn off Meta's AI assistant, which Meta says “will remain available to offer helpful information and educational opportunities, with default, age-appropriate protections in place to help keep teens safe.”

Parents who don't want to turn off all chats with all AI characters will also be able to block specific chatbots. And Meta said Friday that parents will be able to get “insights” about what their kids are chatting about with AI characters — although they won't get access to the full chats.

The changes come as the social media giant faces ongoing criticism over harms to children from its platforms. AI chatbots are also drawing scrutiny over their interactions with children that lawsuits claim have driven some to suicide.

Even so, more than 70% of teens have used AI companions and half use them regularly, according to a recent study from Common Sense Media, a nonprofit that studies and advocates for using screens and digital media sensibly.

On Tuesday, Meta announced that teen accounts on Instagram will be restricted to seeing PG-13 content by default and won’t be able to change their settings without a parent’s permission. This means kids using teen-specific accounts will see photos and videos on Instagram that are similar to what they would see in a PG-13 movie — no sex, drugs or dangerous stunts.

Meta said the PG-13 restrictions will also apply to AI chats.

Children's online advocacy groups, however, are skeptical about Meta's intentions.

“Meta’s new parental controls on Instagram are an insufficient, reactive concession that wouldn’t be necessary if Meta had been proactive about protecting kids in the first place,” said James Steyer, Common Sense Media founder and CEO. “On top of this, Meta is taking its sweet time, waiting months to implement this new feature at a pivotal moment where every second counts.”

“For too long, this company has put the relentless pursuit of engagement over our kids’ safety, ignoring warnings from parents, experts, and even its own employees.”

Meta AI chatbots, Steyer added, “are not safe for anyone under 18.”

Common Sense Media does not recommend minors use AI chatbots of any kind.

FILE - Meta Chief Product Officer Chris Cox speaks at LlamaCon 2025, an AI developer conference, in Menlo Park, Calif., April 29, 2025. (AP Photo/Jeff Chiu, File)

SANTA FE, N.M. (AP) — Two landmark jury verdicts against social media companies have arrived at the front of a wave of lawsuits alleging that the popular platforms endanger the mental health of children.

Financial penalties total $381 million in the two cases involving tech giant Meta in New Mexico and both Meta and YouTube in California. The verdicts highlight a growing shift in the public perception of social media companies and their responsibilities toward child safety.

But it may be too soon to tell whether litigation will change the way popular social media and messaging platforms function — or influence the complex algorithms that deliver content to billions of users worldwide.

Here are looming questions as related lawsuits approach trial.

Will the penalties hurt the companies financially? The answer is not really, or at least not yet.

Meta — the owner of Instagram, Facebook and WhatsApp — says it had $201 billion in sales last year.

That revenue stream dwarfs the $375 million in civil penalties imposed on Tuesday by a jury in New Mexico with a verdict that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its social media platforms.

Meta said it disagrees with the verdicts and plans to appeal the jury's finding that it violated the state Unfair Practices Act.

And tech companies still are shielded from legal responsibility for posted content, based on Section 230 of the 1996 Communications Decency Act.

Investors are shrugging off the verdicts. Meta’s stock closed slightly higher Wednesday, although it is down about 8% year-to-date.

The verdicts this week don't mandate specific changes to the design of social media platforms, nor to the algorithms that make them tick.

But a second phase of the New Mexico trial in May, before a judge with no jury, could spell out changes for Meta's platforms for local users by court order.

A state district court judge will determine whether Meta created a public nuisance — and could impose restrictions and order the company to pay for programs that remedy potential harms to children.

New Mexico Attorney General Raúl Torrez, who filed the lawsuit against Meta in 2023, says his office wants improvements to Meta's enforcement of minimum age limits and removal of sexual predators — in part by lifting encryption on communication that can interfere with police work.

Meta says it continuously works to improve safety and already has made changes that phase out encryption on Instagram and limit access to explicit content by teenagers, block unsolicited messages to children from adults and help young users manage time spent on its platforms and avoid sleep disruptions.

Both the California and New Mexico trials highlighted the addictive properties of platform algorithms and the negative impacts on child mental health.

In New Mexico, a jury in Santa Fe arrived at the $375 million fine against Meta by endorsing the maximum penalty of $5,000 per violation of state consumer protection law — multiplied by thousands of social media accounts for children under 18.

Prosecutors intend to pursue more damages in that trial's second phase, while an appeal could delay payment — or reverse penalties.

In California, the jury ruled that Meta and Google's video streaming platform YouTube must pay at least $3 million in damages to a 20-year-old woman who says she became addicted to social media as a child, exacerbating her mental health struggles. TikTok and Snap settled before the trial began.

California jurors recommended an additional $3 million in punitive damages pending a judge's final review.

Google defends YouTube as a responsibly built streaming platform, and not a social media site.

The California verdict has much broader legal and financial implications. The case was designated as a bellwether test that might guide the resolution of other lawsuits. There are thousands of those lawsuits pending, including hundreds in California.

The New Mexico verdict may be an early indicator for lawsuits brought by other publicly elected prosecutors.

Attorneys general in more than 40 states have filed suit against Meta, claiming it is contributing to a mental health crisis among young people. Most are pursuing remedies in U.S. federal court.

A recording of Meta Founder and CEO Mark Zuckerberg's deposition is played for the jurors on Wednesday, March 4, 2026, in Santa Fe, N.M. (Jim Weber/Santa Fe New Mexican via AP, Pool)

Meta attorney Kevin Huff makes closing arguments, Monday, March 23, 2026, in state court, in Santa Fe, N.M., in a trial where the social media conglomerate is accused of misleading its users about how safe its platforms are for children. (Eddie Moore/The Albuquerque Journal via AP, Pool)
