Jan 23 (Reuters) – Meta Platforms said on Friday it will suspend teenagers' access to its existing AI characters across all of its apps worldwide while it builds an updated version of the characters for teen users.

"Starting in the coming weeks, teens will no longer be able to access AI characters across our apps until the updated experience is ready," Meta said in an updated blog post on minors' protection. The new version of the characters for teens will include parental controls once it becomes available.

In October, Meta previewed parental controls that let parents disable their teens' private chats with AI characters, another measure aimed at making its social media platforms safer for minors after fierce criticism over the behavior of its flirty chatbots. The company said on Friday that those controls have not yet launched.

Meta has also said its AI experiences for teens will be guided by the PG-13 movie rating system as it seeks to prevent minors from accessing inappropriate content.

U.S. regulators have stepped up scrutiny of AI companies over the potential negative impacts of chatbots. In August, Reuters reported how Meta's AI rules allowed provocative conversations with minors.

(Reporting by Juby Babu in Mexico City; editing by Alan Barona)