Jan 2 (Reuters) – Elon Musk's xAI artificial intelligence chatbot Grok said on Friday that lapses in safeguards had resulted in "images depicting minors in minimal clothing" on social media platform X and that improvements were being made to prevent this.

Screenshots shared by users on X showed Grok's public media tab filled with images that users said were generated when they uploaded photos and prompted the bot to alter them.

"There are isolated cases where users prompted for and received AI images depicting minors in minimal clothing," Grok said in a post on X. "xAI has safeguards, but improvements are ongoing to block such requests entirely."

"As noted, we've identified lapses in safeguards and are urgently fixing them—CSAM is illegal and prohibited," Grok said, referring to child sexual abuse material. Grok gave no further details.

In a separate reply to a user on X on Thursday, Grok said most cases could be prevented through advanced filters and monitoring, although it said "no system is 100% foolproof," adding that xAI was prioritising improvements and reviewing details shared by users.

When contacted by Reuters for comment by email, xAI replied with the message "Legacy Media Lies".

(Reporting by Arnav Mishra and Akash Sriram in Bengaluru; Editing by Timothy Heritage)