London (dpa) – Artificial intelligence (AI) not only "hallucinates" false information, it can also trigger something similar in vulnerable users, say health experts from leading universities researching the emerging phenomenon of "AI psychosis".

Several recently published studies have found that AI can alter perceptions of reality as part of a "feedback loop" between chatbots and mental illness, reinforcing any delusional beliefs a patient may have.

"While some users report psychological benefits, concerning edge cases are emerging, including reports of suicide, violence and delusional thinking linked to perceived emotional relationships with chatbots," a University of Oxford and University College London team said in a pre-print paper.

"The rapid adoption of chatbots as personalised social companions" is not being scrutinized enough, the team warned.

Another paper, by researchers at King’s College London and the City University of New York, pointed to 17 cases of psychosis diagnoses following engagement with bots such as ChatGPT and Copilot.

"AI may mirror, validate or amplify delusional or grandiose content, particularly in users already vulnerable to psychosis, due in part to the models’ design to maximise engagement and affirmation," the second team said.

According to the science journal Nature, psychosis can include "hallucinations, delusions and false beliefs" and "can be triggered by brain disorders such as schizophrenia and bipolar disorder, severe stress and drug use."

Other recently published research showed that chatbots appear to encourage people who give indications that they are contemplating suicide.

AI has become notorious for "hallucinations" – generating inaccurate or exaggerated responses to queries and prompts – with more recent research suggesting this trait cannot be eradicated from the chatbots.