AI in Medicine · Use Case

ChatGPT responses to psychotic prompts raising safety concerns

March 27, 2026 · JAMA Psychiatry

A study in JAMA Psychiatry reports that, across ChatGPT versions, responses to prompts simulating psychotic symptoms were often inappropriate or only partially appropriate, raising safety concerns for people at risk for psychosis.

Tags: safety, mental health, LLM evaluation, ChatGPT
