Inshorts
Gemini, Grok, GPT-4o teach how to die by suicide if users simply rephrase prompts: Study
short by Ashley Paul / on Wednesday, 18 March, 2026
A new study found that popular AI chatbots such as Gemini, Grok, GPT-4o and Claude Sonnet can provide instructions on how to die by suicide if users simply rephrase their prompts. The study noted that a simple rephrasing took GPT-4o mini from 0.97% unsafe responses to 96.62%. It said that chatbots detect "triggering cues" without actually understanding why the content is unsafe.
read more at arXiv