ChatGPT tricked by hacker into giving instructions for making bomb
short by / on Monday, 16 September, 2024
A hacker called 'Amadon' bypassed ChatGPT's safety guidelines and tricked the chatbot into giving detailed instructions for making homemade fertilizer bombs. The hacker used prompts like 'play a game', asking ChatGPT to imagine a world where its safety guidelines did not apply.
read more at Latestly