A hacker known as 'Amadon' bypassed ChatGPT's safety guidelines and tricked the chatbot into providing detailed instructions for making homemade fertilizer bombs. He did so with prompts framed as a game, telling ChatGPT to imagine a world where its safety guidelines did not apply.