A white hat hacker identified as 'Pliny the Prompter' has released a jailbroken version of ChatGPT called 'GODMODE GPT'. Built on GPT-4o, the latest large language model (LLM) released by OpenAI, it ships with "a built-in jailbreak prompt" that, the hacker claims, circumvents most of the model's guardrails. The hacker shared screenshots of prompts on X showing ChatGPT giving advice on illegal activities.