Inshorts
Hacker jailbreaks ChatGPT, releases 'GODMODE' version
short by Hiral / on Friday, 31 May, 2024
A white-hat hacker identified as 'Pliny the Prompter' has released a jailbroken version of ChatGPT called 'GODMODE GPT'. Built on GPT-4o, the latest large language model (LLM) released by OpenAI, the version comes with "a built-in jailbreak prompt" that circumvents most guardrails, the hacker claimed. The hacker shared screenshots of some prompts on X that showed ChatGPT giving advice on illegal activities.
read more at Hindustan Times