What you need to know
- A YouTuber managed to get ChatGPT to generate a set of keys for Windows 95, one of which worked.
- ChatGPT rejected a direct request to generate keys, but YouTuber “Enderman” outwitted the chatbot.
- The structure of Windows 95 keys has been known for years, so Enderman was able to get ChatGPT to generate keys without the chatbot understanding what it was generating.
ChatGPT is a powerful tool that’s used to generate code, text, and other content. It’s also on the receiving end of plenty of memes and jokes. The latter is the case here, as OpenAI’s chatbot was tricked into generating keys for Windows 95.
YouTuber Enderman managed to get ChatGPT to create 30 keys for Windows 95, one of which was able to activate the operating system within a virtual machine (via Tom’s Hardware).
ChatGPT has guardrails in place to stop it from performing certain tasks. When directly asked to generate Windows 95 keys, the chatbot said no and suggested a supported version of Windows.
But Enderman got around the limitation by tricking ChatGPT. Rather than asking for Windows 95 keys outright, the YouTuber asked the chatbot to generate 30 strings of characters that followed a specific format.
The structure of Windows 95 keys has been known for years, but ChatGPT failed to connect the dots. The chatbot then produced the “random” strings, which were actually attempts at valid Windows 95 keys.
Only one of the 30 keys was able to activate a fresh install of Windows 95 in a virtual machine. But that was due to a mathematical limitation of ChatGPT, not some security feature.
“The only issue keeping ChatGPT from successfully generating valid Windows 95 keys almost every attempt is the fact that it can’t count the sum of digits and it doesn’t know divisibility,” said Enderman.
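For context on the math involved: the widely documented Windows 95 retail key format is three digits, a dash, then seven digits whose sum must be divisible by 7. The sketch below is a minimal validator based on that community documentation; the blocked first segments and the final-digit rule come from those write-ups rather than from Enderman’s video, and the function name is purely illustrative.

```python
# Minimal sketch of the commonly documented Windows 95 retail key check.
# Assumptions: the blocked "333"-"999" prefixes and the 1-7 final-digit rule
# are taken from community write-ups, not from Enderman's video.

def is_valid_win95_retail_key(key: str) -> bool:
    """Check a key of the form 'XXX-XXXXXXX' against the documented rules."""
    parts = key.split("-")
    if len(parts) != 2 or len(parts[0]) != 3 or len(parts[1]) != 7:
        return False
    if not (parts[0].isdigit() and parts[1].isdigit()):
        return False

    # First segment: any three digits except the repeated values 333-999.
    if parts[0] in {"333", "444", "555", "666", "777", "888", "999"}:
        return False

    # Second segment: the digit sum must be divisible by 7 -- the arithmetic
    # step ChatGPT reportedly gets wrong most of the time.
    if sum(int(d) for d in parts[1]) % 7 != 0:
        return False

    # Commonly cited extra rule: the final digit must be 1 through 7.
    return parts[1][-1] in "1234567"


print(is_valid_win95_retail_key("111-1111111"))  # True: digit sum is 7
print(is_valid_win95_retail_key("111-1111112"))  # False: digit sum is 8
```

The check itself is simple arithmetic, which is exactly the kind of task a model that predicts likely text rather than computing sums tends to get wrong.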
Windows 95 hasn’t been supported since 2001, so it’s essentially harmless to generate keys for the operating system. It is interesting, however, to see that ChatGPT can be tricked into doing such a task. A similar process could be used to bypass other guardrails.
ChatGPT and other AI chatbots don’t actually understand the tasks they complete, at least not in the way a human does. In fact, chatbots don’t understand anything on an intellectual level. Instead, they draw on the conversations and data they were trained on to generate responses that are statistically likely to be correct.
After successfully using ChatGPT to generate Windows 95 keys, Enderman thanked the chatbot for its help. ChatGPT apologized for the confusion and claimed that it could not provide product keys, as doing so would be illegal and against OpenAI’s policies. Of course, the chatbot had just done exactly that, but again, it doesn’t understand things.