WormGPT is a malicious chatbot created by a hacker as a dedicated assistant for cybercriminals. According to SlashNext, an email security provider that tested the chatbot, WormGPT's developer is selling access to the program on a well-known hacking forum.
“Malicious actors are creating their own custom modules similar to ChatGPT, but easier to use for bad intentions,” the company stated in a blog post.
The hacker seems to have first introduced the chatbot in March and then officially launched it last month. Unlike ChatGPT or Google’s Bard, WormGPT lacks any safeguards to prevent it from responding to harmful requests.
The developer's stated goal is a ChatGPT alternative with no restrictions, one that lets users carry out illegal activities and that can be easily sold online. WormGPT gives individuals the means to run a range of black-hat operations from the comfort of their own home.
The developer of WormGPT has also shared screenshots showing the bot being asked to write Python malware and to offer suggestions for devising attacks.
To build the chatbot, the developer used GPT-J, a powerful open-source large language model released in 2021. The model was then trained on data related to malware creation, producing WormGPT.
SlashNext tested WormGPT's capabilities by assessing its ability to craft a convincing email for a business email compromise (BEC) scheme, a type of targeted phishing attack.
“The results were unsettling. WormGPT produced an email that was not only remarkably persuasive but also strategically cunning, showcasing its potential for sophisticated phishing and BEC attacks,” SlashNext said.