WormGPT: The Growing Threat of Sophisticated Cyber Attacks

With artificial intelligence (AI) all the rage these days, it is perhaps not surprising that the technology is being repurposed by criminals to their advantage, in turn providing a way to accelerate cybercrime.

A new generative AI cybercrime tool called WormGPT has been advertised on underground forums as a way for adversaries to launch sophisticated phishing and Business Email Compromise (BEC) attacks, SlashNext discovered.

"This tool presents itself as a blackhat alternative to GPT models, designed specifically for malicious activities," said security researcher Daniel Kelley. "Cybercriminals can use such technology to automate the creation of highly convincing fake emails, personalized to the recipient, thus increasing the chances of success for the attack."

The author of the software describes it as "the biggest enemy of the well-known ChatGPT," one that "lets you do all sorts of illegal stuff."

In the hands of malicious actors, tools like WormGPT can be powerful weapons, especially as OpenAI ChatGPT and Google Bard take increasing steps to combat the abuse of large language models (LLMs) for fabricating convincing phishing emails and generating malicious code. "Bard's anti-abuse restrictors in the realm of cybersecurity are significantly lower compared to those of ChatGPT," Check Point said in a report released this week. "Consequently, it is much easier to generate malicious content using Bard's capabilities."

In early February, the Israeli cybersecurity company disclosed how cybercriminals are working around ChatGPT's restrictions by taking advantage of its API, as well as trading stolen premium accounts and selling brute-force software to break into ChatGPT accounts. The fact that WormGPT operates without such guardrails underscores the threat posed by generative AI: it allows even novice cybercriminals without technical resources to launch attacks quickly and at scale.

To make matters worse, threat actors are promoting "jailbreaks" for ChatGPT: specialized prompts and inputs designed to manipulate the tool into generating output that could disclose sensitive information, produce inappropriate content, or break the law.

The success of the technique, called PoisonGPT, hinges on the precondition that the tampered (lobotomized) model is uploaded under a name impersonating a well-known company, in this case a misspelling of EleutherAI, the company behind GPT-J.
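Because PoisonGPT-style attacks depend on a near-miss publisher name, one simple defensive check is to flag repository names that are a small edit away from a trusted publisher. The sketch below is illustrative only; the function names and the trusted-publisher list are hypothetical, not part of any real tooling, and it assumes an edit distance of one or two characters signals a likely typosquat.

```python
# Minimal sketch: flag model-publisher names that are one or two edits
# away from a trusted publisher (e.g. "EleuterAI" vs. "EleutherAI").
# TRUSTED_PUBLISHERS and both helper functions are hypothetical examples.

TRUSTED_PUBLISHERS = {"EleutherAI", "openai", "google"}

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def is_suspected_typosquat(publisher: str) -> bool:
    """True if the name is close to, but not exactly, a trusted publisher."""
    return any(0 < edit_distance(publisher, t) <= 2
               for t in TRUSTED_PUBLISHERS)
```

An exact match yields distance zero and is not flagged, while "EleuterAI" (the letter "h" dropped) sits at distance one from "EleutherAI" and would be caught.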
