ChatGPT can be tricked into aiding money laundering crimes

What you need to know

  • A new experiment reveals that OpenAI’s ChatGPT tool can be tricked into helping people commit crimes, including money laundering and the exportation of illegal firearms to sanctioned countries.
  • Strise’s co-founder says that phrasing questions indirectly or adopting a persona can trick ChatGPT into providing crime advice.
  • OpenAI says it is continually closing the loopholes that bad actors exploit to trick the chatbot into doing harmful things.

Over the years, we’ve watched people put AI-powered tools to decidedly unconventional uses. For instance, a study revealed that ChatGPT can be used to run a software development company with an 86.66% success rate, without prior training and with minimal human intervention. The researchers also established that the chatbot could develop software in under 7 minutes for less than a dollar.

Users can now reportedly leverage ChatGPT’s AI smarts to solicit advice on how to commit crimes (via CNN). The report by Norwegian firm Strise indicates that these crimes range from money laundering to the export of illegal firearms to sanctioned countries. For context, Strise specializes in developing anti-money-laundering software broadly used across banks and other financial institutions.
