How Cryptocurrency Scammers Are Embracing New AI Technology – DL News
- Pig butchering scammers are using new AI tools to trick victims.
- Prosecutors warn of a growing wave of new cases sweeping the world.
- Crypto scams cost victims more than $2.5 billion in 2022.
We can’t say we didn’t see it coming.
In the months since OpenAI launched ChatGPT and upended the startup ecosystem, the technology has been labelled a threat to journalism, screenwriting, and national security.
It should come as no surprise, then, that scammers and cybercriminals are exploiting large language models to steal cryptocurrency from their victims.
That's what happened to one man who struck up a conversation on Tandem, a language-exchange app that doubles as a dating platform.
The other person immediately convinced him to move the chat to WhatsApp.
Whatever the intended endgame, the scam was bungled when a long message revealed, "as a language model of 'me,' I don't have feelings or emotions like humans," exposing the use of an advanced chatbot that, if it wasn't ChatGPT, bore striking similarities to it.
Alarmed, the would-be victim contacted cybersecurity company Sophos, which confirmed that fears that criminals could exploit the new technology had come true.
“We can now say that, at least in the case of pig butchering scams, this is actually happening”
—Sean Gallagher
“We can now say that, at least in the case of pig butchering scams, this is actually happening,” said Sean Gallagher, principal threat researcher at Sophos, in a report highlighting the case.
Pig butchering scams are long cons in which scammers trick victims into investing in fraudulent cryptocurrency platforms over a period of time, showing fake gains to encourage ever-larger deposits, before cashing out and leaving victims penniless and ashamed of having been defrauded.
The FBI estimates that crypto scams like these cost victims more than $2.5 billion in 2022.
However, ChatGPT and similar tools will allow criminals “to scale up the situation exponentially worse than what we’re already seeing,” said Erin West, a California prosecutor who made a name for herself recovering millions of dollars in stolen cryptocurrency.
The technology will allow criminals to better personalise their conversations with victims, overcome language barriers, reach more people “and cause more destruction,” she told DL News, adding: “Adding ChatGPT makes it much more dangerous.”
This stern warning comes following reports of cybercriminals using similar tools to write malware at great speed.
“It wasn’t a question of if but when”
Security companies and law enforcement are alarmed by the news, but no one we spoke to was surprised.
“It wasn’t a question of if, it was a question of when,” Bobby Cornwell, vice president of strategic partner enablement and integration at cybersecurity firm SonicWall, told DL News.
These stories have been brewing since OpenAI unveiled ChatGPT in November.
The advanced chatbot was trained on oceans of data. Drawing on massive datasets of books, code, websites, and articles, it can write complex code and even hold conversations based on user prompts.
It was a runaway success for OpenAI. By January, it had already reached approximately 100 million monthly users, making it the fastest-growing app in history.
People have used it for everything from writing online dating profiles and scary stories to generating video scripts and debugging code.
Tech giants, including Google and Meta, have responded by accelerating the development of their own conversational AI tools: Bard and LLaMA.
(LLaMA is not affiliated with DL News parent company DefiLlama.)
In July, Tesla CEO Elon Musk launched a new company, x.AI, with the goal of creating a less “woke” alternative to OpenAI’s ChatGPT.
“The danger of training AI to be woke (in other words, to lie) is deadly,” Musk tweeted in December.
AI companies are estimated to have raised more than $25 billion from investors in the first half of 2023, according to Crunchbase.
The dark side of ChatGPT
Instead of celebrating, OpenAI CEO Sam Altman embarked on a world tour of sorts, warning of the dangers of limitless artificial intelligence.
He testified before the US Congress and told reporters he lost sleep over the fear that OpenAI had “done something really bad” by letting the proverbial AI genie out of the bottle.
Whether his statements stem from genuine fear or are part of an elaborate public relations strategy remains unclear.
Regardless of the motivation behind Altman’s statements, cybercriminals have already used ChatGPT or similar tools to fuel their schemes.
Cybercriminals even claim to have created their own versions of ChatGPT, stripped of the security measures installed by OpenAI.
They are advertising them on the dark web as tools to enhance phishing attacks and write malware, Wired reported in early August.
“Cybercriminals and the threat actor community are tech-savvy, motivated, and tend to be early adopters of new technology trends when they see benefits and improvements in how they can evolve their tools, techniques, and practices,” Gary Alterson, vice president of managed security at cybersecurity firm Kivu Consulting, told DL News.
“It’s no surprise then that threat actors are embracing generative AI because they can achieve the same productivity gains and potentially improve their ability to hide, conduct reconnaissance, create malicious code, fool people and security software, and more.”
These developments have left cybersecurity experts and law enforcement agencies scrambling to address the threat.
Cybersecurity companies need to “build more intelligence into our systems to search for AI-generated content,” Cornwell said, adding that creating such countermeasures will take time.
“It’s hard to fight a monster so big and so rich”
—Erin West
West said that while she and other crime fighters can warn of these threats, the criminal syndicates behind pig butchering scams were difficult to defeat even before they harnessed AI.
Many of the criminal networks are run by Chinese nationals operating in Cambodia, Laos, and Myanmar, according to the Global Anti-Scam Organization.
The billions made from their crimes allow them to pay off local law enforcement and evade international sanctions, West said.
“It’s hard to fight a monster so big and so rich,” she said.
OpenAI did not respond to our requests for comment.
Eric Johansson is DL News’ news editor, based in London. He covers crypto culture, investments, and policy. You can contact him at eric@dlnews.com or on Telegram at ericjohanssonlj.