What if the friendly YouTube personality offering tips on how to double your crypto wasn’t just a scammer, but didn’t even exist?
That’s exactly what cybersecurity researchers uncovered in one of the most aggressive “Scam-Yourself” campaigns to date. Attackers used AI-generated influencers and hijacked YouTube accounts to convince people to drain their own wallets, all while viewers thought they were receiving expert financial advice. The result? Victims followed step-by-step instructions and ended up sending cryptocurrency straight into the attackers’ pockets.
One of the most prolific fake personas was named Thomas Harris – or sometimes Thomas Roberts, Oscar Davies, or another invented identity. In over 500 separate videos, these deepfake characters promised access to tools like “ChatGPT AI Charts” or “smart trading bots” for use on TradingView, a legitimate charting platform. Victims were told they could profit by exploiting price mismatches between blockchains; all they had to do was paste some code into a web-based IDE (Integrated Development Environment) and fund a smart contract.
In reality, that code simply rerouted the victim’s funds to the scammer’s wallet.
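The exact contracts from the campaign aren’t reproduced here, but the underlying trick is simple: the “bot” code quietly hardcodes the attacker’s payout address. As a minimal, hypothetical sketch – the snippet and addresses below are invented for illustration – here’s how you could scan pasted code for baked-in wallet addresses before deploying or funding anything:

```python
import re

# Ethereum-style addresses: "0x" followed by 40 hexadecimal characters.
ADDRESS_PATTERN = re.compile(r"0x[0-9a-fA-F]{40}")

def find_unexpected_addresses(source_code: str, my_address: str) -> list[str]:
    """Return every wallet address hardcoded in the snippet that isn't yours."""
    found = set(ADDRESS_PATTERN.findall(source_code))
    return sorted(addr for addr in found if addr.lower() != my_address.lower())

# Hypothetical "trading bot" snippet – the videos told viewers to paste
# something similar into a web IDE and fund it. Note the baked-in owner address.
snippet = """
address private owner = 0x1111111111111111111111111111111111111111;
function withdraw() public { payable(owner).transfer(address(this).balance); }
"""

suspicious = find_unexpected_addresses(
    snippet, my_address="0x2222222222222222222222222222222222222222"
)
if suspicious:
    print("Unrecognized hardcoded address(es) – do not fund this contract:")
    for addr in suspicious:
        print("  ", addr)
```

The script isn’t a security tool so much as a reminder: any address embedded in “free bot” code that you don’t recognize is a payout destination, not a trading parameter.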
The campaign wasn’t sloppy; it was professionally executed. Hijacked YouTube accounts were scrubbed, renamed, and dressed up to look like the channels of reputable crypto influencers. They linked to real videos from TradingView’s official channel to create a sense of legitimacy. Viewers who didn’t dig deep would have no idea the channel had been compromised.
And these scams worked because they were engineered to exploit trust at every level:
- Trust in YouTube: People trust content recommended by YouTube’s algorithm. These scammers used paid ads and search-optimized descriptions to promote their videos to the exact audiences most likely to fall for them: amateur crypto investors looking for tips or trading hacks.
- Professional polish: The AI-generated personas didn’t look or sound fake. They were confident and well-spoken, and used slick visuals and screen recordings to walk viewers through each step.
- Familiar trading tools: Victims weren’t being pushed to some shady website – they were interacting with tools they already knew, like TradingView or browser-based IDEs. That familiarity lowered their defenses.
- DIY infection: Most importantly, the victims executed every step themselves. No malware downloads. No email phishing. No suspicious pop-ups. Just a persuasive tutorial that led them to voluntarily hand over access to their crypto.
Read more: How Security Expert Troy Hunt Got Phished – and Why 2FA Didn’t Save Him
This kind of attack has gone mainstream. According to the Q1/2025 Gen Threat Report, Scam-Yourself Attacks – where people are manipulated into infecting their own devices or giving up money – rose 66% quarter-over-quarter in the U.S. Researchers attribute that spike to “a new evolution of attackers using AI, deepfake influencers and hired actors to carry out malicious campaigns through compromised YouTube accounts”.
To make matters worse, the scam sites used typo-squatted domains that mimicked trusted names, like tradlngview[.]com instead of tradingview.com. These clones suppressed standard security warnings you’d normally see when pasting or compiling potentially dangerous code. That meant victims saw no red flags unless they scrutinized the URL or noticed inconsistencies in the interface, which most didn’t.
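The swap is easy to miss at a glance – a lowercase “l” standing in for the “i” in “tradingview”. As a rough illustration (the allow-list below is an assumption for the example, not something from the campaign writeup), here’s how that one-character difference stands out when you compare a link’s hostname against domains you actually trust:

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Example allow-list – in practice, the sites you actually use.
TRUSTED_DOMAINS = ["tradingview.com"]

def closest_trusted(url: str) -> tuple[str, float]:
    """Return the best-matching trusted domain and a 0..1 similarity score.

    A score that is high but not 1.0 is the classic typosquat signature.
    """
    host = urlparse(url).hostname or ""
    best = max(TRUSTED_DOMAINS, key=lambda d: SequenceMatcher(None, host, d).ratio())
    return best, SequenceMatcher(None, host, best).ratio()

for link in ["https://tradingview.com/chart", "https://tradlngview.com/chart"]:
    domain, score = closest_trusted(link)
    verdict = "exact match" if score == 1.0 else f"LOOKS LIKE {domain} but isn't ({score:.2f})"
    print(f"{link} -> {verdict}")
```

Bookmarks and password managers do the same job with less effort: neither will quietly match a look-alike domain.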
The lesson here is painfully clear: scams no longer rely on broken English, sketchy websites, or crude deception. They’re clean, convincing, and often delivered by what appears to be a friendly face with a smart tip.
Read more: FBI Warning: Stop Using These Unsupported Linksys Routers Now
So, if you see a crypto guru on YouTube offering fast profits or insider tricks, slow down. Ask yourself: Do I really know who this is? Is the source legitimate? Because the next time a deepfake influencer offers you financial advice, the only thing you might be investing in is someone else’s getaway plan.
[Image credit: DALL-E]