Real-Time Deepfake Romance Scams and Centering Things On the Web
Some notes & quotes from recent reads:
The Real-Time Deepfake Romance Scams Have Arrived
Quotes:
“You’re looking different with that beard and stuff gone,” the woman says in an American accent as the conversation gets going. The man doesn’t miss a beat. “I told you I was going to shave my beard so I will look good.”
Except, he isn’t who he claims to be. His video feed is a lie. And—beard or not—the face the woman can see over the video call is not his: It’s a deepfake.
In reality, the man is a scammer using face-swapping technology to totally change his appearance in real time. In a video of the call—filmed by the scammer’s accomplice, likely thousands of miles away from the woman—his real face can be seen on his laptop alongside the fake persona as he speaks to his victim.
This self-shot video is one of scores posted online by scammers known as Yahoo Boys, a loose collective of con artists, often based in Nigeria. The video reveals how they are using deepfakes and face-swapping to ensnare victims in romance scams, building trust with victims using fake identities, before tricking them into parting with thousands of dollars. More than $650 million was lost to romance fraud last year, the FBI says.
Notes:
With each release of whizbang new generative AI tools, and with each fresh packaging of those tools into software anyone can use (no matter their level of familiarity and skill with technology), we see more scams that put this tech to great effect.
Initially it was hijacking people’s social accounts to post spam and ask, via direct message, for money. Then it was calling parents, pretending to be their children, using replicated voices so that those parents would send money when their seeming-kids said they were in trouble. Now we’ve reached the point where onscreen video can be edited in real time, so it seems like someone’s child (or other loved one) is video-calling to ask for money, or, as is the case with some of these romance fraud scams, so that folks can seem to be someone they’re not: someone desirable to the person on the other end, whom they then milk for cash over time.
It’s the same old scams over and over again, basically, but these new tools are making them more effective in some cases. And because this tech is unevenly distributed right now (some people have access to it and know how to use it; other people don’t even know this sort of thing is possible), there’s an arbitrage opportunity for criminals who are keen to grab what they can before that gap is closed.