Face-Swapping Scams And Why Video Calls Can’t Be Trusted Anymore

AI-powered face-swapping scams make video calls dangerously deceptive.


There was a time when seeing someone on a video call was enough to confirm their identity. Their mouth moved in sync with their voice, their eyes followed yours, and their expressions felt natural. But with AI-powered face-swapping, even that can be faked.

Face-swapping technology has advanced so much that scammers can now alter their appearance in real time, making it almost impossible to tell whether the face on screen is genuine. Paired with voice-cloning software, they can convincingly pretend to be someone else—whether it’s a romantic interest, a company executive, or even a family member.

How Scammers Are Using It

Face-swapping isn’t just for entertainment anymore. It’s a powerful tool for fraud. Scammers have used it in romance scams, tricking victims into believing they are in a relationship with someone who doesn’t exist. They create convincing video calls using AI to manipulate their appearance, gaining trust before asking for money. Many victims have lost thousands, some even their life savings.

It’s not just individuals being targeted. Businesses have also fallen victim to deepfake scams, where criminals impersonate CEOs or senior staff to trick employees into transferring money. Some companies have lost millions simply because they believed they were taking instructions from their boss on a video call.

Fake Faces, Fake Voices

Face-swapping alone is bad enough, but when combined with AI-generated voice cloning, the deception becomes even harder to detect. Scammers no longer need to rely on text messages or emails—they can make a phone call or video call sound just like the person they’re pretending to be.

With these tools, they can pressure victims into making urgent decisions, whether it’s transferring money, sharing sensitive information, or even providing access to secure accounts. The amounts lost in these scams can be staggering, with some victims handing over RM79,000 (USD 16,800) or more before realising they’ve been duped.

A Growing Global Problem

These scams aren’t just isolated incidents—they’re part of a much bigger criminal network. Money stolen from victims often funds serious crimes such as drug trafficking, human smuggling, and organised fraud rings. The scammers are constantly improving their tactics, making it harder for authorities to keep up.

As AI technology continues to evolve, verifying someone’s identity is no longer as simple as seeing them on a screen or hearing their voice. The safest approach? Always be sceptical, double-check through multiple channels, and never rush into financial decisions just because someone on a call tells you to.

