
Voice Clones Have Crossed the Uncanny Valley [Sponsor]

Mar 11, 2024 — 13:25 UTC


Now, don’t get offended, but you aren’t as good at clocking deepfakes as you think you are.

And it’s not just you; nobody’s that good at it. Not your mom, not your boss, not anyone in your IT department.

To make matters worse, you probably think you can spot a fake. After all, you see weird AI-generated videos of celebrities on social media and they give you that uncanny valley tingle. But it’s a different ballgame when all you’ve got to go on is a voice. 

In real life, people only catch voice clones about 50% of the time. You might as well flip a coin.

And that makes us extremely vulnerable to attacks.

In the “classic” voice clone scam, the caller is after an immediate payout (“Hi, it’s me, your boss. Wire a bunch of company money to this account ASAP”). Then there are the more complex social engineering attacks, where a phone call is just the entry point for breaking into a company’s systems to steal data or plant malware (that’s what happened in the MGM attack, albeit without the use of AI).

As more and more hackers use voice cloning in social engineering attacks, deepfakes are becoming such a hot-button issue that it’s hard to separate the fear-mongering (no, it doesn’t take just three seconds of audio to clone a voice) from the actual risk.

To disentangle the true risks from the exaggerations, we need to answer some basic questions:

  1. How hard is it to deepfake someone’s voice? 
  2. How do hackers use voice clones to attack companies?
  3. And how do we guard ourselves against this… attack of the clones?

Like a lot of attacks built on modern technology, deepfake attacks actually exploit some deep-seated fears. Fears like “your boss is mad at you.” Social engineers have preyed on these anxieties since the dawn of the scam, and voice clones give those old tactics a potent new edge.

But the good news is that we can be trained to look past those fears and recognize a suspicious phone call, even if the voice sounds just like someone we trust.

If you want to learn more about our findings, read our piece on the Kolide blog. It’s a frank and thorough exploration of what we should be worried about when it comes to audio deepfakes.

Our thanks to Kolide for sponsoring MacStories this week.

