A Terrifying AI Voice Cloning Scam That Nearly Succeeded

3 Min Read | January 29, 2024

By HappyPath Media

In Episode 5 of The HappyPath Media Show, we covered the story of a Bay Area family targeted by scammers using AI technology. The mother received a phone call from someone she believed to be her son, claiming he had been in a car accident and arrested. The call sounded convincingly realistic and closely resembled the voice of her son, who was away at college.

The scam played out with the scammers passing the call to a fake police officer, who informed the mother that her son had been in a car accident in which a pregnant woman was injured. According to the fake police officer, the son had been arrested. These details were clearly meant to heighten the emotional stakes of an already tense and frightening situation.

Following the conversation with the fake police officer, the call was passed to a fake public defender. The mother was told that her son's bail had been reduced from $50,000 to $15,000 and that she needed to get the money quickly. She was also told not to tell anyone why she needed the money, supposedly to protect her son's reputation.

Convinced by the urgency and apparent legitimacy of the situation, the parents withdrew the demanded amount. But the scam unraveled when the fake "public defender" said he would send a courier to collect the money, which raised red flags. With their suspicions heightened, the family did the right thing and contacted both the police department and the son directly. It turned out that neither the police department nor the son knew anything about the situation, and everyone was safe.

This is a frightening scam in which AI voice cloning makes the story more believable. With only a few basic voice samples, a scammer can now convincingly mimic a person's voice, tricking victims into believing they are talking to the real person.

In this story, the use of the son's voice not only made the scam more believable but also heightened the emotional stress of the situation. Scammers often exploit emotion to pressure their victims into acting quickly without much thought.

Unfortunately, with deepfake technology this advanced, we are entering a world where you can't always trust what you hear. In this story, the family did the right thing: they ended the conversation with the scammers and directly contacted both the police department and the son.

Always question emotionally charged communications in which someone asks for sensitive information or money. Ask yourself, "Did I contact them, or did they contact me?" If it's the latter, it's best to hang up and dial the person or organization directly.

This story is frightening because it highlights the dangers of AI technology in the wrong hands. We will all have to be more vigilant and proactive in verifying our communications to ensure we're not being scammed or taken advantage of.
