
One in four Americans targeted by AI voice cloning scams, expert warns




Americans are facing a sharp rise in AI-powered voice cloning scams, with one in four adults either experiencing the fraud firsthand or knowing someone who has, according to a study by McAfee.

The schemes typically involve criminals using artificial intelligence to mimic the voice of a loved one in distress — often claiming to be in hospital, under arrest or involved in an accident — and urgently requesting money.

The financial impact is expected to escalate dramatically. The Deloitte Center for Financial Services projects that fraud losses facilitated by generative AI could surge from $12.3 billion in 2023 to $40 billion by 2027.
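For perspective, a quick back-of-envelope check (illustrative only, not taken from the Deloitte report) shows the annual growth rate implied by those rounded figures:

# Illustrative calculation: compound annual growth rate implied by the
# projection of $12.3bn in 2023 rising to $40bn by 2027 (four years).
start, end, years = 12.3, 40.0, 4
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.0%} per year")  # roughly 34%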

James Grifo, Owner and CEO of Audio Visual Nation, says the accessibility of the technology is what makes the threat particularly alarming.

“What makes voice cloning particularly predatory is how accessible the technology has become,” Grifo said. “Scammers can now create a convincing clone of your voice using just three seconds of audio pulled from a social media video or voicemail.”

According to Grifo, fraudsters no longer need sophisticated equipment or deep technical expertise. Publicly available AI tools can generate realistic voice replicas in minutes using short clips posted online.

Warning Signs to Watch For

Despite advances in AI, experts say cloned voices often contain subtle flaws.

Flat or emotionless tone
AI can struggle to replicate the full emotional range of human speech. Grifo explains that genuine distress typically includes natural variations in pitch, breathlessness or trembling, whereas cloned voices may sound oddly monotone.

Unnatural pauses and rhythm
Speech patterns are highly individual. AI-generated calls may include awkward pauses mid-sentence or unusual pacing that doesn’t match how the person normally speaks.

“The rhythm of speech is incredibly personal,” Grifo said. “AI hasn’t perfected this yet.”

High-pressure tactics
Scammers frequently create urgent scenarios to override rational thinking, insisting money must be transferred immediately to resolve an emergency.

Unknown numbers
Calls often originate from unfamiliar or spoofed numbers, with scammers claiming the victim’s loved one has lost or damaged their phone.

“Always question why your daughter would be calling from a random number instead of her own phone,” Grifo said. “That inconsistency is your first clue.”

How to Protect Yourself

Security experts recommend proactive steps to reduce the risk of falling victim to these scams:

Establish a family codeword
Creating a secret phrase known only to close relatives can provide a simple but effective safeguard. A scammer may be able to clone a voice, but they won’t know the agreed codeword.

Verify independently
If a caller claims to be in trouble, hang up and call back using the person’s usual number. Alternatively, contact another trusted family member to confirm their whereabouts.

Ask personal questions
Request details about shared memories or recent events that only the real person would know.

“AI can clone a voice, but it can’t clone memories,” Grifo said.

Pause before acting
Experts stress that legitimate emergencies rarely require instant wire transfers or password sharing. Taking even a few minutes to verify the situation can prevent significant financial and emotional damage.

Grifo warns that the combination of emotional manipulation and cheap, widely available technology has made voice cloning scams especially dangerous.

“Make scepticism your default response to unexpected calls requesting money or sensitive information, even if the voice sounds exactly like someone you know,” he said. “Real emergencies can wait the two minutes it takes to verify who you’re actually speaking with.”

With generative AI tools becoming more sophisticated and widespread, consumer vigilance may be the strongest defence against a rapidly growing form of digital fraud.
