
AI Voice Scams in Australia


Cybertrace Team

June 23, 2024 · 8 min read


AI Voice Scams Are Becoming Much More Common

AI voice scams in Australia are more prevalent than ever, with victims ranging from busy parents to the elderly, and Australians losing a staggering $568 million to scams. Have you been a victim of one of these scams? Do you want to fight back against the scammers? While the criminals may be experts at orchestrating these schemes, there is still hope for holding them accountable.

What Is An AI Voice Cloning Scam?

Visualisation of an Artificial Intelligence fraud with a wireframe face and a red warning symbol with exclamation mark

An AI voice cloning scam is a relatively new type of fraud where criminals use artificial intelligence to create highly realistic replicas of someone’s voice. These voice clones are generated using advanced generative AI technologies that can mimic the tone, pitch, and cadence of a person’s speech after analysing just a few seconds of their audio.

Scammers typically use these cloned voices to deceive victims by impersonating friends, family members, or even public figures, tricking them into providing sensitive information or making financial transactions under false pretences​.

Additionally, the effectiveness of AI voice cloning scams lies in their ability to create a convincing audio replica of the target’s voice, making it difficult for victims to distinguish real from fake. Because the generated voices can be of remarkably high quality, many people are taken in by this technique.

This technology relies on publicly available voice samples, often sourced from social media, videos, or recorded phone calls, which are then used to train the AI models to replicate the voice accurately. Scammers pair these clones with a sense of urgency and emotional manipulation, leading to substantial financial and emotional harm for victims who fall for the scam.

Finally, these scams are becoming increasingly advanced and widespread, with cases reported worldwide. The ease with which these voice clones can be created poses significant challenges for individuals and organisations in maintaining security and trust in voice communications.

Have you been targeted by an AI voice scam? Simply click the button below and one of our licensed investigators will get back to you as soon as possible.

How Many People Have Been Scammed By An AI Voice In Australia?

AI voice cloning scams are becoming a major concern in Australia, mirroring trends observed globally. In 2023, Australians reported an average of 1,500 scam cases per month, with a large portion involving some form of impersonation, including AI voice scams. These scams often use advanced AI technology to clone voices from minimal audio samples, making it difficult for victims to discern between genuine and fraudulent calls.

Experts predict that AI voice scams will keep increasing in 2024 as scammers use this advanced technology to manipulate unsuspecting victims. Such scams have already been widespread in countries like the UK and the US, and are expected to become more common in Australia. The AI can recreate a voice with high accuracy from a brief recording, enabling scammers to impersonate loved ones or trusted figures quickly and convincingly​.

How Does An AI Voice Scam Work?

A robot with a microphone, symbolising Artificial Intelligence copying a human’s voice.

The Voice Cloning Process

Scammers use advanced AI tools to analyse and replicate a person’s voice characteristics. This process requires minimal audio data, often as little as three seconds, to create a realistic voice clone. The ready availability of voice data from social media posts, YouTube videos, and other online content makes it simple for cybercriminals to gather the samples needed to clone someone’s voice.

Voice Impersonation

The cloned voice is then used to impersonate someone the victim knows, such as a family member, friend, or even a trusted authority figure. The scammer usually fabricates a distressing scenario, like a car accident or an urgent financial need, to manipulate the victim into sending money quickly. The authenticity of the cloned voice makes these scams particularly convincing and hard to detect​.

Emotional Manipulation

These scams are highly effective due to the emotional manipulation involved. Scammers create a sense of urgency and pressure the victim to act immediately, often preventing them from taking the time to verify the caller’s identity. The impersonated voice’s emotional pleas can lead victims to make hasty and irrational decisions, which can result in significant financial losses​.

How To Avoid An AI Voice Scam

A person on a laptop computer with a transparent screen showing a head with a cog with AI written on it and a yellow warning symbol with an exclamation mark.

To avoid falling victim to AI voice scams, it’s important to implement several protective measures:

Verify Caller Identity

  • Ask for details: Request the caller’s name, position, and contact information, then verify these details through official channels.
  • Personal questions: Ask some questions that only the real person would know the answers to.
  • Call back: Hang up and call the person’s known number directly to confirm the situation.

Use Secure Communication Practices

  • Secret code: Establish a secret code word or phrase with family members that must be used in emergencies.
  • Alternative contact: Reach out to a mutual friend or family member to verify the caller’s story.

Be Very Sceptical Of Unsolicited Calls

  • Use caution: Treat unsolicited calls with high levels of scepticism, especially those asking for urgent action or sensitive information.
  • No immediate payments: Avoid making immediate payments, particularly via unconventional methods like gift cards or wire transfers.

Technology Tools

  • Call blocking: Use call-blocking features on your phone to filter out suspicious numbers.
  • Check Caller ID: Be wary of unfamiliar numbers and avoid trusting caller ID alone, as it can be spoofed.

Preventative Measures

  • Limit voice data: Be very cautious about sharing voice recordings or personal information online.
  • Privacy settings: Regularly review and adjust privacy settings on social media to limit access to your personal data.
  • Voice authentication: Use voice authentication or biometric security where available.
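For readers who like a concrete summary, the checklist above can be sketched as a simple red-flag scoring function. This is purely an illustrative sketch: the flag names, weights, and thresholds below are our own assumptions for demonstration, not a Cybertrace tool or any real fraud-detection system.

```python
# Illustrative only: a rough "red flag" tally based on the checklist above.
# The flags, weights, and cut-offs are assumed values, not real detection logic.
RED_FLAGS = {
    "unsolicited_call": 1,   # the call came out of the blue
    "urgent_request": 2,     # pressure to act immediately
    "payment_demanded": 3,   # asks for gift cards, wire transfers, etc.
    "refuses_callback": 3,   # won't let you hang up and call a known number
    "no_safe_word": 2,       # can't give the agreed family code word
}

def assess_call_risk(observations: dict) -> str:
    """Sum the weights of the red flags observed on a call and bucket the risk."""
    score = sum(
        weight
        for flag, weight in RED_FLAGS.items()
        if observations.get(flag, False)
    )
    if score >= 5:
        return "high"    # hang up and verify through a known number
    if score >= 2:
        return "medium"  # slow down and ask personal questions
    return "low"

# Example: an unsolicited, urgent call demanding an immediate payment.
print(assess_call_risk({
    "unsolicited_call": True,
    "urgent_request": True,
    "payment_demanded": True,
}))  # high
```

The point of the sketch is that no single sign proves a scam; it is the combination of urgency, unsolicited contact, and payment pressure that should trigger the "hang up and call back" response.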

What To Do If You’ve Been Scammed In An AI Voice Scam?

A private investigator sitting at a computer with two screens investigating an AI voice scam

Cease Communication

Immediately stop all communication with the scammer. Do not provide any more information or make any further transactions.

Contact Your Bank

Inform your bank or financial institution about the scam, especially if you have shared any banking details or made payments into accounts provided by the scammer. The bank can help to monitor your accounts for suspicious activity, and possibly reverse any fraudulent transactions​.

Monitor Your Accounts

Keep a close watch on your financial statements and online accounts for any unauthorised activity. Report any suspicious actions to your financial institution immediately.

Change Passwords

Update passwords for all your online accounts, especially those linked to sensitive information or financial transactions. Use strong, unique passwords and enable two-factor authentication where available.

Report to ScamWatch

Report the incident to ScamWatch, Australia’s official scam reporting site, to help authorities track and combat these scams.

Contact Cybertrace

Reach out to Cybertrace; we specialise in scam investigations and can assist in tracking the scammers and gathering evidence to support your case.

If you have suffered substantial financial loss or your personal information has been misused, consider consulting with a legal advisor for additional support and potential recourse.

Example Of AI Voice Scams

A woman’s head fragmenting into chunks and block-like shapes, representing an AI (Artificial Intelligence) voice scam.

AI voice technology is increasingly used for scams like impersonating political figures to influence elections or mimicking friends’ voices to solicit urgent funds, taking advantage of emotional appeals.

Additionally, scammers employ AI-generated voice clones in bank fraud and business email compromise scams to deceive victims into sharing sensitive information or transferring money, using trust and familiarity.

Political AI Voice Scams

During the 2024 US presidential primaries, robocalls using AI-generated voices impersonated President Joe Biden, urging New Hampshire voters not to participate in the primary election. This scam aimed to manipulate voter behaviour using a highly convincing voice clone of the President.

Impersonating Family Members For Financial Gain

In a case from Ontario, a man was scammed out of $8,000 when he received a call from someone who mimicked his friend’s voice, claiming to be in a serious accident and needing money urgently. The emotional appeal and urgency made it hard for the victim to verify the authenticity of the call before transferring the money​.

Ontario man loses $8K in AI phone scam using friend’s voice

AI Voice Clone Bank Fraud

AI-generated voice clones are used to impersonate bank representatives, convincing victims to share sensitive information such as account numbers and passwords. This type of scam often involves unsolicited calls where the fraudster uses the cloned voice to gain the victim’s trust​.

Lloyds Bank logged into using AI voice 

Business Email Compromise (BEC)

Scammers use AI voice technology to impersonate senior executives or business partners during phone calls or video conferences. The goal is to trick employees into transferring funds or sharing sensitive information. Such scams exploit the authority and familiarity of the cloned voice to bypass normal security protocols.

How scammers can use ‘deep voice’ AI technology to trick you

Related topic: Unmasking Deception: Elon Musk, Australian PM Deep Fake Scam

Contact Us Now

If you’ve been a victim of an AI voice scam or need help protecting against them, contact Cybertrace for an expert investigation. We’re here to help you stay safe and secure.

Questions for the Readers

Have you or someone you know been affected by an AI voice scam? How did you handle the situation? Need help? Simply click the button below and one of our licensed investigators will get back to you as soon as possible.



Contact Us

Contact our friendly staff at Cybertrace Australia for a confidential assessment of your case. Speak with the experts.

Email: [email protected]
Phone (International): +61 2 9188 7896