AI voice impersonation scams
How the scam works
AI voice impersonation scams are a new version of 2022’s ‘Hi Mum’ scam. While Hi Mum was text message-based, these scams involve a person receiving a phone call from a “loved one” in “distress”. The “loved one” might claim they’ve been beaten up or kidnapped and won’t be freed unless money is sent.
NAB Manager Advisory Awareness, Laura Hartley, said criminals could clone a person’s voice for these scams with as little as three seconds of audio taken from a social media profile, voicemail or video on a website.
“While we haven’t had any reports of our customers being impacted by AI voice scams to date, we know they are happening in the UK and US, in particular, and anticipate it’s just a matter of time before these scams head down under,” Ms Hartley said.
“These scams use readily available technology, yet still require criminals to find a link between the person receiving the phone call and the person in ‘distress’ so they’re harder to scale than other scams.”
Red flags to look for
- Unexpected phone calls from a “loved one” in “distress”.
- Urgent requests to make a payment.
- Requests from the “loved one” to keep what’s happened secret and not tell anyone else.
How to protect yourself
- If someone you “know” calls asking for a payment and you aren’t sure it’s legitimately them, hang up and ring them back directly before sending any money.
- Review your social media profiles regularly. Check whether your profile is locked down and go through your friends and connections.
- Set up multi-factor authentication on your social media accounts to reduce the risk of account ‘takeovers’.
- Change how you think about consent forms. Take the time to be clear on what you’re signing up to and whether it includes using photos or video of you or your child on websites, social media platforms or similar.