
How to spot and avoid AI scams


By Helen Fox

Artificial intelligence (AI) is now easily available and often free.

But this means fraudsters are seizing the opportunity to use AI to scam people. This includes creating credible text messages, emails and social media posts.

Underneath this, though, the scams at work are nothing new. The technology available to fraudsters simply helps them make their scams more convincing.

We're uncovering how scammers are using AI to trick us. Plus, we've got tips from our in-house fraud expert on how to identify and avoid scams that have been made with AI.

There are a few ways AI is used in scams

Fraudsters typically use generative AI tools - tools that generate text, images, sound, and video in response to prompts - to create scam content. They do this in a few ways.

Deepfake scams

A deepfake is a piece of video or audio in which a fraudster uses AI tools to digitally alter their appearance and voice so they seem to be someone else. The results can be very convincing.

Deepfakes can make it seem like a well-known personality or public figure is saying something they aren't. This can range from politicians expressing views that could damage their reputation to celebrities endorsing or promoting something that's really a scam. Here are a few examples of how deepfakes have been used in scams:

  • In July 2023, consumer finance expert Martin Lewis featured in a deepfake video promoting dodgy investment opportunities.
  • In October 2023, YouTuber MrBeast appeared in a deepfake scam on TikTok offering the new iPhone 15 for just $2.
  • In February 2024, a worker at a multinational firm was tricked into paying around £20 million to fraudsters after a deepfake of the company's Chief Financial Officer asked them to during a video call.

Using a familiar face and voice means people are more likely to trust what they see and fall for the scam.

Voice-cloning scams

In voice cloning, a fraudster takes a recording of a person's voice and uses AI to replicate it, making it say anything they want. When it's used in scams, voice cloning can be distressing and difficult to resist.

In a voice-cloning scam, you receive a call that appears to be from someone close to you, like a child, a sibling, a parent, or a good friend. The caller will be very upset and panicked. They may say they’ve been abducted or are in another emergency situation.

Eventually, they’ll reveal that they need you to send them money to help them out of the trouble they’re in. Because you believe this cloned voice is really them, you send the money however they ask for it. But your loved one was never in any danger, and the money never reaches them. Instead, it gets stolen by a scammer.

Phishing messages

AI is also increasingly being used in phishing scams. In these scams, fraudsters message you pretending to be a person or organisation you trust, like HMRC or your bank. They'll invent a reason for you to click a link in the message, where you'll be asked to provide personal or financial information that can then be used to commit fraud.

Phishing messages are often characterised by bad spelling and sentences that don't make sense. But with AI, fraudsters can craft well-written, convincing messages that make scams much harder to spot.

How to avoid AI scams

We spoke to our in-house fraud expert, Ben Fleming, about the best ways to avoid an AI scam. Here are Ben's top tips:

If it sounds too good to be true, it probably is

This is the golden rule of scam-spotting. Whether it's an email from HMRC promising you a massive refund or a celebrity promoting a lucrative investment opportunity, if it seems too good to be true, chances are it's a scam.

Set a safe word

Because voice-cloning scams can be so alarming, it's a good idea to set a "safe word" with your family and friends. This word should be something that's difficult for a stranger to guess but is easy for you to remember. For example, you could use a word that's linked to a memory you share. Then, if you get a call from a loved one in trouble, you can ask them for your safe word to be sure it's really them.

Remember, underneath the AI, these are the same old scams

Even though AI gives fraudsters new ways to make their scams more convincing, under the surface, they're still using the same tactics. A voice-cloning scam is only a more advanced (and upsetting) version of a fraudulent email from a 'friend' claiming they're stranded abroad and need money to get home. So, all the same scam-spotting tricks apply:

  • Be wary of messages with attachments. Most legitimate companies don't include attachments with their messages. So, if you get a message that does, it could be a fraudster trying to get you to download software they can use to spy on you and steal your information.
  • Hover over links to check before you click. Genuine organisations send links in their emails, but so do scammers. You can check if a link goes to a company's official website by hovering over it before you click.
  • Check the email addresses and phone numbers that messages come from. Scammers can spoof these to make them appear legitimate. But you can still check the sender details to find out if a message has come from a brand's official contact details.
  • Watch out for unusual language. Artificial intelligence helps scammers get around the bad spelling and grammar issues that have given them away in the past. But it's still not perfect. Watch out for messages that include needlessly complicated or descriptive language. They may have been written by an AI tool.
  • Think twice if you're asked to do something quickly. One dead giveaway in a scam is the pressure to act immediately. So, tread carefully and cut communication with anyone asking you to do something urgently, or who gets aggressive or upset when you don't follow their instructions as quickly as they would like.

What to do if you spot an AI scam

If you come across an AI scam, it's important that you report it as soon as you can.

  • If you see a scam on social media, report it according to the platform's reporting process.
  • Report scam texts to your network provider by forwarding them to 7726.
  • Report phishing emails by forwarding them to report@phishing.gov.uk.
  • Send the details of suspicious websites to the National Cyber Security Centre.

And if you think you've been the victim of an AI scam:

  • Tell your bank straight away, especially if you've sent any money to the fraudster.
  • Report it to your local police by calling 101.
  • If you believe that you or anyone else is in immediate danger, call 999.

Sources

https://www.which.co.uk/consumer-rights/advice/how-to-spot-and-avoid-ai-scams-aaDn52A8nDhR

https://www.bbc.co.uk/news/uk-66130785

https://www.bbc.co.uk/news/technology-66993651

https://www.theguardian.com/technology/article/2024/may/17/uk-engineering-arup-deepfake-scam-hong-kong-ai-video

Disclaimer: We make every effort to ensure that content is correct at the time of publication. Please note that information published on this website does not constitute financial advice, and we aren’t responsible for the content of any external sites.

Helen Fox

Personal Finance Editor

Helen is a personal finance editor who’s spent 11 years (and counting!) in the finance industry. She creates content on everything money with the goal of getting people thinking – and talking – about their finances in ways they may not have done before.
