
Don’t get caught out by “deepfake” scams.


Artificial Intelligence (AI) is developing rapidly, and whilst it has many positive uses, it also poses a growing threat in the hands of cyber criminals.


One such threat is from deepfake scams. “Deepfake” refers to manipulated images, videos, and audio which deliver convincing but false representations of people and events.


Of particular concern is voice spoofing, also known as voice cloning, which uses AI to create a realistic-sounding recording of someone’s voice. Examples have included simulated voices of relatives or bank officials, which have tricked victims into parting with money or sensitive information.


How do you protect yourself against deepfake scams?


Firstly, learn to recognise the typical features of these scams, which include:

  • Unsolicited calls saying a loved one is in danger and needs money.

  • Messages asking for personal information, particularly if they involve financial transactions.

  • Calls where the voice sounds distorted and/or there are long pauses. Many deepfake scams rely on the criminal typing out sentences that are then converted to audio, so responses often arrive more slowly than in normal conversation.

  • Unusual speech patterns or unfamiliar accents.

  • Requests that seem out of character for the person or organisation apparently contacting you.

  • Callers who use emotional or high-pressure tactics.

Secondly, take steps to verify the caller's identity, for example by asking questions that only the real person would know the answer to, and/or calling them back on a known number.


Never make a payment to someone unless you are absolutely certain of their identity and that the bank account details are genuine.


Article kindly provided by HJS Technology, IT support and services

