AI Imposter Scams

The faces and voices of people you know can be exquisitely faked as part of an effort to steal your money or identity.

It is up to us to protect ourselves against AI Imposter scams.

AI tools can be genuinely useful for many purposes.

However, they can also be easily weaponised by fraudsters to create realistic yet bogus voices, websites, videos and other content to perpetrate fraud.

Many fear the worst is yet to come.

Scammers use AI to lure investors and customers into fraudulent schemes.

Individuals with original information about AI Imposter scams can anonymously blow the whistle.


An example of how an AI imposter scam caused a multinational firm to lose millions.

A UK engineering multinational firm (Arup) fell victim to an AI imposter scam after a finance employee was duped into sending 200 million Hong Kong dollars to fraudsters by an AI-generated email and video call.

The worker grew suspicious after receiving an email that was supposedly from Arup's UK-based chief financial officer.

Initially, the worker suspected it was a phishing email, as it spoke of the need for a secret transaction to be carried out.

The worker then put aside his doubts after attending a video call with what he thought were several other co-workers, all of whom were in fact AI imposters.

However, this incident could have been avoided if Arup had had a whistleblowing platform in place.

The worker could then have reported his suspicions about the phishing email.


How HX5 encrypted can protect your organisation against AI imposter scams.