Hackers have new weapon

Some of the most effective cyber-criminals are not expert computer hackers but “human hackers” – people who are expert at manipulating others to gain access to information or financial systems.

They now have a new weapon in their armoury: deep-fake synthetic voice and video that enables them to imitate clients or colleagues. Australian law firms have been successfully targeted and overseas businesses have been taken for millions.

Recent examples using deep-fake recorded messages:

  • A voice message “confirms” the bank account sent previously via email is the correct destination for a large payment;
  • A staff member gets a phone message instruction from the firm’s Principal to transfer money; and
  • A paralegal’s voice is cloned and used to convince clients to transfer funds to the attacker.

A common feature of these attacks is their simplicity: the criminal did not need access to your systems to carry them out, just the phone number of the target. In the near future (perhaps by the end of 2024, if not earlier), attackers will not be restricted to recordings but will be able to fake live conversations in real time:

  • You ring to confirm email instructions for funds transfer and leave a message. A few minutes later the “client” rings back and confirms the bank account number during a live phone call. The person you were speaking to is not the client but the attacker using voice-cloning software.

The basic format of the fraud is not new, as social engineering attacks like this using email have been around for years.

We all know not to act on email instructions to transfer money (either from within or outside the firm).

Unfortunately, as the threats evolve so must our defences.

We can no longer rely on voice messages as confirmation, and soon that suspicion will have to extend to inbound real-time voice and video calls as well.

What is a deep fake?

A deep fake is a synthetic voice, image or video of a real person.

An attacker who has a 20-30 second voice recording (a decent length phone message is enough) can capture the target’s voice print and use it to generate a synthetic message with any content they want.

The quality is so good that even close friends and colleagues can’t always tell the difference. Two years ago this technology was complicated and expensive. Now it only takes a few minutes and a few dollars to generate a convincing fake.

For the moment, the technology is not quite good enough to deep-fake voice or video in real time, so if you are having an actual conversation with someone you can be reasonably satisfied that it is them.

However, technology that will allow real-time voice cloning for interactive conversations is now being trialled and may be available to the criminal market shortly.

Some criminal groups are extremely well resourced and maintain their own IT research divisions.

Even with the limitations of currently available technology, there are examples of synthetic content being woven into a real discussion to create the illusion of interactivity.

In a recent case from Hong Kong, a finance-company middle manager was tricked into transferring US$25 million during a Teams meeting, supposedly on the instructions of his CEO and senior company executives.

Another weapon in the crook’s armoury is the fact that mobile numbers can be “spoofed”, so it looks like the call or message originates from a familiar mobile. In reality, that could be any phone in the world.

How to disrupt these attacks

Do:

  • Ensure all funds verification calls are conducted in real time by trained staff.
  • Only rely on outbound, not inbound calls or instructions for funds transfer.1
  • Make sure all staff know about these types of scams.
  • Protect the firm’s money as well as the clients’. Queensland lawyers have been tricked into paying large bills and transferring their own money using fake instructions from the boss to accounts staff.
  • Make sure clients are warned not to act on email instructions or recorded verification calls.
  • Protect the integrity of the information in your system, both electronically and through staff training. If a helpful receptionist can be manipulated into changing the phone number in your records, any outbound verification call can be directed to the attacker.
  • Map out all the major funds transfers your firm makes and think about how this technology could be used to misdirect them.
  • Follow the Lexon funds transfer protocols, and note that these may change in response to evolving threats.

Do Not:

  • Accept voice messages as confirmation even if they seem to be in response to an earlier call.
  • Accept inbound calls to verify destination accounts or funds transfer instructions. Ask them to hang up so you can call back using the number from your file. If they tell you that they need you to call another number, be very careful.
  • Assume that because the incoming call number matches the colleague’s / boss’s / client’s known number, all is well.

Footnotes
1 For the moment this may be overly cautious, however there is no way to predict when interactive conversation cloning will become available to criminals.
