Get ready for Zoom-based deepfake phishing attacks, experts warn

Matthew Canham, a research professor and cybersecurity consultant at the University of Central Florida, said at the Black Hat security conference last week that deepfake attacks will become more sophisticated and harder to detect.

Canham added that phishing attacks using deepfakes of Zoom and other video conferencing apps (he calls them "zishing"), as well as deepfake attacks on biometric systems such as fingerprint and facial recognition, may soon appear.

"I have been working on this for a long time

"My friend Dolores received a series of text messages from her boss telling her to buy gift cards for 17 employees for an upcoming holiday party and not to tell anyone

Dolores was the target of a deepfake attack carried out over text messages. An automated script, or "bot," first contacted her, spoofing her boss's cell phone number to impersonate him.

The bot exchanged several messages with Dolores to establish trust; then a human on the other end took over and walked her through the rest of the scam.

A well-known attack in the UK a few years ago, Canham says, was carried out by telephone. A computer-generated voice application, or possibly a skilled human impersonator, mimicked the CEO's voice, calling the company and ordering money to be transferred to a specific account.

After this was repeated a couple of times, the company became suspicious and asked the "boss" to verify his identity.

Canham calls these "synthetic media" attacks. He classifies them along five factors: medium (text, voice, video, or a combination), control (carried out by a human, a bot, or both), familiarity (how well the target "knows" the fake persona), interactivity (whether the exchange is slow, fast, or real time), and target (a specific individual, or anyone who responds).
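As a rough illustration (not taken from Canham's slides), the five factors could be captured in a small data model like the hypothetical Python sketch below; all names and values here are illustrative assumptions, not his actual framework.

from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical encoding of the five factors described in the article;
# the enum names and members are illustrative, not Canham's terminology.

class Medium(Enum):
    TEXT = auto()
    VOICE = auto()
    VIDEO = auto()
    MIXED = auto()

class Control(Enum):
    HUMAN = auto()
    BOT = auto()
    HYBRID = auto()    # bot opens the conversation, a human takes over

class Familiarity(Enum):
    STRANGER = auto()
    ACQUAINTANCE = auto()
    CLOSE = auto()     # e.g. a boss or family member

class Interactivity(Enum):
    SLOW = auto()      # e.g. email
    FAST = auto()      # e.g. text messages
    REAL_TIME = auto() # e.g. a live Zoom call

class Target(Enum):
    SPECIFIC_INDIVIDUAL = auto()
    ANYONE = auto()

@dataclass
class SyntheticMediaAttack:
    """One attack described along the five axes."""
    medium: Medium
    control: Control
    familiarity: Familiarity
    interactivity: Interactivity
    target: Target

# The gift-card text scam from the article, expressed in this model:
gift_card_scam = SyntheticMediaAttack(
    medium=Medium.TEXT,
    control=Control.HYBRID,
    familiarity=Familiarity.CLOSE,
    interactivity=Interactivity.FAST,
    target=Target.SPECIFIC_INDIVIDUAL,
)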

Canham cites the example of a virtual kidnapping scam in Indiana, in which family members received phone calls from scammers claiming to have kidnapped a relative and demanding ransom. One man received such a call about his daughter even as his own son received a ransom call from someone impersonating the father.

The only "proof" was that the call appeared to come from a loved one's number. Spoofing a phone number, however, is not difficult.

There will be more scams using video, Canham says. We have already seen the deepfake video made by comedian and filmmaker Jordan Peele in which former President Barack Obama appears to comment on the movie "Black Panther" and insult then-President Donald Trump.

In that video, Peele imitated Obama's voice himself and then used deepfake software to alter existing footage of Obama so that his mouth movements matched the words.

Even more alarming, though perhaps less obvious, is the "I am not a cat" Zoom video in which a Texas lawyer got stuck behind a kitten avatar during a 2021 virtual court hearing, Canham said.

In that case, the kitten avatar perfectly matched the Texas attorney's mouth and eye movements in real time. It may not be long before similar overlays and avatars make videoconference participants convincingly look like completely different people.

"I think in a few years, we will soon see phishing attacks using Zoom," says Canham

After that, he said, the next frontier will be biometric-based phishing attacks, although those may require "Mission: Impossible"-style physical fabrication.

"One could argue that 3D printed fingerprints qualify," Canham said

But there could also be a digital component. A few years ago, German researchers showed that a high-resolution photo of Chancellor Angela Merkel's eyes might be enough to fool iris scanners, and that an equally detailed photo of another German politician's raised hand could be used to create a convincing fake fingerprint.

Canham says surprisingly low-tech measures may be effective at stopping deepfake attacks before they go too far. He said he once heard of a company boss who told employees he would never ask them to buy gift cards, so any such request could immediately be dismissed as fake, an approach Canham called "a very good idea."

In other cases, an authorized person might be required to supply a pre-shared PIN, or obtain approval from several people, before transferring a large sum of money.
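As a rough sketch of that kind of out-of-band control (my own illustration, not a procedure Canham prescribes), a transfer-approval routine might require both a pre-shared PIN and a minimum number of distinct approvers; all names and thresholds below are hypothetical.

import hmac

# Hypothetical policy: a transfer is released only when the requester supplies
# the pre-shared PIN and enough distinct people have approved it.
REQUIRED_APPROVALS = 2

def authorize_transfer(amount: float,
                       supplied_pin: str,
                       expected_pin: str,
                       approvers: set[str]) -> bool:
    """Return True only if the PIN matches and enough people approved."""
    # Constant-time comparison avoids leaking the PIN through timing.
    pin_ok = hmac.compare_digest(supplied_pin, expected_pin)
    enough_approvers = len(approvers) >= REQUIRED_APPROVALS
    return pin_ok and enough_approvers

# A spoofed "CEO" who knows neither the PIN nor has co-approvers is rejected:
print(authorize_transfer(250_000.0, "0000", "4821", {"ceo@example.com"}))  # False
print(authorize_transfer(250_000.0, "4821", "4821",
                         {"ceo@example.com", "cfo@example.com"}))          # True

Either check alone can be socially engineered; requiring both a shared secret and multiple humans raises the cost of a single convincing impersonation.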

He also suggested fighting bots with bots, so to speak. According to Canham, there is already the Jolly Roger Telephone Project, a service that uses conversational bots to draw telemarketers into pointless conversations and waste their time. Perhaps the best defense against a deepfake is another deepfake.

Slides from Canham's Black Hat presentation can be found here.
