(NEXSTAR) – No matter how complex your online password is, your own voice could be the tool criminals use to access your money, new research warns.
Starling, a UK-based online bank, said Wednesday in a news release that “millions” of people are at risk of falling victim to voice cloning scams. In a representative survey of roughly 3,000 UK adults, 28% of respondents said they had been targeted by an AI voice cloning scam at least once in the past year, while 46% were not aware such scams even existed, Starling’s researchers found.
Powerful artificial intelligence programs can replicate a person’s voice from just seconds of original audio, which the study’s authors noted is not difficult to find thanks to social media accounts, YouTube and other internet archives. In March, OpenAI unveiled its Voice Engine, which can accurately replicate someone’s speech using just a 15-second clip.
The FBI issued a similar warning in early May, saying the AI tools are publicly available and allow crooks to increase the speed, scale and automation of their attacks. While incorrect grammar or misspellings might have been obvious clues in the past, AI tools can now polish scam messages into convincing, error-free text.
“Malicious actors increasingly employ AI-powered voice and video cloning techniques to impersonate trusted individuals, such as family members, co-workers, or business partners,” the FBI said in a release. “By manipulating and creating audio and visual content with unprecedented realism, these adversaries seek to deceive unsuspecting victims into divulging sensitive information or authorizing fraudulent transactions.”
“If a call sounds like your boss (asking for bank account numbers) or your family member (begging for help in an emergency), you’re more likely to act,” a Federal Trade Commission warning reads. “That’s why scammers use voice cloning to make their requests for money or information more believable.”
Major banks such as HSBC and Schwab have implemented Voice ID security screening to speed up customer service interactions. Banks often reassure customers that their voices are as unique as their fingerprints, but what happens if a voice can be accurately recreated by a bad actor?
Nexstar reached out to both HSBC and Schwab to ask about the security of their Voice ID features but had not received a reply as of publication time.
In a piece for Vice, a journalist documented how he used an AI clone of his own voice to fool Lloyds Bank’s automated phone system in the UK, gaining access to his account with only the cloned audio and his birth date.
The article was one of several cited in a US Senate Committee on Banking, Housing and Urban Affairs letter to Brian Moynihan, the CEO of Bank of America, asking for information on how banks use – and protect – voice data.
When it comes to the safety of voice authentication technology, Wells Fargo recently defended the tool, telling KQED that it relies on a “layered approach” that doesn’t grant access solely on the basis of a customer’s voice: “This service must be paired with other identity verification methods to allow access to customer accounts.”
The FBI and Starling recommend people do the following to keep their information – and money – safe:
- Stay vigilant: Messages asking for money or credentials are red flags. Businesses should employ technical solutions to fight phishing tactics and train employees to recognize them.
- Use multi-factor authentication: It is a crucial defense against crooks trying to take control of an account or system.
- Consider a “safe phrase”: While AI tools can mimic your voice, they can’t guess a phrase you’ve previously agreed upon with friends and relatives.
If you do experience a scam attempt, the FTC asks that you report it to the agency at ReportFraud.ftc.gov.