Most major artificial intelligence voice cloning programs have no meaningful barriers to stop people from impersonating others, a Consumer Reports investigation found.
AI voice cloning technology has made remarkable advances in recent years, with many services able to convincingly mimic a person's cadence from only a few seconds of sample audio. A flashpoint came during last year's Democratic primary, when robocalls of a fake Joe Biden spammed voters' phones telling them not to vote. The political consultant who admitted to masterminding the scheme was fined $6 million, and the Federal Communications Commission subsequently banned AI-generated robocalls.
The newly published study of six publicly available AI voice cloning tools found that five have easily bypassed safeguards, making it simple to clone a person's voice without consent. Deepfake audio detection software also often struggles to tell the difference between real and synthetic voices.
Generative AI, which mimics human qualities such as appearance, writing and voice, is a new and rapidly evolving technology, and the industry has little federal regulation. Most ethical and safety checks across the industry are self-imposed. President Joe Biden included some safety requirements in the executive order on AI he signed in 2023, but President Donald Trump revoked the order when he took office.
Voice cloning technology works by taking an audio sample of a person speaking and extrapolating that person's voice into a synthetic audio file. Without safeguards in place, anyone who registers an account can simply upload audio of a person speaking, such as from a TikTok or YouTube video, and have the service imitate them.
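To illustrate how little is technically required, here is a minimal sketch of zero-shot voice cloning using the open-source Coqui TTS library and its XTTS v2 model. This is not one of the six services in the study, and the file paths and text are hypothetical; it simply shows the upload-a-sample, generate-new-speech workflow the article describes.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Illustrative only: not one of the six commercial
# services Consumer Reports tested; file paths and text are hypothetical.
from TTS.api import TTS

# Load a multilingual model that supports zero-shot voice cloning.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short clip of the target speaker is enough for the model to
# extrapolate their voice onto arbitrary new text.
tts.tts_to_file(
    text="Any sentence the speaker never actually said.",
    speaker_wav="speaker_sample.wav",  # hypothetical sample clip
    language="en",
    file_path="cloned_output.wav",
)
```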
Four of the services, ElevenLabs, Speechify, PlayHT and Lovo, simply require checking a box asserting that the person whose voice is being cloned has given permission.
A fifth service, Resemble AI, requires people to record audio in real time rather than upload a recording. But Consumer Reports was able to easily bypass that restriction by simply playing an audio recording from a computer.
Only the sixth service, Descript, had a somewhat effective safeguard: it requires a would-be cloner to record a specific consent statement, which is difficult to falsify except by cloning it through another service.
All six services are publicly available through their websites. Only two, ElevenLabs and Resemble AI, charge money to create a custom voice clone: $5 and $1, respectively. The rest are free.
Some of the companies acknowledge that abuse of their tools can have serious negative consequences.
“We recognize the potential for misuse of this powerful tool and have implemented robust safeguards to prevent the creation of deepfakes and protect against voice impersonation,” a Resemble AI spokesperson told NBC News via email.
There are legitimate uses for AI voice cloning, such as helping people with disabilities or creating audio translations for speakers of other languages. But Sarah Myers West, co-executive director of the AI Now Institute, a think tank that focuses on the consequences of AI policy, said there is also enormous potential for harm.
“This could obviously be used for fraud, scams and disinformation, for example by impersonating institutional figures,” West told NBC News.
There is little research on how often AI is used in audio-based scams. In so-called grandparent scams, criminals call a victim claiming an emergency involving a family member, such as a kidnapping, arrest or injury. The Federal Trade Commission has warned that such scams may use AI, though the scams predate the technology.
Cloned voices have also been used to create music without the permission of the artists being imitated, as happened with a viral 2023 song falsely attributed to Drake and The Weeknd, and some musicians have struggled to control their likeness when others release music using their voices.
This story first appeared on nbcnews.com.