In January, a robocall impersonated President Joe Biden and told Democrats not to vote in the New Hampshire primary. Now the Federal Communications Commission has ruled that calls made with artificial intelligence-generated voices are illegal, giving states a new way to go after the people who create such calls. The ruling takes effect immediately.
The fake Biden robocalls originated with a Texas company, the New Hampshire attorney general said on Feb. 6, opening a criminal investigation.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities and misinform voters. We’re putting the fraudsters behind these robocalls on notice,” said FCC Chairwoman Jessica Rosenworcel in a statement. “State Attorneys General will now have new tools to crack down on these scams and ensure the public is protected from fraud and misinformation.”
Since the 2022 release of ChatGPT, a chatbot from OpenAI, the capabilities of artificial intelligence have been making headlines. The ability to copy the voices of celebrities, politicians and others has drawn special attention, especially because of the many ways such familiar voices can be misused. A TikTok user called ghostwriter stirred controversy in 2023 for a song called Heart on My Sleeve, which used AI-created vocals to imitate musicians Drake and The Weeknd. And an AI-created voice and likeness of singer Taylor Swift made it seem that the Grammy Award-winning musician was endorsing Le Creuset cookware, though neither Le Creuset nor the real Swift was involved.
State attorneys general could already target the outcome of a scam call using AI voices, but now the mere act of using AI to generate unlicensed voices in robocalls is illegal. The FCC statement says the ruling should give states stronger legal backing to pursue cases against fraudsters.
AI could help stop robocalls from getting through
The FCC has been working on this for months. In November, leaning on…