Reposted from Taylor English Insights

Voter Groups Sue Companies Behind "Deepfake" Audio Robocalls

Certain New Hampshire voters received robocalls during primary season urging them to “save their vote” for the November general election by not voting in the state's Democratic primary. The recorded message told voters they could vote in only one of the two elections. The calls appeared to come from the phone number of a longtime state Democratic operative and seemed to feature a recorded message in the voice of Joe Biden. In reality, both the number and the voice were AI fakes. Now, several individual voters and voting rights groups are suing the companies behind the calls, alleging that the campaign used AI to suppress turnout in the Democratic primary and violated several federal laws, including the one applicable to robocalls.

Why It Matters

Emerging regulations in the EU and around the world require companies using AI to disclose its use and to tell recipients when content has been manipulated to produce a fake, simulation, or spoof. There is no direct federal regulation of AI in the US yet, but other federal laws (such as those governing cell phone spam and robocalls) may be helpful to plaintiffs trying to quash such efforts. (Some states have also passed laws governing the use of AI in election ads, although New Hampshire does not yet have such a law.) This election season is likely to pose several interesting tests of how AI can be used, from fake content like this to algorithms that determine who is targeted with which ads and solicitations, and more. In the meantime, companies using AI to generate promotional material and calls for commercial purposes should be aware of the emerging effort to police such materials, and tread lightly.


The New Hampshire robocalls served as one of the first concrete examples of deepfake audio being deployed to suppress voting in a U.S. election. Both New Hampshire and federal authorities are investigating the incident, and the FCC unanimously ruled in February that AI-enabled robocalls like the one carried out in New Hampshire are illegal under the Telephone Consumer Protection Act.
