Give people new tech and they’ll find a way to be creepy with it.
Case in point: A new House bill — the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act — points to several instances of AI nonsense, including:
- Bogus songs replicating the voices of artists like Drake and Bad Bunny.
- Fake advertisements featuring celebrities like Tom Hanks.
- New Jersey high school students who made AI-generated porn of their underage classmates.
- A Department of Homeland Security report that found 100k+ AI-generated nude images of women.
And did these people consent to having their likeness used by AI in those ways? Nope.
Thus, the bill…
… would make it illegal to create a “digital depiction” of any person, living or dead, without permission. This includes both their appearance and voice.
Violators would be subject to fines of up to $50k per violation, or damages.
Two similar bills, per Vice:
- The Senate’s Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, which protects performers.
- Tennessee Gov. Bill Lee’s Ensuring Likeness, Voice, and Image Security (ELVIS) Act, which protects voices.
Why they matter
Already, scammers are using AI to mimic the voices of people’s loved ones to trick them into sending money or providing personal info.
And a casual glance at history reveals that the law has always had to chase technology to keep up with people’s bullshit:
- In 1999, Gary Dellapenta became the first person charged under California’s cyberstalking law after posting online ads designed to trick men into assaulting a woman who had rejected him.
- California also led the way on criminalizing revenge porn, or posting sexual content online for malicious purposes. In 2014, a man was sentenced to a year in jail for posting a topless photo of his ex on Facebook, the state’s first conviction under its revenge porn law.
But…
… attorney Carrie Goldberg, who specializes in internet harassment cases, told Vice that the fight needs to happen at a systemic level, with blame also falling on the companies making deepfake products.
“We need to pressure search engines to not guide people to these products or promote sites that publish these images and we need to require that they make content removal simple for victims,” she said.