Cases of explicit AI-generated child deepfakes have risen rapidly in recent years, prompting lawmakers in several states to enact laws to protect against them.
Lawmakers in more than a dozen states have passed a flurry of bills to ensure that local prosecutors can bring charges under state law over AI-generated “deepfakes” and other sexually explicit images of children.
A deepfake is a video, photo, or audio recording that looks real but is manipulated by artificial intelligence. Deepfakes can make it appear as if someone is saying or doing something that they did not actually say or do.
According to the National Conference of State Legislatures, most of these laws target sexually explicit or pornographic video images, and some expand on existing nonconsensual intimate image laws.
States with laws to protect children from deepfakes
Governors in more than a dozen states have signed laws this year cracking down on digitally created or altered images of child sexual abuse, according to research by the National Center for Missing and Exploited Children.
According to an analysis by MultiState Associates shared with NewsNation, 14 states have laws in place with specific references to children to protect them from deepfakes and other AI-generated content.
These include Utah, Idaho, Georgia, Oklahoma, and Tennessee.
Five other states have laws that will take effect by early 2025.
In September, California closed a loophole in its law regarding AI-generated child sexual abuse images, making it clear that child pornography is illegal even if it is generated by AI.
Previous law did not allow district attorneys to investigate people who possessed or distributed AI-generated images of child sexual abuse unless they could prove that the materials depicted real people. However, under the new law, such crimes will be considered felonies.
South Dakota updated its law against child sexual abuse images in July to include images created by artificial intelligence. The law requires minimum prison sentences of one, five, and 10 years for first-time offenses of possession, distribution, and manufacturing, respectively.
There is currently no federal law addressing non-consensual deepfake pornography, but legislation has been proposed to address the issue for adults.
The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would allow victims of deepfake pornography to sue as long as they can prove the deepfakes were created without their consent.
The Take It Down Act would require platforms to remove both revenge porn and non-consensual deepfake porn.
But Justice Department officials say they already have tools under federal law to go after the perpetrators of these images.
A federal law signed in 2003 prohibits the production of visual depictions, including pictures of children engaged in sexually explicit conduct, that are considered “obscene.” The Justice Department has used the law to prosecute cartoons of child sexual abuse, noting that there is no requirement that “the minor depicted actually exists.”
Will deepfake laws work to protect children?
Justin Patchin, a criminal justice professor at the University of Wisconsin-Eau Claire and co-director of the Cyberbullying Research Center, said such laws are an important tool for criminal prosecution but will not stop deepfakes from being created, especially when the creators are other students.
“Teens are not deterred by the threat of formal punishment; they are more deterred by informal consequences, such as what their friends will think, what their parents will do and what their teachers will think of them,” he said.
He added that the laws are a “necessary but not sufficient response” to nonconsensual, explicit deepfakes.
While advances in technology have outpaced laws and will likely continue to do so, many argue that laws are needed to help law enforcement and prosecutors pursue perpetrators.
“We must communicate early and often that this is a crime and that it will be investigated and prosecuted if the evidence supports it,” Stephen Grocki, head of the Justice Department’s child exploitation and obscenity section, said in an interview with The Associated Press. “If you’re sitting there thinking otherwise, you’re fundamentally wrong, and it’s only a matter of time before someone holds you accountable.”
“These laws exist. They will be used. We have the will. We have the resources,” Grocki said.
Ventura County District Attorney Erik Nasarenko said California’s expanded law opens the door for his office to prosecute eight cases involving AI-generated content that were brought to it between last December and mid-September.
Patchin said it is more important to focus on education and awareness about the dangers of deepfakes, both in schools and among parents.
The Associated Press contributed to this article.