Attorneys General: 'Race Against Time' to Bar AI-Generated Child Sexual Abuse Content

Prosecutors in Every State Call on Congress to Address AI-Generated Child Sexual Exploitation

Sep 7, 2023

Prosecutors in all 50 states are urging Congress to take immediate action against artificial intelligence-generated child sexual abuse content.

The Associated Press (AP) reported that attorneys general from across the nation sent a letter to Republican and Democratic leaders Tuesday urging federal lawmakers to “‘establish an expert commission to study the means and methods of AI that can be used to exploit children specifically’ and expand existing restrictions on child sexual abuse materials specifically to cover AI-generated images.”

Child sexual abuse material (CSAM) exploits children through pornographic images and videos, the AP said. Prosecutors are calling on federal officials both to research how artificial intelligence can be used as a means of child exploitation and to create legislation to “further guard against it,” the AP reported.

“We need to make sure children aren’t harmed as this technology becomes more widespread, and when Congress comes back from recess, we want this request to be one of the first things they see on their desks,” South Carolina Attorney General Alan Wilson said in the letter.

Prosecutors stressed the potential for altered images of children to be generated through artificial intelligence for the purpose of creating pornography.

Exploitation of Children Through Artificial Intelligence

In their letter, the nation’s attorneys general raised serious concerns over the potential for AI-generated CSAM to exploit children. Some of the most urgent matters cited include:

  • Overlaying the face of one person on the body of another
  • Harming previously unvictimized children by swapping their faces onto the bodies of children who were abused
  • Altering the likeness of a real child from something like a photograph taken from social media (so that it depicts abuse)

“We are engaged in a race against time to protect the children of our country from the dangers of AI,” the prosecutors wrote in the letter. “Indeed, the proverbial walls of the city have already been breached. Now is the time to act.”

“Everyone’s focused on everything that divides us,” Wilson, who spearheaded the coalition with his counterparts in Mississippi, North Carolina, and Oregon, told the AP. “My hope would be that, no matter how extreme or polar opposites the parties and the people on the spectrum can be, you would think protecting kids from new, innovative and exploitative technologies would be something that even the most diametrically opposite individuals can agree on – and it appears that they have.”

Beyond federal legislation, Wilson told the AP that he’s encouraging his fellow attorneys general to “scour their own state statutes for possible areas of concern.”

The AP noted that “there’s no immediate sign Congress will craft sweeping new AI rules,” but major tech companies have taken steps to combat CSAM and AI-generated sexual abuse content. In February, various companies confirmed their participation in a new tool called Take It Down.

Meta, as well as adult sites such as OnlyFans and Pornhub, began participating in the online tool – developed by the National Center for Missing and Exploited Children (NCMEC) – which allows teens to report explicit images and videos of themselves and seek their removal from the internet. The reporting site works for regular images and AI-generated content, according to the AP.

Increase in Deepfake Pornography Anticipated

In April, Dordulian Law Group posted a blog covering a troubling report from the AP which highlighted a potential increase in AI-generated deepfakes.

Deepfakes are defined as videos and images created digitally or altered using artificial intelligence.

The phenomenon is relatively new: the first widely known case occurred several years ago, when a Reddit member shared AI-generated clips superimposing the faces of female celebrities onto the bodies of porn actors, and the content spread across the internet.

The AP report cited specific concerns with AI and deepfakes including:

  • The technology makes it easier to create sophisticated and visually compelling deepfakes.
  • The problem could get worse with the development of generative artificial intelligence tools that are trained on billions of images from the internet and spit out novel content using existing data.

“The reality is that the technology will continue to proliferate, will continue to develop and will continue to become sort of as easy as pushing the button,” Adam Dodge, the founder of EndTAB, a group that provides trainings on technology-enabled abuse, told the AP in April. “And as long as that happens, people will undoubtedly … continue to misuse that technology to harm others, primarily through online sexual violence, deepfake pornography and fake nude images.”

Legal Support for Victims of Image-Based Sexual Abuse

Dordulian Law Group (DLG) offers free and confidential consultations for victims of deepfakes, nonconsensual pornography (revenge porn), as well as all forms of image-based child sexual abuse (IBSA) and child sexual abuse material (CSAM). Contact a member of our dedicated team today by calling 866-GO-SEE-SAM.

DLG’s unique 24/7 network of support professionals, known as the SAJE Team (Sexual Assault Justice Experts), was created by Sam Dordulian, a former sex crimes prosecutor in the Los Angeles District Attorney’s Office and member of RAINN’s National Leadership Council.

Ready to file a claim and pursue justice through a financial damages award? Our expert attorneys are available online or by phone now.

Dordulian and his team of Glendale, California, revenge porn and nonconsensual deepfake lawyers are dedicated to helping survivors obtain the justice they deserve through maximum financial damages awards.

Contact us today at 866-GO-SEE-SAM to get justice for your image-based child sexual abuse (IBSA) or child sexual abuse material (CSAM) case.
