Aug 22, 2024
On August 15, 2024, San Francisco City Attorney David Chiu announced a groundbreaking lawsuit against the operators of 16 AI-powered “undressing” websites. These sites, which use artificial intelligence to create and distribute non-consensual nude images of women and girls, are now facing legal action for violating state and federal image-based sexual abuse (IBSA) laws.
The lawsuit accuses the website operators of violating state and federal laws that prohibit deepfake pornography, revenge pornography, and child pornography.
While the names of the websites were redacted in the public version of the suit, the city attorney’s office aims to identify the owners and hold them accountable. A report from The Verge noted that the lawsuit targets 16 of the most visited undressing websites used to create fake nude images of real women and children.
San Francisco’s landmark deepfake legal action serves two primary purposes: shutting down the offending websites and holding their operators accountable for the harm they have enabled.
The process behind these nonconsensual AI nude images is shockingly simple. Users upload photos of fully clothed individuals to these websites, and artificial intelligence algorithms then alter the images to simulate what the person might look like undressed. The result is a pornographic image created without the subject’s knowledge or consent.
San Francisco’s filing cited how one of the undressing websites brazenly promotes its ability to create nonconsensual nudes and deepfakes, stating:
“Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes.”
The accessibility of open-source AI models has made it possible for virtually anyone to adapt these engines for their own purposes, including the creation of deepfake nudes. Some sites and apps can generate these images from scratch, while others “nudify” existing photos in alarmingly realistic ways, often for a fee.
The proliferation of AI-generated deepfake nudes has had a devastating impact on countless women and girls worldwide. From celebrities like Taylor Swift to local Beverly Hills middle school students, no one seems immune to this form of exploitation.
San Francisco City Attorney Chiu emphasized the severity of the situation:
“These images are used to bully, humiliate, and threaten women and girls. The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”
The scale of the deepfake image problem is staggering and reportedly only getting worse. Officials with Homeland Security and the FBI have referred to the spread of deepfakes across the internet as a growing threat.
The city attorney’s office investigation revealed that the 16 undressing websites in question were visited more than 200 million times in just the first six months of 2024.
Protecting victims of AI-generated deepfake nudes presents unique challenges. Once an image is online, it’s extremely difficult for victims to determine who created it, have it removed from every site where it appears, or stop it from spreading further.
Yvonne R. Meré, San Francisco’s chief deputy city attorney, said that these manipulated images “don’t have any unique or identifying marks that link you back to websites,” making it nearly impossible for victims to trace their origin.
While existing laws against revenge porn and child pornography provide some legal recourse, the rapid advancement of AI technology has created new challenges for lawmakers and law enforcement agencies. There is a growing need for improved legislation and technological solutions to address this issue effectively.
Although no federal legislation currently exists in the United States to ban or even regulate deepfakes, bills have been introduced in Congress and certain states have taken action.
Federal lawmakers have proposed legislation that would provide significant protections for deepfake victims, most notably the DEFIANCE Act (Disrupt Explicit Forged Images and Non-Consensual Edits Act).
The DEFIANCE Act passed the Senate in July 2024. If enacted into law, it would allow victims of nonconsensual sexually explicit deepfakes to sue the people and/or entities who create, share, or receive this type of content.
In 2019, California became the first state to pass laws aimed at combating nonconsensual deepfakes. Assembly Bill 602 (AB 602) and Assembly Bill 730 (AB 730) provide victims with legal recourse, and the two pieces of legislation target different uses of the technology: AB 602 gives victims of sexually explicit deepfakes a civil cause of action, while AB 730 addresses materially deceptive deepfakes of political candidates distributed close to an election.
As of January 1, 2020, AB 602 creates a private cause of action against a person who creates and intentionally discloses sexually explicit material depicting someone who did not consent to its creation or disclosure, or who intentionally discloses such material that they did not create while knowing the depicted individual never consented to its creation.
California victims of deepfake porn may now file civil lawsuits against the individual who created the material, the company or website responsible for hosting or disseminating the content, or both.
If you are a victim of deepfake porn, Dordulian Law Group’s image-based sexual abuse lawyers can help you file a claim seeking financial compensation. Damages in a deepfake porn civil lawsuit may include economic damages, noneconomic damages such as emotional distress and pain and suffering, punitive damages, and attorney’s fees and costs.
Measures such as the DEFIANCE Act and California’s AB 602 were introduced because websites and social media platforms like Instagram, Facebook, and TikTok had taken few steps to curb the dissemination of deepfakes, even after regulators issued repeated recommendations.
California’s AB 602 has been hailed as landmark legislation and a critical step toward providing victims with an avenue for securing justice.
The impact of AI-generated deepfake nudes often extends beyond adult victims, affecting younger individuals as well. In a recent incident that highlights the severity of this issue in educational settings, students at a Beverly Hills middle school created and circulated AI-generated nude images depicting their classmates, leading the school district to expel several of the students involved.
Unfortunately, this is not an isolated incident. Similar cases have been reported in schools across California, Washington, and New Jersey. These events underscore the urgent need for education and prevention measures to protect young people from the harmful effects of deepfake technology.
Given the gravity and scope of the matter, it’s clear that addressing the issue of AI-generated deepfakes will require a multifaceted approach: stronger legislation, better detection and content-moderation technology, accountability for the platforms that host this material, and education and prevention efforts in schools and communities.
The fight against AI-generated deepfake nudes is only beginning, but concerted efforts from every sector of society can move us toward a safer digital environment for all. As the push to innovate and expand the boundaries of AI technology continues, it’s imperative that we also prioritize the protection of individuals’ privacy, dignity, and well-being.
If you have experienced sexual assault or image-based sexual abuse, don’t wait to file a claim. Contact our expert attorneys online or by phone for a free consultation today.
In 2023, more nonconsensual sexually explicit deepfakes were posted online than in all previous years combined. A study from cybersecurity company Deeptrace found that an estimated 96% of deepfakes are sexually explicit, and that 99% of those posted across the internet depict women who work in entertainment.
In some jurisdictions, creating and distributing AI-generated deepfake nudes without consent is illegal. However, the laws vary by region and are still evolving to keep up with technological advancements. California provides protections to deepfake victims, and if you would like to discuss your case with a Los Angeles deepfake porn victim lawyer, contact Dordulian Law Group today at 866-GO-SEE-SAM for a confidential and free consultation.
While it's difficult to eliminate this risk entirely if any image of you exists online, you can take steps such as being cautious about which photos you share, using privacy settings on social media, and staying aware of the risks associated with image-sharing platforms.
If you discover such images, document the evidence, report the content to the platform where it's hosted, and consider seeking legal advice from an experienced deepfake porn victim lawyer. You may also want to contact local law enforcement and organizations that support victims of online abuse.
Many tech companies are developing AI detection tools, implementing stricter content moderation policies, and collaborating with law enforcement agencies to combat the spread of nonconsensual deepfake content.
Sam Dordulian is an award-winning sexual abuse lawyer with over 25 years' experience helping survivors secure justice. As a former sex crimes prosecutor and Deputy District Attorney for L.A. County, he secured life sentences against countless sexual predators. Mr. Dordulian currently serves on the National Leadership Council for RAINN.