San Francisco's Landmark Lawsuit Against Deepfake Pornography Websites


San Francisco Sues 16 ‘Undressing’ Websites Used to Create Fake Nudes

Aug 22, 2024

On August 15, 2024, San Francisco City Attorney David Chiu announced a groundbreaking lawsuit against the operators of 16 AI-powered “undressing” websites. These sites, which use artificial intelligence to create and distribute non-consensual nude images of women and girls, are now facing legal action for violating state and federal image-based sexual abuse (IBSA) laws.

The lawsuit accuses the website operators of breaching laws that prohibit:

  • Deepfake pornography
  • Revenge pornography
  • Child pornography
  • California’s unfair competition law

While the names of the websites were redacted in the public version of the suit, the city attorney’s office aims to identify the owners and hold them accountable. A report from The Verge noted that the lawsuit targets 16 of the most visited undressing websites used to create fake nude images of real women and children.

San Francisco’s landmark deepfake legal action serves two primary purposes: shutting down the 16 websites and holding their operators accountable.

How AI “Undressing” Websites Work

The process behind these nonconsensual AI nude images is shockingly simple. Nefarious users upload photos of fully-clothed individuals to these websites. Then, artificial intelligence algorithms alter the images to simulate what the person might look like undressed. The result is a pornographic image created without the subject’s knowledge or consent.

San Francisco’s filing cited how one of the undressing websites is brazenly promoting its ability to create nonconsensual nudes and deepfakes, stating:

Imagine wasting time taking her out on dates, when you can just
use [redacted website name] to get her nudes.

The accessibility of open-source AI models has made it possible for anyone to adapt AI-powered engines for various purposes, including the creation of deepfake nudes. Some sites and apps can generate these images from scratch, while others “nudify” existing photos in alarmingly realistic ways – often for a fee.

The Impact of Nonconsensual AI Nude Images on Victims

The proliferation of AI-generated deepfake nudes has had a devastating impact on countless women and girls worldwide. From celebrities like Taylor Swift to local Beverly Hills middle school students, no one seems immune to this form of exploitation.

San Francisco City Attorney Chiu emphasized the severity of the situation:

“These images are used to bully, humiliate, and threaten women and girls. The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”

The scale of the deepfake image problem is staggering and reportedly only getting worse. Officials with Homeland Security and the FBI have referred to the spread of deepfakes across the internet as a growing threat.

An investigation by the city attorney’s office revealed that the 16 undressing websites in question were visited more than 200 million times in just the first six months of 2024.

Deepfake Victim Protections and Legal Challenges

Protecting victims of AI-generated deepfake nudes presents unique challenges. Once an image is online, it’s extremely difficult for victims to:

  • Determine which websites were used to “nudify” their images
  • Remove the images from the internet
  • Identify the perpetrators behind the abuse

Yvonne R. Meré, San Francisco’s chief deputy city attorney, said that these manipulated images “don’t have any unique or identifying marks that link you back to websites,” making it nearly impossible for victims to trace their origin.

While existing laws against revenge porn and child pornography provide some legal recourse, the rapid advancement of AI technology has created new challenges for lawmakers and law enforcement agencies. There is a growing need for improved legislation and technological solutions to address this issue effectively.

Can Deepfake Porn Victims Sue?

Although no federal legislation currently exists in the United States to ban or even regulate deepfakes, bills have been introduced in Congress, and certain states have taken action.

Proposed federal legislation that would provide significant protections for deepfake victims includes the DEFIANCE Act, which passed the Senate in July 2024. If enacted into law, it would allow victims of nonconsensual sexually explicit deepfakes to sue the people and entities who create, share, or receive this type of nonconsensual content.

In 2019, California became the first state to pass laws aimed at combatting nonconsensual deepfakes. Assembly Bill 602 (AB 602) and Assembly Bill 730 (AB 730) provide victims with legal recourse. The two pieces of legislation focus on different types of intent:

  • AB 730 focuses on deepfakes which are created to influence political campaigns.
  • AB 602 focuses on deepfakes depicting sexually explicit material.

As of January 1, 2020, AB 602 creates a private cause of action against a person who:

  • Creates and intentionally discloses sexually explicit material where the person knows or reasonably should have known the depicted individual did not consent to the creation or disclosure; or
  • Intentionally discloses sexually explicit material that the person did not create and the person knows that the depicted individual did not consent to the creation of the material. A “depicted individual” is an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.

California victims of deepfake porn may now file civil lawsuits against the individual who created the material, the company or website responsible for hosting or disseminating the content, or both.

If you are a victim of deepfake porn, Dordulian Law Group’s image-based sexual abuse lawyers can help you file a claim seeking financial compensation. Damages in a deepfake porn civil lawsuit may include:

  1. Either (a) economic and noneconomic damages commensurate with the emotional distress caused, or (b) statutory damages of at least $1,500 but no more than $30,000 (or, if the act was committed with malice, up to $150,000);
  2. Punitive damages;
  3. Attorney’s fees and costs;
  4. Injunctive relief.

Laws such as the DEFIANCE Act and California AB 602 were established because websites and social media platforms like Instagram, Facebook, and TikTok had taken few steps to curb the dissemination of deepfakes, even as regulators issued repeated recommendations.

California AB 602 has been hailed as landmark legislation and a critical step toward providing victims with an avenue for securing justice.

Real-World Examples of Deepfake Cyberbullying

The impact of AI-generated deepfake nudes often extends beyond adult victims, affecting younger individuals as well. A recent incident in Beverly Hills highlights the severity of this issue in educational settings:

  • Five eighth-grade students were expelled for creating and sharing deepfake nude images of 16 of their female classmates.
  • The perpetrators used AI technology to superimpose the girls’ faces onto AI-generated nude bodies.

Unfortunately, this is not an isolated incident. Similar cases have been reported in schools across California, Washington, and New Jersey. These events underscore the urgent need for education and prevention measures to protect young people from the harmful effects of deepfake technology.

The Future of Combating AI-Generated Deepfakes

Given the gravity and scope of the matter, it’s clear that addressing the issue of AI-generated deepfakes will require a multifaceted approach:

  1. Technological Advancements: Developing more sophisticated detection and prevention tools is critical. AI researchers and cybersecurity experts are working on algorithms that can identify and report manipulated images and videos more accurately.
  2. Education and Awareness Campaigns: Increasing public awareness about the dangers of deepfake technology and teaching digital literacy skills can help individuals protect themselves and others from exploitation.
  3. Collaboration Between Stakeholders: Tech companies, lawmakers, and advocacy groups must work together to create comprehensive solutions. This includes developing better policies, improving reporting mechanisms, and supporting victims of deepfake abuse.
  4. Strengthening Legal Frameworks: As technology evolves, so must our laws. Lawmakers at the federal, state, and local levels need to update existing legislation and create new laws specifically addressing the unique challenges posed by AI-generated deepfakes.

The fight against AI-generated deepfake nudes is only beginning. However, concerted efforts from various sectors of society can help build a safer digital environment for all. As innovation continues to push the boundaries of AI technology, it is imperative that we also prioritize the protection of individuals’ privacy, dignity, and well-being.

If you have experienced sexual assault or image-based sexual abuse, don’t wait to file a claim. Contact our expert attorneys online or by phone for a free consultation today.

FAQ (Frequently Asked Questions About San Francisco’s Deepfake Porn Lawsuit)

How common is nonconsensual deepfake pornography?

In 2023, more nonconsensual sexually explicit deepfakes were posted online than in all previous years combined. A study from cybersecurity company Deeptrace found that an estimated 96% of deepfakes are sexually explicit, and that 99% of those sexually explicit deepfakes posted across the internet depict women who work in entertainment.

Is it illegal to create AI-generated deepfake nudes?

In some jurisdictions, creating and distributing AI-generated deepfake nudes without consent is illegal. However, the laws vary by region and are still evolving to keep pace with technological advancements. California provides protections to deepfake victims, and if you would like to discuss your case with a Los Angeles deepfake porn victim lawyer, contact Dordulian Law Group today at 866-GO-SEE-SAM for a free and confidential consultation.

How can I protect my images from being used to create deepfakes?

While it’s challenging to completely prevent this risk if an image of you exists anywhere on the internet, you can take steps such as being cautious about sharing photos on online platforms, using privacy settings on social media, and being aware of the risks associated with image-sharing platforms.

What should I do if I discover deepfake images of myself?

If you discover such images, document the evidence, report the content to the platform where it’s hosted, and consider seeking legal advice from an experienced deepfake porn victim lawyer. You may also want to contact local law enforcement and organizations that support victims of online abuse.

What are tech companies doing to combat deepfakes?

Many tech companies are developing AI detection tools, implementing stricter content moderation policies, and collaborating with law enforcement agencies to combat the spread of nonconsensual deepfake content.


Author

Samuel Dordulian, founder

Sam Dordulian is an award-winning sexual abuse lawyer with over 25 years' experience helping survivors secure justice. As a former sex crimes prosecutor and Deputy District Attorney for L.A. County, he secured life sentences against countless sexual predators. Mr. Dordulian currently serves on the National Leadership Council for RAINN.
