Legal Options for California Deepfake Porn Victims Explained


Can I Sue as a Victim of Deepfake Porn?


Jan 4, 2024

Artificial intelligence (AI) has been touted for its many benefits since the release of ChatGPT in late 2022. But a darker side of AI – deepfake porn – has not received nearly as much media attention despite its potential to adversely impact countless victims.

New AI technology makes the process of creating deepfakes easy and widely accessible. In addition to producing deepfake pornography, nefarious actors can use AI to alter footage of real news anchors so they appear to report fake stories, disseminate health care misinformation, and hijack unsuspecting victims’ voices to commit financial fraud.

But deepfake porn could be the most dangerous aspect of AI with its potential to victimize millions of unsuspecting individuals. By simply accessing and altering a photo or video, a criminal can create and disseminate a deepfake.

A recent post from Wired reported that “deepfake porn is out of control.”

“Google’s and Microsoft’s search engines have a problem with deepfake porn videos. Since deepfakes emerged half a decade ago, the technology has consistently been used to abuse and harass women, using machine learning to morph someone’s head into pornography without their permission. Now the number of nonconsensual deepfake porn videos is growing at an exponential rate, fueled by the advancement of AI technologies and an expanding deepfake ecosystem,” Wired said.

This blog will review the issue of deepfake porn in greater detail and provide information for victims on how to secure justice through civil lawsuits.

How is AI-Generated Deepfake Porn Created?

Deepfakes are defined as videos or images that have been digitally created or altered using artificial intelligence.

Deepfake porn is an issue that can impact anyone. If you have a picture or video of yourself uploaded anywhere on the internet – whether on social media, LinkedIn, a company/corporate website, etc. – you could become a victim of deepfake porn.

AI-generated deepfake porn can be easily created by altering an image or video to:

  • Overlay the face of one person onto the body of another (such as an adult film actor)
  • Depict materials that swap a victim’s face onto the body of someone who was sexually abused
  • Alter the likeness of a person from an image or video taken from social media (so that it depicts a sexual scene)

A recent report from the Associated Press (AP) noted that AI deepfakes are becoming more prevalent as well as sophisticated. The AP reported:

  • Expanding technology makes it easier to create sophisticated and visually compelling deepfakes.
  • The problem could get worse with the development of generative artificial intelligence tools that are trained on billions of images from the internet and spit out novel content using existing data.

“The reality is that the technology will continue to proliferate, will continue to develop and will continue to become sort of as easy as pushing the button,” Adam Dodge, the founder of EndTAB, a group that provides trainings on technology-enabled abuse, told the AP in 2023.

“And as long as that happens, people will undoubtedly … continue to misuse that technology to harm others, primarily through online sexual violence, deepfake pornography and fake nude images.”

The first known case of deepfake porn occurred in 2017, when content created through AI-generated technology was spread across the internet. Clips depicting the faces of female celebrities were superimposed onto the shoulders of porn actors and then shared on Reddit.

Can I Sue as a Victim of Deepfake Pornography?

California was the first state in the nation to pass laws in an effort to combat deepfakes and provide victims with legal recourse.

In 2019, Governor Gavin Newsom signed Assembly Bill 602 (AB 602) and Assembly Bill 730 (AB 730) into law. AB 730 focuses on deepfakes which are created to influence political campaigns. AB 602 focuses on deepfakes depicting sexually explicit material.

Under California Assembly Bill 602, which took effect on January 1, 2020, a private cause of action is created against a person who:

  • Creates and intentionally discloses sexually explicit material where the person knows or reasonably should have known the depicted individual did not consent to the creation or disclosure; or
  • Intentionally discloses sexually explicit material that the person did not create, where the person knows that the depicted individual did not consent to the creation of the material.

A “depicted individual” is an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.

Accordingly, victims of sexually explicit deepfakes (i.e., deepfake porn) may file civil lawsuits against the individual who created the material and/or the company or website that hosted and allowed dissemination of the content.

As a victim of California deepfake porn, you may be able to recover the following damages/financial compensation through a civil claim with Dordulian Law Group’s sex crimes attorneys:

  1. Either (a) economic and noneconomic damages commensurate with the emotional distress caused, or (b) statutory damages of at least $1,500 but no more than $30,000 (or, if the act was committed with malice, up to $150,000);
  2. Punitive damages;
  3. Attorney’s fees and costs; and
  4. Injunctive relief.

While websites and social media platforms such as Instagram, Facebook, and TikTok may not be taking the appropriate steps to curb deepfake porn content dissemination as recommended by regulators, California AB 602 has been hailed as an important step in helping to provide victims with an avenue for securing justice.

An estimated 96% of deepfakes posted online are sexually explicit, according to a study from cybersecurity company Deeptrace. Additionally, 99% of those deepfakes posted across the internet depict women who work in entertainment.

“We are absolutely thrilled that Governor Newsom stood by the victims, most of whom are women, of non-consensual pornography by signing AB 602 into law,” Gabrielle Carteris, president of SAG-AFTRA, told Deadline in 2019.

Contact Our Deepfake Porn Attorneys for a Free Consultation Today

Dordulian Law Group (DLG) is a leading and top-rated California-based sex crimes firm with experience representing clients in deepfake and revenge porn cases.

Our founder and president, Sam Dordulian, is a former sex crimes prosecutor. As a Deputy District Attorney for Los Angeles County, Dordulian obtained life sentences against countless sexual predators. Today, Dordulian fights for justice on behalf of sexual assault and deepfake porn survivors in civil court.

Dordulian recently appeared on the Dr. Phil Show as a featured expert on revenge/deepfake porn. With over $200,000,000.00 in settlements and verdicts obtained on behalf of clients, Dordulian and his team provide victims of deepfake porn with much-needed peace of mind and confidence throughout the entire legal process. Don’t settle for a deepfake porn lawyer with a lack of experience or results.

To arrange for a free, confidential, and no obligation consultation regarding your deepfake porn case, contact a member of the DLG SAJE Team (Sexual Assault Justice Experts) today by calling 866-GO-SEE-SAM.

DLG’s experienced sex crimes attorneys have secured countless multimillion dollar settlements for sexual assault, revenge porn, and deepfake cases on behalf of our clients:

  • A $2,250,000.00 settlement for a survivor raped by a rideshare driver
  • A confidential multimillion dollar settlement for a client who was raped by a man she met on a ‘Sugar Daddy’ website
  • A $2,000,000.00 child sexual abuse settlement under California AB 218
  • A confidential maximum financial settlement for a woman assaulted by an employer – although the incident involved minimal contact, our attorneys were able to secure the damages award under the eggshell plaintiff rule

Can Anyone’s Internet Image be Turned Into Deepfake Nonconsensual Pornography?

Our Sexual Assault Justice Experts are here to help survivors secure justice. Contact our top-rated attorneys online or by phone for a free consultation today.

We are dedicated to helping survivors like you obtain the justice you deserve through maximum financial damages awards.


Contact us today at 866-GO-SEE-SAM to get justice for your deepfake porn case.

Author

Samuel Dordulian, founder

Sam Dordulian is an award-winning sexual abuse lawyer with over 25 years' experience helping survivors secure justice. As a former sex crimes prosecutor and Deputy District Attorney for L.A. County, he secured life sentences against countless sexual predators. Mr. Dordulian currently serves on the National Leadership Council for RAINN.



