Senator Mike Lee to Reintroduce PROTECT Act, Calls for Taylor Swift’s ‘Support’

Jan 31, 2024

Senator Mike Lee of Utah will reintroduce the PROTECT Act Wednesday. The legislation, originally proposed in September 2022, would require adult and pornography websites to implement numerous security safeguards in an effort to protect victims of image-based sexual abuse (IBSA):

  • Verify the age of all participants in pornographic images;
  • Require sites to obtain verified consent forms from individuals uploading content and those appearing in uploaded content;
  • Mandate that websites quickly remove images upon receiving notice that they were uploaded without consent.

Lee’s reintroduction of the PROTECT Act – originally spearheaded by Uldouz Wallace, a survivor of non-consensual intimate image distribution (or revenge porn) – is expected on the same day that CEOs from Meta, Snap, X, Discord, and TikTok are scheduled to testify before Congress about how their platforms facilitate child sexual abuse material.

Moreover, as news of Taylor Swift becoming the latest victim of deepfake pornography generated by artificial intelligence made headlines last week, Senator Lee appealed to the singer-songwriter for her “support” as the reintroduction of the PROTECT Act was announced.

Via X (formerly Twitter) on Saturday, Lee wrote:

“Hi @taylorswift13 and @treepaine, I have legislation to help get harmful deepfake images removed quickly, and create a way for people to sue companies that don’t take them down. I’m re-introducing the PROTECT Act next week. Would love your support!”

The Deseret News reported that Swift is “furious” about the AI images and is “considering legal action.”

“A source close to Swift said on Thursday: ‘Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge,’” a report from the Daily Mail said.

The Verge reported that one post on X featuring graphic deepfake images of Taylor Swift had been viewed more than 45 million times before eventually being removed. Conflicting media reports later claimed that X had temporarily suspended searches on the site for Taylor Swift but had since restored the function.

“Putting quotation marks around her name Monday did allow some posts to appear, though not the lewd images that have stirred controversy,” ABC 5 Cleveland reported.

The PROTECT Act’s reintroduction should serve as a warning to lawmakers, parents, and every American regarding the increasing threat of image-based sexual abuse, including deepfakes and revenge porn. In September, attorneys general from all 50 states called the need to combat the dangers of artificial intelligence-generated child sexual abuse content a “race against time.”

The PROTECT Act, which has received bipartisan support and is expected to pass, would implement much-needed safeguards to crack down on nefarious actors as well as online platforms that fail to prevent or take action against deepfakes and other types of IBSA.

In an opinion piece for The Hill, Lina Nealon stressed the need for legislation such as the PROTECT Act to be put in place immediately:

“The truth is that, despite all the tools and so-called safety changes they will parade and promote, more and more children are being harmed on their platforms. Big Tech has proven either unwilling or incapable of keeping kids safe online. And whichever one of these it is, we should all be terrified,” Nealon wrote.

The PROTECT Act would establish federal regulations to curb image-based sexual abuse. The bill, sponsored by Senator Lee and created by Uldouz Wallace, has already led to similar legislation being passed in various states as well as internationally. However, as Wallace told Dordulian Law Group, a federal law like the PROTECT Act remains absolutely critical.

“The big tech companies and adult websites are never going to voluntarily take action that’s needed to keep people from being victimized by deepfakes and IBSA without federal legislation,” Wallace said. “The PROTECT Act is the piece that’s needed to help ensure that this type of content isn’t uploaded in the first place or disseminated. But it’s also there to help anyone who’s victimized – whether you’re a celebrity like Taylor Swift or an everyday person – get those images or videos removed immediately.”

A report from the St. Louis Post-Dispatch echoed Wallace’s sentiment, with a recent headline reading: “Taylor Swift deepfakes taken offline. It’s not so easy for regular people.”

“Fake, AI-generated sexually explicit images of Taylor Swift were feverishly shared on social media until X took them down after 17 hours. But many victims of the growing trend lack the means, clout and laws to accomplish the same thing,” the Post-Dispatch said.

To sign the PROTECT Act petition, please click here.

About Foundation RA: Uldouz Wallace’s Nonprofit Working to Stop Image-Based Sexual Abuse

After creating the PROTECT Act, Uldouz Wallace founded Foundation RA, a 501(c)(3) nonprofit organization that supports children, women, and men who are victims of online image-based sexual abuse. Through Foundation RA, Wallace has pursued the mission to change the laws against all forms of image-based sexual abuse.

Sadly, anyone can become a victim of image-based sexual abuse. It only takes a single picture of someone’s face – whether uploaded to a social media platform, posted to a LinkedIn account, or even shared with a friend/significant other – to end up in the hands of a cybercriminal, and the same nightmare which happened to Uldouz, Taylor Swift, and countless other survivors could happen to you.

Image-based sexual abuse affects far more victims than many people realize:

  • 1 in 8 adult social media users have been targets of image-based sexual abuse.
  • 1 in 12 have been victims of image-based sexual abuse.
  • 1 in 20 have perpetrated image-based sexual abuse.

But Foundation RA is committed to providing resources for survivors of IBSA, including:

  1. Access to Mental Health Professionals: the organization has partnered with Better Help Therapy, which offers a large selection of licensed, trauma-informed therapists.
  2. Take Down Services: the organization offers free access to cybersecurity experts skilled in conducting extensive takedowns, including reverse face searches and fingerprinting of nonconsensual images and videos.
  3. In-Kind Corporate Opportunities: Foundation RA is dedicated to partnering with likeminded corporations and organizations in an effort to help further its mission. Corporations often engage staff in meaningful projects that give back to the community. If you and your team are interested in offering your time and talent to Foundation RA’s grassroots work, their team would love to hear from you.

For Foundation RA to continue helping survivors of IBSA and working towards passing critical legislation that keeps up with the advancements in artificial intelligence, your tax-deductible contribution is essential. To donate to Foundation RA and help protect victims of image-based sexual abuse, please click here.

Dordulian Law Group Supports Uldouz Wallace and Foundation RA’s Mission

Uldouz Wallace has been courageously telling her story publicly for many years, including through the release of a recent film entitled Hacked. In November 2022, Uldouz appeared on the Dr. Phil Show with Dordulian Law Group’s founder, Sam Dordulian, to discuss the subject of revenge porn and how survivors impacted by image-based sexual abuse can take legal action.

In California, current laws give survivors of revenge porn the option to file civil lawsuits for damages. In fact, California was the first state to outlaw revenge porn in 2013.

State law says it is illegal to:

“…distribute the images wherein the victim is identifiable, with the intent to cause serious emotional harm or distress to the victim, thus causing the victim to actually suffer such distress.”

For victims of deepfakes, California Assembly Bill 602 establishes a private cause of action against a person who:

  • Creates and intentionally discloses sexually explicit material where the person knows or reasonably should have known the depicted individual did not consent to the creation or disclosure; or
  • Intentionally discloses sexually explicit material that the person did not create and the person knows that the depicted individual did not consent to the creation of the material. A “depicted individual” is an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.

Accordingly, victims of sexually explicit deepfakes (or deepfake porn) may file civil lawsuits against the individual who created the material and/or the company/website which hosted and allowed dissemination of the content.

But without the passage of the PROTECT Act, adult websites and cybercriminals will not be held accountable, and the proliferation of image-based sexual abuse will continue to claim more and more innocent victims.

Ready to file a claim and pursue justice through a financial damages award? Our expert attorneys are available online or by phone now.

To learn more about the important work being done by Uldouz Wallace and Foundation RA, please click here.
