Can Taylor Swift sue over deepfake porn images? US laws make justice elusive for victims.
Date: 2025-04-17
Faked sexually explicit images of singer and global megastar Taylor Swift have spread across the internet and prompted outrage this week, highlighting how rapidly an explicit image doctored by artificial intelligence can circulate.
It's an unfolding controversy that also shows how few clear legal protections exist for victims in a world where AI has burst on the scene in only a few short years and can generate images of nearly anything without the consent of people depicted.
USA TODAY was only able to identify 10 states that have passed laws banning exploitative deepfake pornography, or AI-generated images, audio files or videos with sexual content. There is no federal law regulating it.
That means the question of whether the depictions are actually against the law is messy, and leaves victims like Swift many confusing options.
It's possible that the faked images could result in criminal charges, but it's more likely that victims get justice by suing companies involved in the images' creation or proliferation. That's according to Carrie Goldberg, a victims’ rights attorney who has taken on tech companies and represented clients who were victims of nonconsensual porn, stalking and harassment and, now, deepfake pornography.
Goldberg also notes that lawsuits are a much more practical solution for a wealthy celebrity than they are for someone with less influence, who might also be the victim of deepfake porn.
Because the technology to create deepfakes only became available in 2017, legal remedies are still being developed, and little has been settled as to what, exactly, is illegal.
In what states is deepfake porn illegal?
On Friday, USA TODAY was able to find only 10 states with laws that specifically address pornographic deepfakes. The earliest law, in Virginia, dates from 2019.
A small number of states have existing laws about the nonconsensual distribution of pornography – or “revenge porn” – that may also cover AI-generated pornography, said Goldberg.
But for many states, those laws are written in a way that implies the images must be of the victim’s own private body parts, not parts that are generated by AI.
This means that for now, the only states where deepfake victims have specific legal remedies are these:
- California: In 2020 California passed a law allowing victims of deepfake pornography to sue those who create and distribute sexually explicit deepfake material if the victim did not consent to it. Victims can sue for up to $150,000 if the deepfake was "committed with malice."
- Florida: In 2022 Florida passed a law that prohibits the dissemination of sexually explicit deepfake images without the victim’s consent. It is a third-degree felony with a maximum sentence of five years in prison, a $5,000 fine and five years' probation.
- Georgia: A 2020 Georgia law banned the online dissemination of falsely created pornographic images or videos.
- Hawaii: In 2021 Hawaii outlawed the intentional creation, disclosure, or threat of disclosure of nonconsensual sexually explicit deep fake images or videos. It is a Class C Felony, punishable by up to five years imprisonment and a fine of up to $10,000.
- Illinois: On January 1, 2024 Illinois added new protections for victims of deepfake porn. The law allows anyone falsely depicted in sexually explicit images or videos to sue the person who created the material. The law amends existing protections passed in 2015 for victims of revenge porn. Victims can sue for damages and, to maintain their privacy, can use a pseudonym in court.
- Minnesota: A 2023 Minnesota law made it illegal to create sexually explicit deepfakes and to use deepfakes to influence an election. This can include up to five years in prison and $10,000 in fines for distributing the images or videos.
- New York: In 2023 the state banned the distribution of pornographic images made using artificial intelligence without the consent of the subject. Violators can face up to a year in jail and a $1,000 fine. Victims also have the right to sue.
- Texas: In 2023 Texas made it a Class A misdemeanor to create sexually explicit, nonconsensual deepfake videos, punishable by up to one year in jail, a fine of up to $4,000, or both.
- South Dakota: A 2022 law made it a Class 1 misdemeanor to create deepfake pornography of an unwilling victim. If the victim is under 17 and the perpetrator at least 21, it is a Class 6 felony punishable by two years imprisonment, a fine of up to $4,000 or both.
- Virginia: The state law was passed in 2019, as part of an existing law relating to revenge porn. The update adds "falsely created videographic or still image." It is punishable by up to a year in jail or a fine of $2,500 or both.
Were the fake images of Taylor Swift a crime? Can she sue?
Even with 10 states having laws on the books covering deepfake porn, criminal laws may not be the most practical solution for a victim, said Goldberg.
For one thing, law enforcement would have to prioritize investigating a case; for another, the web of perpetrators can be large to track down, from whoever created the content to anyone who shared it.
Her focus as an attorney would be on going after the AI product – the company or platform that was used to create the deepfake porn – and the tech platforms that enabled its use, such as app stores where the product could be downloaded, and possibly even social media companies where the imagery is shared.
Taylor Swift could sue such companies or platforms, Goldberg said.
Tennessee, where Swift lives, doesn't have a law explicitly banning deepfake porn. Tennessee Governor Bill Lee proposed one earlier this month. Called the Ensuring Likeness Voice and Image Security (ELVIS) Act, it would update the state's Protection of Personal Rights law to protect the voices of songwriters, performers, and music industry professionals from the misuse of artificial intelligence, and would also cover pornographic deepfakes.
In Swift's case, the star also spends a lot of time in New York, and New York does have both criminal and civil options for victims. Even without the criminal laws, she could sue civilly, focusing on the misappropriation of her likeness.
Even if it turned out that the perpetrators were not in the U.S., Swift’s massive power and influence could help her in that case, too. Goldberg said she represented a celebrity a few years ago whose image was superimposed in porn scenes, which she fought to have removed from foreign sites.
“When you’re Taylor Swift, there’s always going to be recourse,” Goldberg said. “There are a lot more options for people who have resources like she has, where she can get law enforcement from other countries to care. That's not available to most people.”
What's next as AI image generation becomes more mainstream?
More laws governing deepfake pornography are anticipated in the coming years at the state level.
A draft federal NO FRAUD AI Act was circulated in 2023, but many in the tech industry believe it is too broad and vague to be workable.
Future legislation at a federal and state level would require the consensus of lawmakers and more celebrities using their voices to draw attention to the issue, Goldberg said.
“As a society, just like we did with nonconsensual pornography, to just turn the tables on what we'll tolerate, and make it so that people who might share or like or relink or post to this kind of content are deterred from doing that,” Goldberg said.
Contact Kayla Jimenez at kjimenez@usatoday.com. Follow her on X, formerly Twitter, at @kaylajjimenez. Contact Elizabeth Weise at eweise@usatoday.com.