Pressure grows on Congress to take action against deepfake pornography

Deepfake pornography uses technology to make explicit images appear to be of someone they're not. Images using Taylor Swift's face that surfaced recently on social media have brought the issue front and center, and the problem seems to be getting worse as AI tools become more sophisticated and widespread. John Yang speaks with tech journalist Laurie Segall to learn more.


Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.

  • John Yang:

Deepfake pornography uses technology to make explicit photographs that appear to be of someone they're not. Images using Taylor Swift's face that surfaced recently have brought the problem front and center. Those images were viewed 45 million times before they were removed from the social media platform X. And victims aren't just high-profile celebrities.

    The problem seems to get worse every year as technology becomes more sophisticated and more widespread. The targets can suffer trauma so severe that it could lead to thoughts of suicide.

    Tech journalist Laurie Segall is the founder and CEO of Mostly Human Media, an entertainment company focusing on the intersection of technology and humanity. Laurie, who is behind these things? Who does it? And what are their motivations?

  • Laurie Segall, CEO, Mostly Human Media:

Well, it's such a good question. I think it's a hard answer, because there are all sorts of people who are doing this type of thing, and it's harder and harder to detect. I think one of the things I worry about is, you know, it's going to be very difficult for, let's say, Taylor Swift to go track down the perpetrators, the folks who posted those images on X.

But the problem with this is now we're creating a completely new arena for abuse, because you have the democratization of these apps that now enable you, with a couple of clicks, to create AI-generated pornography of your crush. There's an app that allows you, in a couple of seconds, to just digitally undress someone.

And so we're seeing these apps come out that are not only going to create a whole new generation of victims, but also a whole new generation of abusers, of young men who might just think this is a game, but it actually has very real harm.

  • John Yang:

This has been a phenomenon for a while, but has AI made it easier?

  • Laurie Segall:

I remember covering non-consensual pornography back in 2015, and the state laws had yet to catch up. I just remember thinking, God, this is such a horrific type of harm, where, you know, perpetrators go and post a photo of an ex on some of these online forums that were popping up. And it was really difficult for women to fight against this because the laws hadn't caught up.

Now, I think one of the reasons I am so concerned about this type of technologically advanced harm is that now you don't even have to take the photo, right? You could say this isn't real, but it looks very real, and it's hard to decipher whether it is or not. And most importantly, the harm is real.

  • John Yang:

And you say that there are laws against non-consensual pornography. But are there laws against this, against doing it to somebody you don't know?

  • Laurie Segall:

There are a handful of laws at the state level that deal with deepfake pornography; they vary in scope. And I'll give Taylor Swift as an example. She has jurisdiction here in New York, and so she might be able to file criminal or civil charges.

But in order for Taylor to actually go do that, she would have to track down the criminals behind this, which would mean a lot of time and resources that maybe someone like Taylor Swift has, but most people do not. A difference with deepfake pornography and the laws that exist here in New York is that you have to prove intent to harm.

So then Taylor Swift would actually have to go and say, you know, they wanted to harm me. But that's harder to prove with deepfake pornography; people could say they wanted to make money or gain notoriety. And so those laws, which vary in scope, aren't uniform. They aren't similar to the ones covering non-consensual pornography. And there are a lot of nuances that we have to talk about.

  • John Yang:

    What sort of changes in the laws would you like to see happen?

  • Laurie Segall:

When this happened, I immediately got on the phone with so many of the women and the lawyers who have been at the forefront of fighting non-consensual pornography, and they've been talking about deepfake pornography and its threat for the last couple of years. And that conversation is even more pressing today.

Mary Anne Franks, who helps a lot of these victims, said that there's a bipartisan federal bill, right? This is a federal bill that's been introduced, called the Preventing Deepfakes of Intimate Images Act, that would actually give recourse in the right way to victims, from both a criminal standpoint and a civil standpoint. Laws like that, I think, are needed.

Separately, I also think, you know, tech companies need to be instituting a lot of technology at a quicker rate to be able to fight this. I almost want to say it's like AI needs to fight AI.

  • John Yang:

    The fact that this has now happened to someone as high profile as Taylor Swift, is that going to drive changes in the law?

  • Laurie Segall:

I mean, I hope so. I mean, she's created microeconomies; people pay attention. She's helped fundamentally shift the music industry, because she fought for ownership over her songs. I mean, imagine if someone like Taylor Swift could take on this problem and fight for the future ownership of our bodies online as women.

I think I would put my eggs in Taylor Swift's basket. I mean, I hate that this happened to her, but Taylor Swift is just the tip of the iceberg, and I think what happened to her represents a threat for all young women and all girls when it comes to the future of our consent online.

  • John Yang:

    Laurie Segall of Mostly Human Media. Thank you very much.

  • Laurie Segall:

    Thank you.
