
There’s no such thing as revenge porn

Terms like “revenge porn” have a way of papering over what’s really happening – a crime. First, Jasmine Mithani of The 19th News joins host Krys Boyd to discuss the damaging effect of labeling all sexually explicit images as “porn” – and how victim advocates are working to rebrand these terms. Her article is “They’re crimes — so why do we keep calling them ‘porn’?” Then, Hollie Toups, whose intimate pictures were shared without her consent, shares her experience working to pass the TAKE IT DOWN Act in Congress. Her article is “I was the victim of revenge porn. Congress can protect people like me.”


    Transcript


    Krys Boyd [00:00:00] Although reasonable people might disagree over whether porn makes the world a better place, the Supreme Court has made it clear that adults are allowed to access sexually explicit content if they want. What nobody has a right to make or to view, though, is content that was created or spread without the subject’s permission, or that exists because someone else experienced sexual abuse. To put it bluntly, revenge porn and child porn are not pornography at all. They’re not entertainment. They are exploitation. From KERA in Dallas, this is Think. I’m Krys Boyd. Changing the language around this stuff might seem less urgent than catching the people creating and spreading abusive content, but my guest notes that the language we have to talk about these things can shape how we feel about the people involved. Jasmine Mithani is a reporter for The 19th, which published her article “They’re crimes. So why do we keep calling them ‘porn’?” Jasmine, welcome to Think.

    Jasmine Mithani [00:00:57] Thanks for having me.

    Krys Boyd [00:00:59] Just to start with the very basics here, for something to count as pornography, which is a form of entertainment, it has to be consensual, right? All parties involved in its creation have to be on board.

    Jasmine Mithani [00:01:11] That’s absolutely correct.

    Krys Boyd [00:01:13] Okay. And the Supreme Court has ruled multiple times over the years that consensual adult entertainment is protected under the Constitution.

    Jasmine Mithani [00:01:22] Yes. There’s really only two exceptions. And that is obscenity, which there’s a really specific, narrow definition for. And also, you know, child pornography is not protected because it’s a crime.

    Krys Boyd [00:01:35] One place that things can get complicated is that you could have two nearly identical sexually explicit images of adults. If one is produced and distributed with the subject’s permission, it’s 100% legal, but the other is evidence of a sexual crime.

    Jasmine Mithani [00:01:52] Yes. And I think there’s also some nuance here, because there are also images that were originally taken consensually but then are being disseminated nonconsensually. So for instance, if someone takes a nude photograph of themselves, sends it to a partner, it is meant to be private. And then that partner betrays that trust and shares it. That is also a crime now in nearly all states. I think there’s only one state that doesn’t consider that a crime.

    Krys Boyd [00:02:22] Yeah. I mean, in the digital era, these things are often circulated on the same platforms, though, that host legitimate sexual content, right? It can be difficult to know whether something a person is viewing was made and distributed with the permission of the subject.

    Jasmine Mithani [00:02:40] Absolutely. There are some companies now, you know, sites that host pornography, that have been trying to work on that problem through verification of all of the people who are on the platform. Pornhub, for instance, does that. It is an imperfect process. There are still people that get around it. But several years ago, Pornhub purged the majority of its content and started requiring people to verify that it was actually them and that it was consensual. Again, you know, there are people that work around that. There are still cases, but companies have been trying to work on it, especially after some high-profile cases of nonconsensual imagery being disseminated on their platforms.

    Krys Boyd [00:03:30] Okay. I want to talk about what is sometimes called revenge porn. We will discuss in a minute why it should not be referred to as pornography. But first off, for folks who maybe grew up long before the digital age, can you explain how it happens that images that a subject may have consented to taking in the first place, and may have themselves chosen to share with one or a limited number of people, become a weapon against them?

    Jasmine Mithani [00:03:54] Yes, and I think it’s useful to note that revenge porn existed before the internet. Lots of people shared nude images that they had received consensually with other people, without the subject knowing that they would be shared. Now, it’s often a technique that’s used to harass, to intimidate, as a threat toward people, and the people who are most often the victims of this crime are overwhelmingly women. It also disproportionately affects LGBTQ+ people as well. And it is a way to wreak havoc on someone’s life, especially when so much of our reputation is tied to our digital presence. There’s been research that shows people whose images have been shared without their consent online completely retreat from online life. Imagine having a prospective employer Google your name and all of these horrific images come up instead.

    Krys Boyd [00:04:55] And this is to say nothing of people who did technically consensually share images, but were reluctant to do so in the first place, felt pressured to do so.

    Jasmine Mithani [00:05:07] Right. That is also a situation where somebody is coerced into producing images. That’s something that happens in relationships. Also, I think it’s worth noting a lot of people do this consensually in relationships. They do it privately. Sexual privacy is not, you know, an inscribed right, necessarily, in any of the legal codes, but it’s something that people generally have the right to do.

    Krys Boyd [00:05:35] How often does this happen without the subject ever knowing their image has been exploited in this way, shared without their consent?

    Jasmine Mithani [00:05:43] Many times, these are part of targeted harassment campaigns. You know, in many of the cases that I’ve read through, the person who is disseminating the photos is someone the victim knows, especially in this case of revenge, which is not really revenge; there’s nothing that anyone did wrong. But it’s usually seen as retribution for breaking up with somebody. So it’s honestly part of a pattern of abuse a lot of the time. This person is threatening them, they’re sharing it. A lot of the times, when we think about it happening in schools, when images are being shared, it’s part of bullying. People see it, people bring it up, it becomes sort of this campaign of terror against people. And it’s often something where, you know, an abusive ex is using it to try to extort someone as well by saying, you know, I’ll take these images down unless you pay me, unless you have sex with me. There is the possibility that people don’t know about it, but it often fits into a larger pattern of someone trying to attack a person.

    Krys Boyd [00:06:56] You mentioned that the unauthorized sharing of intimate, explicit images predates the digital era and the internet, but, I mean, it does seem to be much more common now. Do most analysts think that this is really something that is evocative of our time?

    Jasmine Mithani [00:07:14] Yes. It’s something that, especially in the early aughts, completely took off. There were entire websites dedicated to hosting these kinds of images and sort of reveling in this and, you know, making fun of people who tried to get their information taken down. Because one thing is that a lot of the times, it’s not just the image that’s shared, it’s the image and identifying information. It’s people’s addresses. You know, it’s not just an image that somebody could happen to recognize someone from. It’s, you know, their entire reputation being put out on the internet and then also disseminated so widely. One thing that is really difficult is that if something is posted on one website, it is often posted and shared on others, and it can be really hard for people who are experiencing this to track down all of the different ways that their photos and images are being shared, especially because many of these sites have very different levels of recourse that people can seek.

    Krys Boyd [00:08:17] So this can even get out of the control of the person who initially intended to harm the victim, right? They may be posted somewhere, and then other people can take that image and share it elsewhere.

    Jasmine Mithani [00:08:26] Yes, I think that is often the point, is that it is cruelty. It is to inflict harm. It is to make someone very scared. Often you see someone’s images posted along with incorrect information, for instance, saying they are soliciting sex, or contact them at this number, or things like that. So often there is this element of trying to get strangers to attack this person, who might not even realize that it isn’t consensual and that it’s not actually this person, you know, posting a personals ad.

    Krys Boyd [00:09:05] And as you mentioned, you know, the term revenge implies that the survivor of this somehow deserves this kind of criminal humiliation, which is outrageous. What have you learned through your reporting, Jasmine, about how the phrase revenge porn affects the individuals who are touched by this abuse?

    Jasmine Mithani [00:09:25] Yeah. So it’s incredibly victim-blaming. To say that this is revenge implies that someone did something wrong. And also one aspect of that, too, is that people are often blamed for taking these images in the first place, even though there’s no reason they should feel shame. It’s completely normal. It’s something many adults do. And one thing I also heard, too, is that, you know, the word porn can be very confusing for people. It’s very hurtful for people to have their abuse called porn, because that’s not what it was. It implies consent. When you say the word porn, it implies, you know, a production. And it’s something that they were not willingly participating in. And also, you know, from the perspective of the adult industry, it’s not something they want to be associated with. You know, they don’t condone this. One thing that was pointed out to me is, you know, sex workers or adult film stars are people who have their images taken and disseminated without their consent many, many times. That is something that affects them very greatly. It’s something that is taken very seriously. And they do not want to be seen as complicit in this form of abuse.

    Krys Boyd [00:10:37] Sure. And for people who earn their living this way because they have chosen to do so, enough of this happens and it becomes hard to get a paycheck.

    Jasmine Mithani [00:10:47] Yes, absolutely. It ties into that. It’s a lot harder for people to make a living when so much of their content has been stolen and shared for free.

    Krys Boyd [00:10:58] And by the same token, I guess, you know, the porn half of the phrase revenge porn sort of whitewashes the actions of the people who share these images illegally. Because, as you said, broadly speaking, there’s consensus that adults can do what they like, and this is not an abnormal behavior necessarily for most adults. Suddenly the blame is off the person who is sharing this.

    Jasmine Mithani [00:11:22] Yes, absolutely. You know, it seems like people did it because they wanted to, 100%.

    Krys Boyd [00:11:28] Agreed. One term that almost every advocate finds particularly problematic, Jasmine, is child pornography. And if we recognize that, first of all, no minor child is capable of providing consent to use their images for sexual gratification under any circumstance, then the thing that gets called child pornography isn’t even a real thing, right? It is always a reflection of abuse.

    Jasmine Mithani [00:11:53] Yes, it is. The term child pornography is very confusing. It’s confusing to the public. It’s also confusing to juries, because they know that pornography is legal, but then you add child in front of it and you’re basically putting “illegal” and “legal” together, and it can be very confusing. It’s also really damaging for survivors to, you know, even have the implication that, when they were kids and they couldn’t consent, this is something that they were a part of willingly, which is absolutely not the case. And it’s not even just the phrase child pornography. It’s also the way we talk about crimes involving sex and children. One thing that really stuck with me is at the National Center for Missing and Exploited Children, which runs the CyberTipline, where people can share tips about child sexual abuse material that they find online so it can be investigated, they just call these crime scene photos. That is one thing that is particular about child sexual abuse material: there isn’t that gray area where somebody took a picture consensually and then shared it. Oftentimes child sexual abuse material is visual documentation of, as one advocate told me, some of the most horrific crimes that you can think of.

    Krys Boyd [00:13:20] As you mentioned, Jasmine, a lot of child welfare advocates prefer to refer to this content around children as child sexual abuse material. But the sort of original phrase, child pornography, has actually been perpetuated by laws designed to eliminate the practice or criminalize the practice.

    Jasmine Mithani [00:13:41] Yes. And there is some distinction there where, you know, in the language that is used by child safety advocates, there’s unity around saying child sexual abuse material. But when they are in a legal context, they have to use the words child pornography, because that’s what the statute says. That’s what the actual definition is when they’re charging people with crimes.

    Krys Boyd [00:14:05] What kinds of efforts have some lawmakers made to excise that term from at least federal statutes?

    Jasmine Mithani [00:14:13] Yeah. So the National Center for Missing and Exploited Children has been pushing to change that language in laws. There’s currently a bill right now called the EARN IT Act that would change all of the references to child pornography to child sexual abuse material. And also, since, you know, Congress can move kind of slowly on these things, they’ve been taking it to states as well. And over the past two years, ten states have changed their state codes to use child sexual abuse material instead of child pornography.

    Krys Boyd [00:14:49] Here again, we have this problem, right, that the language around this might shape the way people think about it. How does the term porn or pornography, applied to criminal exploitation, put the focus on, like, the sexual gratification of the user as opposed to the exploitation of the subject?

    Jasmine Mithani [00:15:07] Yeah, I think a lot of people say that they feel really violated when this happens. You know, there’s a lot of research to show that when people’s images are shared nonconsensually, or, you know, even in today’s world with artificial intelligence, where there can be fake images of somebody in a compromising position shared, basically if somebody has a face, these images can be generated. People say things like, I feel like I’ve been raped. They often have the same effects. They have similar symptoms to people who have been physically sexually assaulted. And that’s one reason why, you know, they specifically want to use words such as image-based sexual abuse, because it gets at what it is. It’s an image, it’s sexual. It tells you a little bit about the content, and it’s abuse. And that is the thing that was emphasized to me over and over again throughout the course of my reporting, is that we have to call abuse what it is. Anything else makes it seem like it is more socially acceptable, and it is also playing into the language of the abuser. You know, one word that often comes up when describing things in the AI-generated space is deepfake, and it was pointed out to me that, you know, that’s the name of the user on Reddit who started creating these. And do we really want to use the language of the abuser when we are describing things that have devastating effects on people’s lives? And it’s, you know, a really expansive form of gender-based violence.

    Krys Boyd [00:16:48] We’re going to talk more about deepfakes in just a moment. But I do want to say I first heard that alternative term child sexual abuse materials a few years ago, and it made so much sense to me. I have tried to use it ever since. I mean, it’s not like I’m talking about this stuff a lot, but I’m curious as to whether these updated terms are catching on in common parlance.

    Jasmine Mithani [00:17:11] Yes. One thing that I’ve found interesting, and, you know, part of the reason I started reporting on this story, is because I didn’t know what to call them, you know, as a reporter, as a journalist, particularly in a context where I felt like everybody was using different words. And I had known a little bit about the revenge porn fight, and through reporting on this article I learned a little bit more about the decision to use child sexual abuse material. And one thing is that people often need to be met where they are at as well. For instance, you know, image-based sexual abuse, people might not understand that term, but they know what revenge porn is. So what a lot of advocates do, and what I think journalists should do, is say, you know, the nonconsensual sharing of intimate images, formerly known as revenge porn, or also known as revenge porn. Because this has been in the public consciousness for, you know, over ten years, it’s really hard to change language overnight. Although I want to make the point that a lot of researchers and a lot of, you know, victims’ rights advocates and a lot of survivors do already use this language. It’s more the legal codes. And also legal language is often much more specific because, you know, you just can’t write the words revenge porn in a law. You have to provide a lot of definitions. But I think the media plays a huge part in popularizing terminology.

    Krys Boyd [00:18:42] Okay, back to deepfakes. In the spirit of meeting people where they are, I think some listeners know exactly what this means and what this technology involves. Others may have heard the term but not know exactly how it can be used in this context. Effectively, it can be made to appear that sexually explicit images of someone exist, when in fact that individual may never have even posed for those images, regardless of the circumstance.

    Jasmine Mithani [00:19:07] Yes. So originally, deepfakes were taking the likeness of a person, usually a celebrity woman, and putting it onto an adult film actress’s body while they’re performing sexual acts, through technology. So I want to point out here, and this is something that has been reported on by Samantha Cole for 404 Media, that there’s exploitation in both ways here, right? Two people have had their images taken: the adult film star, and also the person whose face is being superimposed on this body. Now, I think it’s worth saying that deepfake does generally feel like a catch-all term for ultra-realistic manipulated media. A lot of the time it is superimposing someone’s face or someone’s voice over another video in order to look extremely realistic. This is something we’ve seen with politicians as well. There is, you know, a deepfake of Nancy Pelosi that went viral. That is another example of how they’ve been deployed. There is also the fake call that tricked a lot of people, of Joe Biden telling people not to vote in the primaries. That was also deepfake audio.

    Krys Boyd [00:20:20] So the platforms that distribute pornographic content for profit, sites like Pornhub, do they feel like they are able to be effective in rooting out these deepfakes?

    Jasmine Mithani [00:20:35] You know, I did not speak with any specific representatives for this story, but one thing that has stood out is that there have been stories where people have been able to report rather quickly. I think, particularly recently, after Pornhub made changes with the verification, it is a lot easier to take down content. There are still people who struggle. And I think a lot of people were incredibly harmed before these measures were put in place. Also, particularly, some sites let you download videos, so you could get it taken off the site, but how many people had downloaded it? How many times is it being re-uploaded over and over again? I think it’s hard to gauge how effective it is. I would guess it’s certainly much better than it has been in the past, but it’s still difficult for people who are going through it.

    Krys Boyd [00:21:33] So as you mentioned, Congress can move slowly. The law can sometimes be slow to change. We do have laws that were written with technologies like Photoshop in mind. This is an older photo editing software platform. Could those same laws be applied to AI generated sexual abuse content?

    Jasmine Mithani [00:21:51] It really depends on the law. So for instance, the same laws that outlaw child sexual abuse material, or, you know, technically, in legal terminology, child pornography, do mention computer-edited or manipulated images. They really were thinking about Photoshop, but now, because that language is sort of general, it can apply to AI-generated content. There is an issue where a lot of these, you know, so-called revenge porn laws that ban the nonconsensual sharing of intimate images are making the sharing of images illegal, and the images are often defined as images that were previously shared consensually. Which does kind of leave open this space for, well, what if the image was made by a computer?

    Krys Boyd [00:22:41] Do you know whether civil law proceedings have had much success in getting compensation for people victimized in these ways?

    Jasmine Mithani [00:22:48] It has varied greatly. You know, in Texas, a jury awarded a woman $1.2 billion in damages after her ex-boyfriend circulated intimate images online after they broke up. I think, you know, particularly now as these laws have been adopted in many different states, there are remedies for people. They can’t make things right, but they can help.

    Krys Boyd [00:23:21] You know, I have to say, reading your story, Jasmine, made me think about the messages that we send to young people, teenagers who are just sort of exploring their own sexuality. And I think girls are more often warned not to share these images. I don’t know how often boys are explicitly told by authority figures that they have no right to do anything with images they might receive.

    Jasmine Mithani [00:23:46] Yeah, there’s a large education gap. And, you know, I talked to a researcher, Sophie Maddocks. She did her PhD thesis all on the effects of these, like, digital sexual forgeries on adolescent girls. And she mentioned, you know, whenever she talks to reporters, she gets to a point where they talk about the patriarchy and how there can be a really large education gap. Like, there’s teaching people not to take these images, which I’m not sure how effective that has been. But there’s also, you know, telling people that this is not okay, especially young boys, though, you know, young girls also share images, you know, as a form of bullying. And it’s important to emphasize that this is not okay. It is a crime. It is literally a crime when they’re underage. And it’s unclear whether that’s a conversation for parents to have with their kids or something that needs to be taught in school. There’s definitely a lot of room to increase literacy around these topics.

    Krys Boyd [00:24:56] Is anybody talking these days, Jasmine, people who want to protect individuals from this kind of abuse, about images that are entirely fake, fake body, fake face, but may contribute to a normalization of abuse? I’m thinking about the fact that now explicit images of children who have never existed might be created and sought out by individuals who would otherwise go looking for images of real sexually abused children.

    Jasmine Mithani [00:25:23] Legislation is still catching up, I think, especially when it comes to, you know, completely AI-generated child sexual abuse material. And I think one thing that’s important to note is that, you know, some of these image-generation artificial intelligence platforms have had child sexual abuse material in their training data sets. That’s what makes these possible. And that’s something that there needs to be much more accountability around.

    Krys Boyd [00:25:54] Absolutely. And it feels like there could be a setting coded in that just doesn’t permit these platforms to produce images that are requested but are not okay.

    Jasmine Mithani [00:26:07] Yeah. And I think, you know, that’s something where people are very creative. You know, it doesn’t feel responsible to share some of the prompts that people put in, but there are ways people have been able to get around, you know, saying child. And a lot of companies aren’t necessarily prepared for somebody to be this motivated to create these kinds of harmful images and don’t necessarily have the appropriate protections in place.

    Krys Boyd [00:26:39] There is a somewhat hopeful detail here, you know, that we’ve mostly moved away from this term sex tape. To be clear, these were typically stolen from people who did not intend for them to be public, often celebrities. But the implication was that celebrities perhaps deliberately made and released these things.

    Jasmine Mithani [00:26:58] Yes, it’s something that, you know, you don’t hear spoken about as much, sex tape. And, you know, even then, I think it sort of evolved into people saying the word revenge porn, which, you know, is not a term that emphasizes the criminal aspect of it, but people are sort of aware that it is nonconsensual, and now there’s nonconsensual intimate imagery. And that is one thing that does feel like culture has been changing. And I think it is useful to know that, you know, a lot of these survivor-led and victim advocacy groups really use these terms already, and it’s just kind of a matter of getting them to permeate into the mainstream, and also making sure that we’re not reinforcing things. You know, especially, I know, as a journalist, headline character counts can be really short. But there’s an example of, you know, if we’re talking about a child who’s been sexually abused, we don’t say an adult had, like, an illicit affair with a ten-year-old. No such thing exists, right? The adult sexually abused a ten-year-old. And it’s really important to make those choices in our language, especially when we’re reporting. And I think that is something where, you know, I do see change. Especially in tech reporting, I think people say nonconsensual, sometimes they say nonconsensual porn, but there’s at least that aspect where they’re making sure it’s noted that, like, hey, these people did not participate in this willingly, and it’s a crime and it’s hurting people.

    Krys Boyd [00:28:43] If you’ll forgive the awkward nature of the question, I’ll ask what some people may be thinking, which is for adults who want to use consensual pornography featuring other adults as a form of entertainment, is there anything they can do to ensure the images they are engaging with have not been generated at the expense of some innocent person?

    Jasmine Mithani [00:29:05] That is a great question, and some things that I can think of off the top of my head are, you know, investing in people. I think where kind of the most exploitation occurs is in, like, free porn. So I know a lot of people advocate for paying for this. You know, it’s a form of entertainment, it’s a form of labor, and it deserves to be compensated. And that can be one way. It’s not foolproof. But, you know, when you follow specific individuals or specific companies, smaller companies as well, that can be one way to more directly support performers.

    Krys Boyd [00:29:44] We’ve been talking about calls for changing the way we talk about sexual images shared without the consent of the subject. Many people also think we need to change what we do for people who are exploited this way. More than a decade ago, Hollie Toups was horrified to discover intimate pictures of herself uploaded to the internet by someone who wanted to hurt and humiliate her. Her suffering was compounded by just how hard it was to get platforms that featured the photos to remove them, and now she is an active supporter of legislation designed to protect and empower victims of these crimes. Hollie, welcome to Think.

    Hollie Toups [00:30:20] Thank you. Thank you for having me.

    Krys Boyd [00:30:22] So you want to help other people, in part because you have lived this nightmare yourself. Will you share as much as you feel comfortable sharing about how you first learned that pictures of you meant to stay private ended up posted in very public places online?

    Hollie Toups [00:30:38] Yeah. So like you said, about a decade ago, I was informed by someone in my community that there was a website with nude and topless photos, and they claimed that my photos were on it, which I thought there was no way. You know, I didn’t consent to this, so it clearly had to be a mistake. And so they gave me the website, and when I looked at it, it was me. There were topless photos of myself, and it was quite shocking. And then it was shocking to scroll through and see just how many women were posted there alongside me in my community. And, you know, just so many thoughts go through your head as to really how that happened and how that was something that was allowed.

    Krys Boyd [00:31:30] Yeah. And in addition to having all this out there, there were like terrible comments and threats attached to the pictures.

    Hollie Toups [00:31:38] Yeah. So there was, I can’t remember if it was right away or if it was something they added later, but since it was user generated, people were at some point allowed to leave comments. And they ranged anywhere from the usual degrading comments that people say about women online to actual threats, and sometimes sharing personal identifying information about, you know, myself and the other women: where we worked, who our family was, you know, people’s addresses. So that obviously added a lot more concern and fear to all of that.

    Krys Boyd [00:32:18] So your first step was to contact the website and say, please take these down. What was their response to you?

    Hollie Toups [00:32:25] Yeah. So I mean, I, you know, was kind of naive, and I just thought, oh, this was a mistake. They thought that I consented to this. And I just sent them an email, you know, like, hey, these were posted without my consent. Can you please remove them? And I was really confident that they would. And, you know, they responded that they would remove them if I would just enter my credit card number. And I was 100% sure that had to be illegal. Because, you know, these were my photos, and now you’re trying to extort money from me to remove them.

    Krys Boyd [00:33:10] Did they give you some reason why they were charging a fee for this?

    Hollie Toups [00:33:15] They did not. The conversation was pretty short, because I honestly became enraged. And I just was like, you know, paraphrasing, I’m not going to pay you to remove photos posted without my consent, but I will, you know, seek out a lawyer. And so that was one of the things that I did next.

    Krys Boyd [00:33:38] Yeah. So you contacted a lawyer for help, but then you ran into the challenges of section 230 of the Communications Decency Act. Remind us what section 230 does.

    Hollie Toups [00:33:51] Yeah. So I first went to law enforcement, because, you know, after I was like, I’m going to see a lawyer, I realized that I didn’t really have the money to do that. And so I was like, well, this has to be illegal. So I went to law enforcement, and they didn’t have the tools to help me at the time. You know, it was still something so new. So first I had to kind of educate them on what it was, but then they really didn’t have the tools because of the sophistication of the internet. And so then, yeah, I went to an attorney. And, you know, I’m not an attorney, and I don’t have all the details on Section 230, but the gist of it is that there is a clause in there that protects websites when it comes to content posted by users, to where the website would then not be responsible for that, and they would have protection under that clause. And that’s what both the attorney and law enforcement told me, and then at some point the website bragged about having protection from that.

    Krys Boyd [00:35:00] Yeah. I mean, just to note, Section 230 was enacted in 1996, which, given the rapid changes we have all lived through in the digital era, might as well have been a century ago. But effectively, the idea was that online platforms shouldn’t be held responsible for what, like, random individuals posted using their platform. How did you finally, but not really finally, temporarily get your photos removed?

    Hollie Toups [00:35:28] Yeah. So, you know, as time started to pass, the photos started to spread to other websites and be shared, and I was not really getting anywhere between law enforcement and seeking assistance from an attorney. So a friend of mine suggested that I see a private investigator, that maybe they could work somehow to get the photos removed. And so I did that, and, you know, he was just as shocked that this was happening as I had been. And so he agreed to help me and started digging into it and reaching out to the websites. And after he was on the case for a little bit, he was able to get my pictures removed from one of the sites.

    Krys Boyd [00:36:18] One of the sites. But I guess by that point, I mean, first of all, were you having to like, track these things and interact with these photos that were obviously a source of great distress for you after they’d been shared? Did that all fall to you?

    Hollie Toups [00:36:35] Before he took my case, yes. You know, the easiest way that I can describe it is, I would wake up, fix myself coffee, and go to my computer and find where I had been posted and what was going on, and just try to deal with it. It was kind of like a game of whack-a-mole. They were just, you know, constantly coming up and going down. But when he took my case, his team graciously, you know, said, we’ll track this every day. You don’t have to worry about it. And they would just constantly look for them, reach out to websites, and try to handle that for me.

    Krys Boyd [00:37:19] How did you manage your emotions during this time? I mean, this hasn’t happened to me, but I can imagine you would feel like just walking around the grocery store, like everybody knows this thing. Everybody’s aware that this is happening to me, whether or not that’s real. It seems like it must feel that way.

    Hollie Toups [00:37:36] Yeah. You know, in the beginning, I was very fearful because it was such an invasion of privacy. And not knowing who did it, everyone was kind of under suspicion. And then, I come from a really small community, and so people were talking about it. People were talking about the website. People had seen the photos. People would come up to me, you know, like they had seen me on this website. So in their mind there was this lack of boundaries, and a lot of people assumed that I wanted to be on there and that I wanted that attention. And I think the biggest fear for me was, at the time I was working, and I was a volunteer with CASA, and I was appointed as a guardian ad litem to youth in foster care. And I was worried about their safety, because, you know, people were approaching me with no boundaries and saying these horrible things about me. I was worried about losing my job, because some of the girls that I had made contact with that were posted on this website, they had lost their jobs over it. And so there was a lot of fear and uncertainty, and just, like, you know, if this can happen and there’s no accountability for them, where do they stop?

    Krys Boyd [00:39:08] So, Hollie, you are an ardent and public supporter of a proposal called the Take It Down Act, which was introduced about a month ago by a bipartisan coalition of federal lawmakers. The act would do two essential things. What is the first component?

    Hollie Toups [00:39:25] There are a lot of laws in different states criminalizing, and also offering civil penalties for, posting nonconsensual intimate images. But one thing that this bill does is it would make more of a uniform law. And not only would it have the criminal penalties, but more importantly, the one thing that I struggled with, as I mentioned, was having the images removed, and then they continued to spread. So this law would require that the websites take down these images within 48 hours of receiving a valid request from the victim. And that would have changed so much for me, because, you know, we spent probably a good part of a year just going to all these different websites and asking to take them down. But this law also encompasses not only revenge porn, or nonconsensual intimate images, but also AI-generated images, which is now something that has started to become a problem. There are some laws with just the civil penalties, but that’s really hard for victims. It’s expensive. Sometimes going through a civil case can cause, you know, more extensive trauma. And so I think this law covers a lot for victims that some of the other laws don’t. And then it also is written in a way to criminalize, you know, knowingly publishing, while also making an attempt to not harm lawful speech.

    Krys Boyd [00:41:24] It seems like this is a technologically simple task for these websites, where these photographs and images are published without the consent of the subject. Has there been a lot of resistance from social media companies and other platforms?

    Hollie Toups [00:41:43] So in my case, yes. You know, we were not able to get a response from the host and the website owners until we actually, you know, filed in court, which is not something that everyone has access to. And then, just hearing other victims’ stories, reaching out to different platforms, you know, being posted on Snapchat and Facebook, it’s increasingly difficult to even get a response, much less for them to actually remove the photos. A lot of times there’s just no response at all. And so, yeah, surprisingly, there hasn’t been a lot of accountability for websites in responding to this and acting to take these images down. And so I think that this would hold them accountable as well. And sometimes you hear them say, well, we don’t have the tools, or they point to free speech. Well, this bill would give them less of an excuse. It’s there, you’re required, and you have 48 hours.

    Krys Boyd [00:43:00] The Take It Down Act bill has support from lawmakers in both major parties, which often bodes well for the passage of new legislation. Are you optimistic that this will make it through Congress?

    Hollie Toups [00:43:13] I am, I am hopeful, because, you know, it’s not a partisan issue, but sometimes things end up that way, so having so much support from all of the lawmakers helps. And then, you know, I was a part of helping pass a law here in Texas a decade ago, but things have changed, and we have to try and keep up with that. A federal law would help victims, you know, across the country. And I think people are starting to understand better. Back when I was advocating for the bill in Texas, it was still sort of new, and so there was a lot of education behind it. And I think people are starting to understand. You know, celebrities have been affected by this, and that has obviously brought a spotlight on it. But I think it’s a really good effort to provide protections for victims and requirements for the websites, to protect victims across the country from these bad actors.

    Krys Boyd [00:44:26] Hollie Toups is a victims’ advocate and a supporter of the federal Take It Down Act, which is before legislators as we speak. Hollie, thanks very much for making the time to talk about your experiences. I wish you well.

    Hollie Toups [00:44:40] Thank you so much.

    Krys Boyd [00:44:42] You can find us on Facebook and Instagram. Just search for KERA Think. And if you’d like to subscribe to our podcast, it’s available for free wherever you get your podcasts. You just need to search for KERA Think as well. You can find the podcast at our website, think.kera.org. Again, I’m Krys Boyd. Thanks for listening. Have a great day.