#126 Laila Mickelwait - Trafficking Hub, Censorship or Crusade
Why Pornhub Failed Victims: Trafficking Hub's Laila Mickelwait Reveals the Truth
In this episode of Chatting with Candice, Candice Horbacz dives into a crucial discussion with Laila Mickelwait, founder of Trafficking Hub and the Justice Defense Fund. They explore the disturbing issue of non-consensual and illegal content on Pornhub, including child abuse and sex trafficking. Laila recounts the origins of Trafficking Hub and the challenges of regulating adult content while protecting minors. The conversation sheds light on Pornhub's negligence over the years, its flawed business model prioritizing quantity over verification, and the industry's fear of speaking out against major players like MindGeek. They also discuss potential solutions, such as implementing paywalls to prevent underage access and the need for stringent policies across all platforms.
Timestamps:
00:00:00 - Intro
00:02:54 - Trafficking Hub Movement
00:06:25 - Mission of Trafficking Hub
00:10:02 - Pornhub vs Social media companies
00:20:02 - Taking responsibility
00:22:42 - Getting Blacklisted
00:46:39 - Stricter copyright laws
00:54:56 - Intro and exit interviews
00:59:23 - Is Trafficking Hub trying to ban porn?
01:04:30 - Ending
Check out Laila's Instagram: https://www.instagram.com/lailamickelwait/
Check out Laila's book: https://takedownbook.com/
Follow Candice Horbacz on socials: https://link.me/candicehorbacz
Support The Podcast on Patreon: http://patreon.com/candicehorbacz
0 (0s): They say it's one thing to be raped, it's another thing for that to be filmed and then globally distributed for the rest of my life. 1 (9s): What is, like, the overall mission of Trafficking 0 (11s): Hub? Trafficking Hub is solely, specifically about holding PornHub accountable. They found dozens of illegal videos on the site within minutes. She was found in 58 videos being raped and abused on the site. They were begging for those videos to be taken down. They say, even after I'm dead, I know that people will be getting money and pleasure from the worst moments of my life. 1 (39s): Hello everybody. You are listening to or watching Chatting with Candice. I'm your host, Candice Horbacz. Before we jump into this podcast, I just wanted to say thank you to Stardust for all of those cups of coffee. And if you want to donate to the podcast, you can go to chattingwithcandice.com and click that little link that says, buy me a coffee. It all goes back into the show and into production, and it is very much appreciated. This week we have a very dark topic, a very hard topic, a very important topic. This is not a trigger warning because those don't frigging work, but this is just set and setting. Make sure that there's no littles around. Make sure that you have your earbuds in. 1 (1m 19s): It is explicit, it is hard to listen to, but again, it's very important. And this guest is, I think, very often villainized and scrutinized. And I was a little bit nervous to have her on the show, because I've seen the hit pieces written on her where she's depicted as somebody who is anti-sex work, anti-porn, and has an ulterior agenda. I wanted to have her on to find out for myself. And I think that you'll be pleasantly surprised with this conversation. I think that what she's doing is very important, and she's actually creating a lot of change that is needed within the industry. And this is coming from someone who is a freedom maximalist. So know that going in. 1 (2m 1s): I think this was a really important conversation. I hope you do too. Please help me welcome Laila Mickelwait. Hello, Laila. Welcome to the podcast. I'm really excited to have you here. 0 (2m 11s): Thank you so much for having me. I'm excited to talk to you. 1 (2m 14s): Yeah, I'm sure that this conversation is probably going to ruffle a lot of feathers. I believe your coworkers are friends with Benjamin, right? Benji Nolot? 0 (2m 25s): Yeah. I've actually been with the Justice Defense Fund since very late 2020, when I launched it as my own nonprofit organization, but I did work with Benji and the team for many years prior to that with Exodus Cry. So yeah. 1 (2m 42s): Yeah, he's a wonderful, wonderful guy. I think he gets a bad rep out there. I guess let's jump into it. You founded, you're the founder of Trafficking Hub. 0 (2m 55s): Yes. The Trafficking Hub movement. Yep. To hold PornHub accountable. Okay. 1 (3m 0s): So let's get into what is Trafficking Hub first off, and then what's the mission of Trafficking Hub? 0 (3m 6s): Sure. Yes. So the story of the evolution of Trafficking Hub is told in my new book that's coming out, or it's out now, goodness gracious, as of yesterday: Takedown: Inside the Fight to Shut Down PornHub. But the evolution of the Trafficking Hub movement started unexpectedly in early 2020, really February 1st, 2020.
In the middle of the night, when I was rocking my baby who was crying at all hours, I was thinking about a story I had recently read, days after he was born: the story of a 15-year-old girl who was missing for an entire year. And she was finally found when a PornHub user tipped off her mother that she had been found in 58 videos being raped and abused on the site. 0 (3m 53s): And they were able to locate her and help her. And there was another, you know, this was at the very end of 2019, when these headlines were making news. The London Sunday Times had done an investigation into PornHub and they found dozens of illegal videos on the site within minutes, even children as young as three. And so I was up that night thinking about these stories, and a question popped into my mind, which was, how did this happen? Because this is the world's largest and most popular porn site. They have this global brand that everybody seems to trust. 0 (4m 33s): And I decided to test the upload system for myself. And I learned what millions of people already knew, because at that time they had 6.8 million uploads to the site in 2019 that had gone through the exact same process that I did. And that was that all it took was an email address to upload user-generated, homemade sex videos to the site. And because of that, the site had become infested with videos of real sexual crime. And that was just so alarming, the implications of that. I think at the time they had almost 11 million videos on the site and 40 million images. 0 (5m 17s): And just to think that all of those videos were uploaded without verifying an ID to make sure that they weren't children, or verifying a consent form to make sure they're not a rape or trafficking victim, and that the site was awash with illegal content. It felt like, I have to sound the alarm on this. This is not okay. So in a burst of inspiration I started the hashtag Trafficking Hub and, you know, a Twitter account with a few thousand followers. And people started to pay attention. And, you know, one follower turned the hashtag into a Trafficking Hub version of the PornHub logo, and that kind of got used and sent around. And another follower said, oh, you should start a petition. 0 (5m 59s): And I just did that. I wrote an op-ed that actually helped a lot, and I copied and pasted that into the petition. The petition went viral. And today, you know, on that petition we have 2.3 million signatures from every country. But that was the genesis: that night and that kind of organic, unexpected moment was what spiraled into the Trafficking Hub movement. 1 (6m 23s): So is the mission just to go after PornHub? Like, where are you with all of that? 'cause I know obviously Visa and MasterCard pulled out; now they're, I think, only taking crypto. And we can get into the evolution of how their company has changed since then, 'cause it's been a drastic change for that company. So does it end with PornHub, or what is, like, the overall mission of Trafficking Hub? 0 (6m 46s): Yeah, Trafficking Hub is solely, specifically about holding PornHub accountable for the global monetization and distribution of criminal content, including child abuse, rape, sex trafficking, and all forms of non-consensual content, image-based sexual abuse. Which, you know, many people know as revenge porn, but we don't like to call it porn, because it's not porn, it's criminal content.
So the mission of Trafficking Hub is to hold PornHub and its executives accountable, really to shut down PornHub. That was the call to action, when realizing how much of that content was non-consensual, being knowingly monetized for profit. 0 (7m 30s): And I think the details that are in the book, those are really important for people to understand why a call to shut down the site. It seems severe, right? It seems extreme to some people. But when you realize the depths, and you peel back the onion layers of complicity, knowing complicity, of PornHub and its executives in doing this and destroying and shattering the lives of countless victims, the only response that feels adequate is severe action. You know, severe harm deserves severe consequences, if we're ever to really bring true justice to victims and then be a deterrent for other abusers, so they feel like this is not the cost of doing business. 0 (8m 11s): Like, there will be a real consequence. The call to action is to shut down PornHub. But I think what's important is that it's not enough to just hold one company and its executives accountable, although it's important to bring healing to victims and closure to have accountability; we have to implement policies to make sure it doesn't happen again. So, you know, specifically for Trafficking Hub and the work that I'm doing, we're calling for mandatory third-party age and consent verification for every individual in every user-generated porn video, on every site that's distributing user-generated porn. 0 (8m 52s): So basically like 2257, but applied to modern pornography distribution. 1 (9m 0s): So are you suggesting that would just be for the people that were uploading, or also the consumers? 0 (9m 7s): So this is for people who are uploading or in the video. Okay. You know, verifying the uploader, because in 2020 people could just anonymously upload, even using a VPN. So not only just an email address and a fake username, but they could even mask their location with VPNs. So complete anonymity, and PornHub had even developed a mirror site on the dark web for further, you know, privacy and anonymity in viewing that content. But this is focused on those who are in the videos. Although, you know, there's a wave of concern right now, and legislation and action, to protect kids from accessing that content. 0 (9m 52s): And I think that is a very important initiative, 'cause I think adult content needs to be for adults and of adults. And so, 1 (10m 1s): Well, there's an interesting statistic around that, 'cause I definitely wanna get into that. Because as of now, PornHub specifically is not allowed in eight states. So it's Texas, Utah, Mississippi, Virginia, Louisiana, Arkansas, Montana, and North Carolina. Which to me is kind of silly, because with a VPN it's an easy workaround. And then we're also only targeting these really big, quote, mainstream adult companies. And there's a lot of edges to this. So one is that those, like it or not, are the safer companies out of all of the companies. If you go to these small, dodgy ones, a lot of 'em are in Florida, a lot of 'em are in these small towns, they are wildly more dangerous, unsafe, and not following protocol, paperwork, legislation, all of that. 1 (10m 49s): So you're actually punishing what are mostly the better actors of the group.
And then you're forcing a lot of that underground. So it's like you're putting the spotlight not necessarily where I think the most change could happen. And then I think it's also a false sense of protection when it comes to minors that are consuming content that is not for them. Because you think, now that it's banned in your state, well, now my kid's safe, I can give them this device. It's not; they're gonna outwit you at every single turn that they can, because they're younger and they're more privy to technology. So I don't see that as doing anything, honestly. 1 (11m 28s): It's like, parents should have the responsibility to know what their kids are doing online. You can install apps, don't give them devices that they're not ready to handle. And I think that's where it ends. Like, I'm team small government. The less regulation, the better. Less, 0 (11m 43s): Less regulation. Yeah. Yeah, I mean, there's definitely this huge debate around that user side of things. And for my part, the work that we do at the Justice Defense Fund, and Trafficking Hub as well, we're solely advocating for, and our work is around, those who are in the videos that are being consumed, versus those who are using it. But I know there's many different mechanisms for protecting kids from viewing it, and lots of debate. I mean, currently, you know, they're going to the Supreme Court to even find out, is it constitutional to have these particular protections in place for users. 0 (12m 25s): So, right. Yeah, I mean, I don't know what the best way to protect kids from viewing necessarily is, because there's layers of protection. You know, a lot of people say it's only on the parents, but then I also think about kids that don't have parents available, or, you know, parents who are just so busy putting food on the table, they don't have time to be monitoring their kids' use, or kids in foster care, like vulnerable kids who may not have that oversight. And so I guess, yeah, it's an important conversation, and I'm so open to hearing all of the different ways. 0 (13m 7s): I know people talk about device-level verification for users. 1 (13m 12s): I know, you get a flip phone, you know what I mean? Yeah, I challenge that. No parent is too busy, and if you are in a bad socioeconomic stance, why does your kid have an iPhone, you know what I mean? And then another interesting statistic: for over 80% of children, the first piece of explicit content they watch is actually not on a porn site, it's on a social media platform. Yeah. So again, when we have things that kind of come in as the solution to a problem, and I very much think it's a problem, I do not think that kids should be watching this, like, to make that obviously clear if it's not. But the focus on the problem gets supercharged, because pornography is supercharged for a lot of people; it's still very taboo. 1 (13m 53s): And you're like, well, this is the bad guy, it's only PornHub, right? Well, we're not looking at the other places that are actually the real danger for children. So I pulled up numbers for 2020, just because a lot of those were inflated because people were stuck at home for so long.
So while I obviously am on board with making sure that content that is uploaded is of consenting adults, I think that is very important. I don't think you should ever have a site where you're putting explicit content and there are no requirements. That just doesn't seem logical. But Facebook alone, and this is what they reported, so this is probably a much smaller number than what is actually on the site, was 21.4 million cases of child exploitation content. 1 (14m 37s): Instagram was 1.4 million instances, TikTok was actually super low at 22,000, and PornHub was 13,000. So when you look at this and you're like, okay, well, PornHub is the problem, we're talking 13,000 cases versus 21 million. So yeah, I guess, why isn't Facebook held 0 (15m 2s): Accountable? I think that, look, all of these platforms that are enabling or distributing this content need to be held fully accountable to the full extent of the law. A hundred percent. You know, the genesis of Trafficking Hub was a discovery of the problematic nature and the complicity of PornHub and its owners and what was going on. And I think one of the things that we uncovered in this investigation, and kind of this movement of peeling back the onion layers specifically on PornHub, was they went for over 13 years not reporting any instances, not a single instance, of child sexual abuse material that they were aware of. So there was a Canadian parliamentary hearing where, you know, under oath, the leaders of the child protection agencies in Canada and the United States, that would be the clearinghouses this content would be reported to, 0 (15m 53s): testified that PornHub hadn't made a single report to them since its genesis, until after everything started to go viral in 2020. And they were actually hiding child sexual abuse material that they were aware of from authorities for over 13 years. And, you know, for me, I think of, well, if they were reporting... and it's actually illegal. So in Canada there's mandatory reporting laws for companies that know about child sexual abuse material, and there's actually criminal penalties, including up to five years in jail, for not reporting, and they weren't reporting. 0 (16m 33s): And even here we have laws about mandatory reporting. And I think about, you know, how many children could have been rescued out of situations of abuse if they had been doing what they were supposed to do. So, as far as numbers go, I don't think we really know the scope of the child sexual abuse, but also the non-consent piece of it, where, you know, it might have been consensually recorded but uploaded non-consensually, and, you know, adult rape, trafficking, and all of that, that was also proliferated on the site. But I would never make excuses for any other company, including Twitter, who, you know, distributes unverified content. 0 (17m 17s): I think my hope is, through this effort to hold PornHub accountable, that we cause a ripple effect that impacts all internet websites, that brings safeguards, that brings accountability, and gets them to care more about what is being distributed on their sites. And so my hope is that we'll see not only PornHub changed and held accountable, but we'll see all the other sites as well. And so we'll make the internet itself a safer place, is kind of the goal at the end of the day.
1 (17m 57s): So as far as the non-reporting goes, is there a way to know that they were aware of the content that was there prior? 0 (18m 7s): Yes, yes, they were aware, because there's so many instances and evidence of even victims who were reaching out, emails where they were begging for those videos to be taken down, and they were ignored. Police were ignored. Victims, in many cases, were saying over and over again that they would be hassled to prove that they were the victim in the video. So, you know, they weren't requiring an ID and proof of consent to upload the content, but then when a victim would be begging to take it down, they would be hassled: prove that this is you, prove that it's non-consensual, prove that you're underage in the video. 0 (18m 47s): And, you know, in the meantime, 5 million users an hour had the opportunity to download that content, 'cause they actually had placed a download button on every single video. And moderators from the company came forward, and I said, why would they have a download button on the content? And they said it was because the business model was that they just needed more and more and more content to be able to drive traffic to the site, to sell 4.6 billion ad impressions on the site every day. So the more content they had, the better. And it didn't really matter that much what's in the content; they just needed URLs and tags and titles to be picked up in Google to drive traffic to the site. 0 (19m 29s): And so they wanted people to download and make compilations and re-upload, as long as it had a different title, as long as it had different tags. That was what was important, so they even had a download button to be able to kind of recirculate that content. And for victims it became a perpetual nightmare of just trying to get that content down, and then it would get put back up, and then trying to get it down again. And, you know, this was the nightmare for so many victims who came forward. 1 (20m 2s): Yeah, what's really interesting is, I remember when all of this was really picking up a lot of steam, it was all over my Twitter feed at the time. And so many people had either been reposting it and signing the petition, or had so much criticism. Because for some reason when you're in the industry, they feel, this is my guess, because the industry is constantly so attacked already from the outside, that they can't scrutinize their own, because then it gives them less credibility. But it's actually the opposite: by not calling out bad behavior, you're actually proving everyone else's bad faith in the industry itself. 1 (20m 43s): So it is your responsibility to say something if you see something that's not above board when you are within said industry. And I remember people coming to the defense, and they're like, oh, you know, 'cause these numbers are real on the other sites. Like, as far as the reporting goes, there's statistically more illegal content on Facebook than there is on these tube sites. But the difference is, it's actively reported and they're trying to do something about it, where this other company's entire business model was, you can just upload anything with no checks. I remember, I don't know if it was PornHub or if it was someone that worked there, or the sister companies, but they were saying, no, we verify all forms.
1 (21m 28s): We do the 2257s and we do an ID check. And I responded, and I was like, well, actually, that's not true, because my content is on there pirated, and myself, you didn't verify. You don't know when I shot it. You have my ID, but not for this particular thing. You don't know. So there are question marks. So if you didn't do that with me, I know you're not doing it with, like, Jill from Louisiana who has uploaded, and that there weren't safeguards there, 'cause there definitely weren't. There are now, which is good. So that's a huge step in the right direction. Where do you see the industry of tube sites going? 1 (22m 11s): Because, like, I'm for paywalls. I think that that is one way to know that the person on the other end is 18, right? You have a credit card, and you usually have to be 18 to have one, so it's like an extra barrier. It's not proof, but it's something. So I do think that there should be paywalls. I don't like to be scrolling and just see something explicit, and I come from that industry, right? On my page, like, I unfollow people that do that, 'cause I'm like, that's not it. Do you think that there should be a paywall? Like, where do you see the future? 0 (22m 44s): Yeah, well, really quick, to your previous point too, and then I'll go there as well. But, you know, one of the things that's detailed in the book is this kind of evolution of so many different people coming together, and some of the most important people that helped over the last four years were in the porn industry. Or actually, you know, porn performers who had said, like you just mentioned, that they were spending hours a day scouring MindGeek's tube sites trying to find their stolen content, and 1 (23m 15s): They don't take it down still, even with 0 (23m 17s): A takedown, right? And then it would get re-uploaded the next day. Yeah. So, like, I do tell of a conversation I had early on with a porn performer. She's like, shut them the fuck down. I'm so mad. Like, I'm so sick of MindGeek doing this. And they built their empire on stolen, pirated content. And, you know, as they were looking on YouPorn and RedTube and Tube8, 'cause you know, they own so many of these porn tube sites, but on PornHub, you know, they were finding clearly illegal content: children, rape victims, unconscious, I mean unconscious to the point where their bodies are totally limp. 0 (23m 58s): They're not pretending; they're totally limp. The perpetrators are touching their eyeballs, you know, to prove that they're unresponsive. Things like that. And, you know, sending me these links, because they're as disturbed by it as anybody else, and they don't want to be associated with it. But one thing I did notice was, you know, a lot of them were coming privately to me to help and to express support, and hundreds if not thousands of them were actually signing and sharing the Trafficking Hub petition, probably under some anonymous names as well. But they were afraid of speaking publicly against MindGeek because of their control over the industry. 0 (24m 41s): And they felt like they could be blacklisted, you know, their career and their livelihood could be destroyed if they went against the dominant company. And so I think there was a fear, but, you know, to, 1 (25m 0s): But that's been a long time. And I just saw, so much of it is in Montreal, everything is kind of connected, down to the one director that you're hiring, yada yada, but no one's saying anything.
1 (25m 40s): And finally I was like, enough is enough. Few speak up in the industry, and I still believe, if you see something, say something, even if you're scared that you're gonna lose your career when you say something. 1 (26m 28s): Hmm. 0 (26m 28s): Yeah. And there is, I don't know if you've noticed, but there's a lawsuit recently filed by a porn performer, I can't recall her name right now, but she said she was exploited, that she was forced into doing, you know, violent sex acts that she didn't consent to doing. And MindGeek was responsible. They were the ones that organized the shooting; that was their production company. And she's suing them now. I think it's, you know, tens of millions of dollars that she's suing for. But yeah. So MindGeek, you know, when all this began, really, MindGeek was a true villain to not only the victims, but also to many of those in the porn industry, who saw MindGeek as this exploitative entity that actually had, you know, destroyed the traditional porn industry and put everything out there for free, and they were reluctantly complying with the system because it was the only way to get seen. 0 (27m 31s): You know, they're like, I have to get my videos views, and because PornHub dominates views, if I don't put my content on PornHub, I'm just not gonna make it. And so they felt like they were almost forced to participate in a system that they didn't even agree with. And so there's that layer of it as well. But, you know, again, so many of the allies that helped in this whole effort were even employed by MindGeek. You know, some of the whistleblowers who came forward, now, after so many dozens of hours of conversations and even meeting in person, I would consider them friends. 0 (28m 13s): I have so much compassion for their situation. You know, one of them that I detail in the book, who came forward to help, was just a very underpaid employee in Cyprus, and he was sexually abused as a child, so it was very traumatizing for him. They had to watch; they were reprimanded if they didn't watch at least 700 videos per eight-hour shift. Some of them were watching up to 2,000 videos per shift, where they just have to, like, click through, just click through, yeah, to approve the video. So this is what we discovered, like peeling back the onion layers of MindGeek. 0 (28m 54s): And this is how, in the book, you go on this journey of uncovering and discovery with me as you meet these, you know, moderators and people who are coming forward. But he produced documents, internal documents, with the schedules, with the names of the moderators. You know, on the outside, to the public, PornHub was defending themselves saying, we have a large and extensive team of human moderators that is viewing every single video before it's uploaded to the site. Now, we'll go back to that in a second, but that wasn't even true. They had 30 moderators for all of MindGeek's tube sites. So not only PornHub, but YouPorn, RedTube, XTube, Tube8, you know, GayTube, all of the tubes, right? 0 (29m 41s): Thirty. And they worked 10 at a time on eight-hour shifts. So think about bathroom breaks, cigarette breaks. And they were tasked with reviewing these millions of videos, millions of images, and they just skipped through them. They had the sound off and they just skipped through them.
But first they would get uploaded live on the site, right? They weren't put in a queue where you'd have to wait, like, two days to be approved. No, they all went live. And then after that they would just be skipped through by these moderators. And he said, I know moderators that approved very young children on the site, because they're not even watching them. 0 (30m 23s): They're just like, they have to meet a quota. He's like, we have to meet the numbers. And then I, you know, press him, not in an aggressive way or anything, but just like, give me more, tell me how this works. And I said, well, how can you figure out who's 18 and who's not? Like, I mean, even a pediatrician can't tell who's 16 and who's 18. Think about a boy or girl; their body at 16 is almost identical to what it looks like at 18, right? So it's literally impossible to decipher. So they're guessing. So he's like, well, we would just, you know, look at if they were wearing nail polish or, you know, yeah. 0 (31m 6s): So they were trying to decide if it was too young of a person. One of the things he said was like, are they wearing nail polish, or what does the room look like? Or if they cried too much, he's like, if they cried too much, you know, or if they didn't cry enough or whatever, they wouldn't consider it rape. He said, at the end of the day, we're just guessing what is rape. But then you think about rough sex, right? Like, how can you tell the difference between rough sex and rape? And so much of it was consensually uploaded, like, consensually filmed, right? So it was just mind-blowing, right? But they knew it. 0 (31m 46s): And so, going back to Facebook: Facebook has 15,000 moderators. PornHub had 10 per shift, 30 total. And they're lying to the public saying they have a large and extensive team. But then let's go back to their statement that said that they view and approve them. Okay, well, that's worse, because there's kids that are on the site that are, like, 12 years old. I mean, there is a case in Alabama of a 12-year-old boy who was drugged, he was overpowered, he was raped in 23 videos that were being sold on the site. And think about a drugged, overpowered 12-year-old. So they're viewing and approving those videos, is what they're saying. 0 (32m 26s): There's toddlers, three-year-olds, right on the site, or unconscious women, like, clearly unconscious. Okay, well, that means that they're more complicit. That means they viewed it, they thought this should be on PornHub, they approved it, then they put ads on it, and they put ads before it and around it. And if you press pause in the middle of one of these rape videos, you get an ad. And that's not even to go into the ads. Their whole TrafficJunky system was set up where they were allowing advertisers to place targeted ads on illegal content. So I investigated that side too, but we don't have to go into that rabbit hole. 0 (33m 11s): But that was almost as bad as what was happening on PornHub, because they had this indicator bar for advertisers where they would show how much traffic you would get on a particular word that would indicate rape, like literally the word for rape in Chinese or Arabic, or 'not 18' or 'teeny,' you know, tiny.
I literally documented these, and they had the indicator bar that would show how much traffic you would get on those particular videos that were tagged or titled that way. So that's how I saw an obviously prepubescent child being raped in a video, and above it there was an ad that said, delete your history after you click here. Obviously targeting pedophiles, right? 1 (33m 56s): No way. 0 (33m 57s): So the layers of complicity, again, they just go so deep. And they had one person... so here is another thing. So MindGeek had 1,800 employees, hundreds of millions of dollars that they're making every year on this content. They only employed one person, five days a week, to review flagged videos. They had a policy where they wouldn't even put a video in the queue for review unless it had 15 flags. So that means that a victim could flag their video 15 times and it wouldn't even be put in the queue. 0 (34m 39s): And they had a backlog of 706,000 flagged videos. And the CEO of MindGeek, in emails that were uncovered, when he's talking about this policy, he calls it good and reasonable. So 1 (34m 54s): What's good and reasonable, the amount or 0 (34m 58s): His policy, the amount, yeah, like the system. They're kind of talking about the fact that it would take 15 flags before it would be reviewed, the schedule of having the one person five days a week. You know, they're discussing the backlog of 706,000 flagged videos, and he calls it, literally, quote unquote, good and reasonable. So that's just some of the examples. 1 (35m 24s): That just blows my mind. Especially when you add in the layer of traffic, like the trafficking, or, I'm sorry, the web traffic, the ads, and then the keywords. Because when you shoot for a MindGeek company, they're so strict on how you shoot the scene. You cannot have a scene where, like, you can't even pretend that you're drinking alcohol, let alone be intoxicated. You can't do rape fantasy, even though that is, like, one of the number one things that women like to consume. It's interesting, it just statistically is. You can't film that for a MindGeek company, because they don't even want a question mark around, was this or wasn't this. But yet on their number one site, they have keywords for it. That doesn't even make sense. 0 (36m 14s): Not even, and not just keywords, right? Like, I mean, they would have 1 (36m 18s): Link to a video. 0 (36m 19s): Yeah. Like, one of them is titled, I forget the exact one, but it's like 'drugged,' and this is a video of, like, an unconscious woman. Or they have, you know, the titles indicating, like, 'stop, it hurts,' 'no, don't,' you know, 'please stop.' All of this stuff is just blatantly there. And the titles, sometimes even the moderators would switch up titles, switch up tags behind the scenes. So they're part of this group of people who were working with the videos; you know, internally they weren't even called moderators. 0 (37m 4s): They obscured it with, like, 'referral source,' 'review agent,' these random titles they used. But part of their job was to actually help figure out how to optimize, so, like, search engine optimization. So the moderators themselves would say, our job wasn't to keep illegal content off the site.
It was to allow as much content as possible to go through, to be on the site, and to optimize it, to make sure it had the right titles and tags and whatnot, to be picked up in popular search results and things like that. So, I mean, it's interesting to see the difference, then, between the studio side of things, right? 0 (37m 49s): And then the tube site. But they feed on each other. So, you know, part of MindGeek's strategy was, let's own the free sites and then let's own the paid sites. And then what we'll do is we'll advertise our paid sites on the free sites. So they have, like, banners and all of this, advertising all over, so people can kind of go to the paid sites too. But that's interesting to hear about. 1 (38m 14s): Yeah, that's why I always found it so interesting, because there just couldn't be any question marks. I'm like, so why is it okay here and it's not okay here? And then why are you guys so strict with paperwork here, but then you're stealing my content and putting it over here? It never made any sense to me. So how, or does anything change now that ownership 0 (38m 36s): Has changed? So they have made changes. So I'd love to talk about what's happened but what still has yet to happen. You know, a quick recap, again, the story told in the book, this journey of all of this unfolding to the point where they deleted 80% of the site, lost the credit card companies, they lost their major advertisers, who were afraid; even, like, KY Jelly and Weedmaps wouldn't advertise on PornHub anymore. And the CEO and the COO were forced to resign. And then they sold the company to a hastily concocted private equity firm that ironically is called Ethical Capital Partners. 0 (39m 21s): So they named themselves, they created themselves, to buy PornHub. It's not like this was a private equity firm that existed previously, that had done other things. They formed themselves to buy PornHub and they said, we're gonna call ourselves Ethical Capital Partners. They renamed MindGeek; they wanted to kind of distance from the toxic image of peddlers of crime, and they called themselves Aylo. So, going back: in 2020, after all of this kind of really blew up, there was an article by the New York Times, 'The Children of Pornhub,' that highlighted so many victim stories, and the pressure was really put on them at that time. 0 (40m 3s): They were forced to take down that download button, 'cause that was just an egregious feature that they had implemented on the site. They took that off. But you can still screen record. Exactly. Yeah. Not like it matters that much. But then they started to require the verification of uploaders. So that's the change that happened at the end of 2020. And they said, we're overhauling the system, no anonymous uploaders; now you have to verify yourself to upload. But they knew it didn't solve the problem, and they pretended that they solved the problem, but it didn't. Because, for example, the 12-year-old boy I told you about: his abuser was named Rocky Shay Franklin. And we know this, we have his verification photo; he was a verified uploader who uploaded 23 rape videos of this 12-year-old child. 0 (40m 51s): And that is the case in many cases; a number of the victims are currently suing PornHub. So there's almost 300 victims in 26 lawsuits that are currently suing PornHub across the US, Canada, and the UK.
And multiple of those are class actions on behalf of tens of thousands, tens of thousands, of child victims. But in many of these cases, they were verified uploaders that actually uploaded the abuse content and the trafficking content. So, whoa, I knew that that didn't solve the problem, because they weren't verifying who's in the videos. Okay. But they went on with that until now. Like, you know, just in January of 2024, 0 (41m 31s): so just a couple months ago, after they got charged, criminally charged, by the US federal government for knowingly and intentionally profiting from the proceeds of the Girls Do Porn sex trafficking operation out of San Diego. I don't know if you've heard about them, but they were criminally charged for that. And they finally said, we're gonna start verifying the age and consent of people in new videos. But this doesn't fully take effect until September. So Ethical Capital Partners has owned PornHub for over a year, and they have been profiting this whole time on unverified content. 0 (42m 12s): Now, we estimate millions of videos have been uploaded since the end of 2020, when they purged all that content. And even today, there is clearly illegal content still on that site. There's also videos where, for example, it's, you know, a homeless woman; they'll degradingly say, this is a 'crack whore' who's, you know, doing this for $5. And, you know, she has sores on her body, she's clearly vulnerable, she's saying, like, get it over with. You can hear her, just not okay with what's happening. But you can't see her face. And so it's unverifiable, because you don't see their face. 0 (42m 56s): So how could they verify it anyway? So, all that to say, Ethical Capital Partners, to this very moment, is profiting from and globally distributing scores of videos of unverified individuals, and much of that is currently illegal. So 1 (43m 18s): I had no idea. I thought that in 2020 they were making everyone submit IDs. So I knew that they got rid of the download button, and I guess I misunderstood it, because it's just the uploader, it wasn't the actual people, the participants, in the video. Which is insane, 'cause obviously that's not gonna do much solving. So what happened to the individuals that did, for some reason, verify themselves before uploading abuse 0 (43m 44s): Content? So Rocky is in jail, like, he's in prison for 40 years, in that particular example. Yeah, 'cause they subpoenaed his verification image and they put him in jail. But the company immortalized the boy's trauma; he's suing PornHub, him and his mother are currently suing PornHub, and I hope they win. For immortalizing his trauma, you know, that's exactly what the victims say. They say, it's one thing to be raped; it's another thing for that to be filmed and then globally distributed for the rest of my life, for profit and pleasure. They say, even after I'm dead, I know that people will be getting money and pleasure from the worst moments of my life. 0 (44m 25s): And for them, often, they say it's, like, not even worth living. So, you know, some of these victims feel they can't even go to the grocery store without being so paranoid that somebody has seen their trauma. This might sound a little bit strange, but this is what I've heard from multiple victims: they're like, I'd rather have it on the dark web. Because it's not like it's either-or, right?
Like, we don't want it on the dark web, we don't want it anywhere at all. But when it's on a site like PornHub, where it's Google-searchable, you know... like, many victims, for example, had their name in the title, their school, their hometown. Whoa. Where you could put their name in there and then it would pull up their Facebook and their exploitation video on the first page. 0 (45m 12s): Yeah. And so it's like they were worried, did the person at the grocery store just watch my video? And so it's this terrifying existence of paranoia, you know, and it's just crippling and debilitating. And that's what happens. And ironically, at the same time, and I talk about this in my book, the owners of PornHub and the VPs of PornHub were using fake identities. They were hiding their names, their faces, themselves from the public for so many years. The owner of PornHub was hidden. Nobody knew who he was, the majority owner of PornHub. 0 (45m 52s): And through the last four years he was uncovered, he was named, he was located. Now he's being sued. But, for example, the VP of PornHub, he was out in the media speaking in thousands of articles, in all their PR stunts, for, like, save the whales and save the bees and donating for breast cancer research, and getting all this press around all these kinds of stunts they were doing. And he was using fake names like Corey Price, Blake White; that was who he was. He was Corey Urman. And they were hiding themselves at the same time they were exploiting, for profit, the bodies, the names, the identities of so many victims for so many years. 0 (46m 36s): So, 1 (46m 37s): So why does it seem that the same laws and regulations aren't applied to explicit content? Because this is where it all begins. Obviously there is a huge difference between a pirated video going up and someone being abused in a video that's going up. That's obvious. But I feel like I have to say that because it's the internet; there's a huge difference between them. But both are illegal, and my guess is the reason that it has gotten so rampant is because no one has given a shit about the pirating, because it's porn, and people don't wanna get involved, right? It's not worth defending for most people, because again, it's seen as lewd and taboo, and it's just, like, too dirty. 1 (47m 26s): I don't wanna look at it. But because we've allowed so much pirating to happen and we're not enforcing any kind of copyright laws or regulations, now it's so hard to tell the difference between what is just being pirated versus what is actual abuse content. And technically you do have the same laws, but who's gonna actually spend the money to sue and pursue this, especially when it's a hydra and they just keep getting uploaded anonymously on these other sites that are still co-owned? You're actually making it more dangerous for innocent people. So even though people don't wanna come to the defense of the porn stars, right, because people don't feel like that's necessary, it's actually making it worse for civilians and for victims. Because if it was mainstream media and you were to put Iron Man 3 out on YouTube, you know, ripped off, you'd get sued to death for that.
So if we demand the same level of respect and legal protection for explicit content, like it or not, you're actually creating a much safer ecosystem and preventing more victims, because you're gonna know that anything out there is uploaded legally, with consent, by a production company or an individual, whatever it may be. And then, if you have one strike against you, you can pursue that legally, right? Yeah. Whether it's pirated or it's abuse content. 0 (48m 47s): Totally. I a hundred percent agree with you. And I've noticed that people in the porn industry want age and consent verification, because they're so tired of their own pirated content. For one, because they don't want victims to be victimized on the sites that they share content on, right? They don't wanna share that. And we've seen so many say, look, I'm off PornHub, I'm canceling my account, I do not wanna have anything to do with this, because I don't want my video side by side with a video of abuse like that. And they want age and consent verification, and they've been doing it, right? 2257 is normal. This is the way it is; this is the way the porn industry works. 0 (49m 29s): For some reason we allowed these online tube sites to get around it. Although, technically, actually, they are liable, and I wish the federal government would pursue them, because when you read the law, it says anybody who produces or transfers pornographic content. So they say, oh, we're not the producers, we're just the conduit, we didn't produce it, but they're the ones profiting from it. But it does say the word 'transfers.' And they were absolutely transferring, because that download button, it wasn't like YouTube, where, you know, you can download it but you don't actually possess it on your device, it's just that you could stream it or watch it later without having to stream. No, there was actually a file that would be downloaded onto that person's device, that came directly from MindGeek servers. That, a hundred percent, is a transfer of content. 0 (50m 10s): So they should be liable for millions of 2257 violations for not verifying that content. Mm. So I would just make that argument. But aside from that, they should be held to the same standard, but for some reason it just hasn't happened. But it should. And I think that those in the industry would just be like, yes, we've been doing this forever, and it makes sense to do it, because it's gonna help us and it's gonna keep non-consensual content from being uploaded. 0 (51m 2s): I was blown away, back to the copyrighted content. So, you know, one of the things that might shock people the most in my book, and I don't wanna give away too much of it, but the former owner of PornHub, Fabian Thylmann, in the midst of this actually reached out to me saying he wanted to help. I was very suspicious of his motives, and all of that, what those ended up being, is revealed in the book. But one of the things he said was, aside from MindGeek doing the right thing and actually taking down abuse content, what victims should do is copyright their rape videos. 0 (51m 43s): And it's something I had never even thought of. Like, you're suggesting that victims copyright their abuse videos so that they can do a DMCA takedown request. 1 (51m 57s): Well, it's more than a DMCA.
So if you were to do that... so if you don't have the actual thing copyrighted, you still own it, and then you can do a DMCA and they'll take it down. And again, it'll be re-uploaded probably within a couple hours. If you have the thing copyrighted, yes, you can request it down, but there's also an automatic fine. But here's the thing: you have to prove who uploaded it, and that's where it gets dicey. And you're probably not gonna be able to find them unless you have a really stringent team. But it's automatic, it's either a $25,000 or $50,000 fine per upload 0 (52m 32s): For the person that uploads 1 (52m 34s): it, for uploading, if it's copyrighted. So it's automatic, like, there's no arguing it, because it's copyrighted. And also, if you have a watermark, that's an additional fine. 'cause, like, I'm talking to my lawyers right now, 'cause I have so much stuff out, and I'm like, I don't know what to do, because I have a team that I pay monthly to just go to these sites and try to take 'em down. They're down for a minute and they're back up, and I'm like, I don't know. He's like, well, why don't we pursue it? And I was like, what? Nothing's gonna come of it. And he's like, how about we do this: the next thing that you film, you have it copyrighted. And then he was explaining all of this, and then he's like, it's an automatic fine for, I think maybe it's both parties, I have to look into that. 1 (53m 15s): But again, like, you shouldn't have to do that, to go 0 (53m 18s): Through those hoops. You shouldn't have to. I mean, I don't even know the cost, just, you know, the process of having to go through that. I mean, it should just be, if there's a notice to remove a video... there's a lot of people who are like, we need to have laws where, you know, they have 48 hours or something. No, like, one minute is too long, because on a site that's getting 5 million visitors per hour... first of all, it has to be prevented from going up in the first place. But if it does somehow get through, it needs to be, like, five minutes. As soon as that notice comes in, it just has to get immediately taken down, and then maybe reviewed after. 0 (54m 3s): But there shouldn't be this challenge, these hoops, these legal, you know, barriers and expense, right? I mean, some people couldn't pay to have that takedown service. I know victims have complained, like, they wanna hire the same kind of people that would do something like that; it's just such a huge expense. The Girls Do Porn trafficking victims, I know, you know, in their litigation you can read how many of them had to hire services like that as well. But yeah, there shouldn't be barriers, while at the same time, you know, MindGeek is there profiting literally from every ad click, every ad impression, every view, the longer it stays on the site. 0 (54m 47s): So, but I think we're in total agreement on the need for age and consent verification. 1 (54m 55s): I think it's such an easy fix, and that's why it blows my mind that it hasn't been accepted widely, or just, like, required widely. It's the same requirements as if you're shooting for a mainstream studio. Yeah, you have all of your paperwork, you have your 2257s, you have two forms of ID.
You have an intro interview saying, I am sober, consenting, of age, this is the date, hold up a newspaper. And then you have an exit interview: was everything consensual that you just filmed? Are you consenting to releasing this film? Like, yeah. 0 (55m 30s): And where you're releasing it, right? Like 1 (55m 32s): Exactly. And, like, where, exactly. And then that is the only way that it's allowed to be uploaded. It's so simple, and it's required by the studios. Why is it not required on the tube sites? I have no idea. 0 (55m 44s): That would work. I was thinking about how you get around this problem of, like I described, there's so many homemade videos where you can't see the person's face, and so how could they verify the ID if they can't see the person's face in the video? So they either have to not allow videos that don't show an identifiable face, or do something like what you're describing, where it's the intro and outro, right? 1 (56m 11s): If you did intro and outro, you would know it's the same person. I think that would be, yeah. It's just, it's so simple, and yeah, it's a couple extra 0 (56m 19s): Steps. Well, here's the deal: it's because they don't wanna lose profit. Because, you know, one moderator, or one company insider, described it like this: they don't want any friction in uploading, because they're gonna get less videos. Feras Antoon, the CEO, in uncovered emails, when that whole thing was going down in 2020, he was complaining. He's like, we're getting 25,000 uploads a day; with this new rule, maybe we'll get 2,000. And, you know, this is the thing, is that they don't want friction. They don't want any barriers or anything that would deter uploads, because the business model, the tube site business model, relies on enormous amounts of content. 0 (57m 4s): If they don't have content... they said it this way: content is king. That's what the MindGeek employees said. On the user-generated porn sites, content is king. And that's why, when there was a child abuse video on the site, it would get reported. So first they would try to get it down and they can't get it down, right? So then it would go through the National Center for Missing and Exploited Children. They would verify that it's a child, they would go direct to MindGeek, and then demand the video be taken down. And when that happened, you know, in some instances they would take that video down. But what they would do was they would leave that black box on PornHub, and it would say, a video removed at the request of NCMEC, the National Center for Missing and Exploited Children, admitting that there was a child abuse video there. 0 (57m 57s): But they would keep the link live, the exact same link, the title, the tags, the comments, the views. And then they would have ads right around it, or in that box around it, they would have ads everywhere, so that they could still get that URL, all that content, all that inventory, to be picked up on Google to drive people to that site. They would say, oh, this child abuse video's not here, but oh, here's this one, this one, this other similar one, right? The algorithm's gonna work, and the views and the clicks and the impressions are all there to make the money. So that's an example of: they want content on the site. 0 (58m 40s): That's why they don't want verification. 1 (58m 45s): Geez. Yeah, that's shocking. You're like, well, of course, just take that page down.
Yeah, 0 (58m 51s): But they don't want to, because they need the inventory for Google. The moderator said Google is where Pornhub lives or dies. If they don't have that URL, those tags, those titles, that SEO to pick up, then they're not driving that traffic to their site. So they want it there, they need it there. It's intentional that they wanna keep it there. 1 (59m 14s): Oh boy. Yeah, it's truly evil, all of that. There's no other word for it. Now, a challenging question. Your critics, and the biggest critics of Trafficking Hub, say that there is an ulterior motive to Trafficking Hub and Exodus Cry, that it's religiously based, and that the ultimate goal is to get to the point where all pornography is banned and all sex work is made illegal. Do you have a comment or pushback on that? 1 (59m 54s): Yeah. 0 (59m 55s): I've answered that so many times over the last four years, because it's their number one attack. I talk about this in the book. There were two battles going on. One is this battle against the illegal content and the true villain that is MindGeek and its owners. But then there's this other battle for the narrative around what Trafficking Hub is, what the intentions are, the truth in the media, right? One thing that was interesting is that when I first wrote my op-ed that blew this open, I made damning accusations that were all true, accusations based on evidence. I said, this is a trafficking hub; these are mega-pimps who are distributing child sex crimes and trafficking, and so on. 0 (1h 0m 42s): And the only correction they had to the article was that they were technically based in Luxembourg for tax purposes, because they couldn't correct anything else. It was true. So this whole time, they couldn't fight this on the merits. So MindGeek hired a very powerful PR firm out of New York called 5WPR, a very big PR company owned by a man named Ronn Torossian, who has been condemned by the Public Relations Society of America as a stain on their profession, because he would do things like create false websites to disparage his rivals and engage in all kinds of smear tactics. 0 (1h 1m 27s): So this attack, this particular narrative about Trafficking Hub that was not factual, that's the genesis of where it came from: they couldn't fight it on the merits. They couldn't just come out and say, we're not actually doing that, we have all these policies, and it's not true that there are children being abused on our site. They couldn't do that. So their only recourse was to defame, to discredit, to distract, to deny. And that's what they did. This battle is detailed in my book, and it's been so hard to try to correct the record and say: no, Trafficking Hub from the moment it started was about illegal content. 0 (1h 2m 15s): It was about child sexual abuse, rape, and trafficking. And it is not just one particular individual or organization; some 600 organizations have thrown down and been involved in Trafficking Hub. It's not owned by any one of them. And I was really intentional about this.
I'm like, it needs to be a movement where everybody, whether you're in the porn industry, pro-porn, anti-porn, Muslim, Christian, atheist, whatever, can agree that nobody should be raped and trafficked on the world's largest porn site. And actually, that's what happened. Trafficking Hub emerged organically and evolved into a movement of a very diverse group of people, which was the nightmare for MindGeek, right? 0 (1h 2m 58s): Because if they can just say, these are just crazy, religious, right-wing, sex-hating prudes who wanna destroy the porn industry, don't listen to them, then they could be successful. They could just squash it, right? But the truth was, and is, that it's not that; it's so many people from many different backgrounds, survivors, people in the porn industry. And that's what's in my book. You can actually see it unfold, and you can see the truth about what it is and what it was. 0 (1h 3m 38s): There are people who are anti-porn who are part of Trafficking Hub, and there are people who are pro-porn, and everyone in between, right? 1 (1h 3m 46s): Well, I guess as far as the leadership goes, is there any official leadership position on the topic of making either of those things illegal or restricted? 0 (1h 3m 59s): No, there's not that at all. And even with my organization, the Justice Defense Fund, our mission is strictly and specifically about illegal content. I make it clear that I'm not out to get rid of the legal pornography industry. What consenting adults do with each other is not my business; it's theirs. As long as it's legal and you're not hurting another person, adults are free to do what they wanna do. This is about people who are not consenting, who are victims, who are children. That's what this has always been about, and it's what it's about today. 0 (1h 4m 39s): And everybody's welcome to participate. 1 (1h 4m 44s): Well, you are doing really powerful work, and I think the changes we've already seen with the biggest company on the planet in that industry are incredible. Hopefully the others follow suit. Can you tell the listeners how they can support your cause and where they can get your book? Plug away. 0 (1h 5m 2s): Yeah. You can follow me on social media at lailamickelwait. I'm always posting about all of this, so if you want an update on the latest, I post mostly on Twitter and Instagram. And then my organization is the Justice Defense Fund, and that's at justicedefensefund.org. You can read the book, it's called Takedown: Inside the Fight to Shut Down Pornhub for Child Abuse, Rape, and Sex Trafficking, and it's available wherever books are sold: any major online retailer like Amazon or Barnes and Noble, or you can go to your local bookstore and get it. If they don't have it, you can order it. It's on Kindle, it's on Audible. I heard the Audible for the first time yesterday, because I can't stand hearing my own voice. 0 (1h 5m 48s): But anyway, it's on Audible, it's on Kindle, and it's in hardcover as well.
And you can read that. All author royalty proceeds are being donated one hundred percent to the Justice Defense Fund, and for the first 3,000 books sold there's a donor doing a $30 matching donation to the Justice Defense Fund as well. My hope for the book is that people who read it will go on a journey of discovery with me. It's told in first person; some people are calling it a detective-thriller-style book, a page-turner. People are like, I was up all night reading it, and I'm like, that's amazing, I love that. 0 (1h 6m 29s): But my hope is that you'll go on this journey, you'll meet the victims, you'll meet the whistleblowers, you'll actually experience what I experienced. Not that I want people to be traumatized by witnessing the videos that I witnessed, but I think it's important for people to get the truth, and it's the truth on the public record. The truth, I think, is what's ultimately going to pave the way for freedom and justice for so many people. So you can get the book, and you can also join what we created: Team Takedown. Team Takedown is a commitment to become a monthly supporter for as little as $10 a month. Read the book, sign the petition, the Trafficking Hub petition. 0 (1h 7m 10s): You can still do that; people are signing it every day. And we wanna not only take down Pornhub, we wanna take down illegal content across the internet. That's our goal. So you can do that too. 1 (1h 7m 21s): Amazing, Laila. Again, thank you so much, and I'll make sure I link all of that below for everyone to check out your stuff. 0 (1h 7m 27s): Awesome. Thanks for having me on. It was so great to talk to you. 1 (1h 7m 32s): You too. And that's it for this week's episode of Chatting with Candice. Before I go, please leave that five-star review, and make sure you like and subscribe. We'll see you next week. Bye, everybody.