The Fight For Change w/ Dr. Marcel van der Watt

In this powerful episode of Next Steps Forward, host Dr. Chris Meek sits down with Dr. Marcel van der Watt, the president of the National Center on Sexual Exploitation (NCOSE), to discuss his remarkable career and the global fight against sexual exploitation. With over 22 years of experience, Dr. van der Watt brings expertise from his background as a police investigator, hostage negotiator, and criminal case consultant. Together, they dive into the devastating impact of pornography, especially on young minds, and the alarming rise of online sexual exploitation. Dr. van der Watt sheds light on how pornography is reshaping relationships, desensitizing youth, and potentially leading to more violent behaviors. He also provides insights into how easy access to these harmful materials is impacting children, and shares effective strategies for parents to protect their kids in the digital age.
The conversation also explores the important work of the NCOSE, including their efforts in policy advocacy, legal action, corporate accountability, and survivor support. Dr. van der Watt discusses the TAKE IT DOWN Act, which aims to combat online sexual exploitation, and the center’s annual Dirty Dozen list, highlighting the biggest corporate enablers of exploitation. Tune in for this eye-opening discussion on the complex and urgent issue of sexual exploitation, and learn what steps we can all take toward change.
About Dr. Marcel van der Watt: As President of the National Center on Sexual Exploitation, Marcel van der Watt (Ph.D.) brings over 22 years of expertise in combating the interconnected criminal economies of organized crime, human trafficking, and sexual exploitation. Marcel has earned global recognition for his leadership, investigative expertise, and research contributions. His extensive background as a police investigator, hostage and suicide negotiator, and criminal case consultant continues to inform and inspire his work today. Marcel has consulted on and provided expert court testimony in several sex trafficking cases, briefed members of the US Congress on issues related to online criminality, and engaged with both the Presidency and the Multi-Party Women’s Caucus in South Africa’s Parliament on matters related to the implicit harms, myths, and laws associated with the sex trade. As a keynote speaker and published author, Marcel has shared his expertise on international stages, training professionals from diverse disciplines across over 30 countries. His deep commitment to justice is rooted in his personal faith in Jesus Christ. Married to his best friend Karolien, Marcel finds great joy in seeing others thrive, pursue their dreams, and hone their talents. He is passionate about celebrating and protecting the inherent dignity and value of every individual, inspiring them to live life to its fullest potential.
00:00:00,000 --> 00:00:08,240
There are few things that make people successful.
2
00:00:08,240 --> 00:00:13,260
Taking a step forward to change their lives is one successful trait, but it takes some
3
00:00:13,260 --> 00:00:14,840
time to get there.
4
00:00:14,840 --> 00:00:18,680
How do you move forward to greet the success that awaits you?
5
00:00:18,680 --> 00:00:22,640
Welcome to Next Steps Forward with host Chris Meek.
6
00:00:22,640 --> 00:00:30,200
Each week, Chris brings on another guest who has successfully taken the next steps forward.
7
00:00:30,200 --> 00:00:32,200
Now here is Chris Meek.
8
00:00:32,200 --> 00:00:33,200
Hello.
9
00:00:33,200 --> 00:00:37,560
You've tuned in to this week's episode of Next Steps Forward, and I'm your host, Chris Meek.
10
00:00:37,560 --> 00:00:39,960
As always, it's a pleasure to have you with us.
11
00:00:39,960 --> 00:00:43,240
Next Steps Forward is committed to helping others achieve more than ever while experiencing
12
00:00:43,240 --> 00:00:45,960
greater personal empowerment and wellbeing.
13
00:00:45,960 --> 00:00:48,800
Our guest today is Dr. Marcel Van Der Watt.
14
00:00:48,800 --> 00:00:52,720
As president of the National Center on Sexual Exploitation, Marcel Van Der Watt brings more
15
00:00:52,720 --> 00:00:57,720
than 22 years of expertise in combating the interconnected criminal economies of organized
16
00:00:57,720 --> 00:01:01,640
crime, human trafficking, and sexual exploitation.
17
00:01:01,640 --> 00:01:05,280
Marcel has earned global recognition for his leadership, investigative expertise, and
18
00:01:05,280 --> 00:01:07,080
research contributions.
19
00:01:07,080 --> 00:01:11,480
His extensive background as a police investigator, hostage and suicide negotiator, and criminal
20
00:01:11,480 --> 00:01:15,320
case consultant continues to inform and inspire his work today.
21
00:01:15,320 --> 00:01:19,240
As a keynote speaker and published author, Marcel shares his expertise on international
22
00:01:19,240 --> 00:01:23,640
stages to train professionals from diverse disciplines across more than 30 countries.
23
00:01:23,640 --> 00:01:27,880
Dr. Marcel Van Der Watt, welcome to Next Steps Forward.
24
00:01:27,880 --> 00:01:28,880
Thank you, Chris.
25
00:01:28,880 --> 00:01:32,480
Thank you for having me, and hello to your viewers.
26
00:01:32,480 --> 00:01:35,880
It's an absolute pleasure and honor to have you here.
27
00:01:35,880 --> 00:01:39,520
We mentioned before the show how this is such a huge topic for me and for my listeners,
28
00:01:39,520 --> 00:01:44,600
and so thank you for what you do, and thank you for taking time to talk to us today.
29
00:01:44,600 --> 00:01:47,440
There are so many places we could start a conversation, and I'd originally intended to
30
00:01:47,440 --> 00:01:51,520
dive into the harmful effects of pornography later in our conversation, but there was a
31
00:01:51,520 --> 00:01:56,320
shocking and disturbing article titled Sex Without Women by Caitlin Flanagan in The Atlantic
32
00:01:56,320 --> 00:01:57,900
last week.
33
00:01:57,900 --> 00:02:03,040
It begins with the question, quote, what happens when men prefer porn, end quote, and discusses
34
00:02:03,040 --> 00:02:06,840
how the overwhelming prevalence of online pornography has changed relationships between
35
00:02:06,840 --> 00:02:08,480
young men and young women.
36
00:02:08,920 --> 00:02:14,680
Flanagan writes, watching online porn has become most adolescents' first sexual experience.
37
00:02:14,680 --> 00:02:19,040
The average 14-year-old boy today has seen more hardcore porn than all the American fighting
38
00:02:19,040 --> 00:02:21,600
forces in the Second World War, end quote.
39
00:02:21,600 --> 00:02:26,200
And then she goes on, because of the internet's power to desensitize people and wear down
40
00:02:26,200 --> 00:02:30,800
their natural responses to shocking things, and because of the way these algorithms work,
41
00:02:30,800 --> 00:02:34,640
young people quickly proceed to more and more extreme videos.
42
00:02:34,640 --> 00:02:38,760
And as it has always been, these earliest experiences of sexual events pass deeply into
43
00:02:38,760 --> 00:02:42,080
their sense of what sex should be, end quote.
44
00:02:42,080 --> 00:02:45,280
Can you pick up that thread and take us further into that question of what happens when men
45
00:02:45,280 --> 00:02:49,160
prefer porn, and what's going on with teenagers and porn?
46
00:02:49,160 --> 00:02:57,760
Well I think, yeah, it's a sobering statement, and just the vignette that you used there,
47
00:02:57,760 --> 00:02:58,760
Chris.
48
00:02:58,760 --> 00:03:02,600
But I mean, as a parent, obviously it's completely understandable to feel alarmed, you know,
49
00:03:02,600 --> 00:03:09,040
because kids easily stumble into the world of online pornography, and as you rightfully
50
00:03:09,040 --> 00:03:15,920
say, a lot of these exposures begin at a very, very young age.
51
00:03:15,920 --> 00:03:23,960
And you know, the research is also very clear about the risks and the harms of pornography,
52
00:03:23,960 --> 00:03:31,760
and it's never just a benign viewing of obscene content.
53
00:03:31,920 --> 00:03:39,800
But there's very real impact on the brain, and the way we men view the world, and obviously
54
00:03:39,800 --> 00:03:42,280
there's a separate conversation about women as well.
55
00:03:42,280 --> 00:03:49,600
There's also new evidence that suggests that there's increasing pornography consumption
56
00:03:49,600 --> 00:03:55,640
among women as well, which is also something that we, you know, you could probably draw
57
00:03:55,640 --> 00:04:00,720
a correlation or nexus with an increasing number of women.
58
00:04:00,720 --> 00:04:05,280
We've seen recently some of these female teachers also being, you know, apprehended
59
00:04:05,280 --> 00:04:12,000
for sexual abuse complaints against, you know, scholars, so there's something to unpack just
60
00:04:12,000 --> 00:04:17,000
on that issue when it comes to women and pornography consumption.
61
00:04:17,000 --> 00:04:23,200
But just back to your question, very real world effects, Chris, and it never stays there.
62
00:04:23,200 --> 00:04:29,880
There's also evidence suggesting and showing how people act out on that, you know, behaviorally,
63
00:04:29,880 --> 00:04:35,320
the correlation with sexual aggression, and also the content of the pornography and the
64
00:04:35,320 --> 00:04:42,520
different themes that we see in pornography are really nauseating and extremely problematic.
65
00:04:42,520 --> 00:04:48,920
And you know, now we've read a lot over the last two, three years about Pornhub and, you
66
00:04:48,920 --> 00:04:56,280
know, the realities and the crimes associated with this massive pornography host, just the
67
00:04:56,280 --> 00:05:04,760
criminal content on there from child sexual abuse material, bestiality, rape videos.
68
00:05:04,760 --> 00:05:11,520
So you know, it's tragic, but also people viewing that, it's not benign.
69
00:05:11,520 --> 00:05:16,960
And a lot of that is internalized, and there's enough research showing how people act out
70
00:05:16,960 --> 00:05:19,780
and also the link with sexual aggression.
71
00:05:19,780 --> 00:05:22,280
So deeply problematic, Chris.
72
00:05:22,280 --> 00:05:26,320
Well, and this article I just quoted really hit me because, you know, my wife and I have
73
00:05:26,320 --> 00:05:30,040
a son who literally just turned 13 a week ago.
74
00:05:30,040 --> 00:05:32,520
And so he's right there at that age.
75
00:05:32,520 --> 00:05:37,160
And then you touched on how some female teachers have been getting arrested and caught
76
00:05:37,160 --> 00:05:39,960
and indicted for things that they've done with students.
77
00:05:39,960 --> 00:05:44,160
Just over a year ago, there was a big FBI sting in South Florida called March Sadness,
78
00:05:44,160 --> 00:05:47,120
timed with March Madness for men's basketball.
79
00:05:47,120 --> 00:05:54,480
33 people were arrested, teachers, high school coaches, members of clergy.
80
00:05:54,480 --> 00:05:59,440
People who are in your community, people you trust your children with every day.
81
00:05:59,440 --> 00:06:01,680
And it shows that it can happen in your community.
82
00:06:01,680 --> 00:06:05,440
And so I think just having this conversation, this dialogue, I want people to open their
83
00:06:05,440 --> 00:06:09,400
eyes, open their ears, and to realize it is happening in your front and backyard.
84
00:06:09,400 --> 00:06:12,600
No one is, you know, going to avoid this.
85
00:06:12,600 --> 00:06:13,600
Yeah.
86
00:06:13,680 --> 00:06:17,360
You know, coming back to children, I mean, children are, you know, digital natives, you
87
00:06:17,360 --> 00:06:22,640
know, surrounded by devices, smartphones, tablets, laptops, gaming consoles, and smart
88
00:06:22,640 --> 00:06:23,640
TVs.
89
00:06:23,640 --> 00:06:29,120
And, you know, those, you know, those tools, I mean, they are wonderful and just, you know,
90
00:06:29,120 --> 00:06:34,280
innovation, human innovation, ingenuity, which allows us to create these things.
91
00:06:34,280 --> 00:06:39,680
But they are also, you know, they also do act as gateways to, you know, to the Internet.
92
00:06:39,760 --> 00:06:46,120
You know, we published a one-page research document about two or three years ago where
93
00:06:46,120 --> 00:06:54,160
we talk about how the most dangerous playground nowadays is literally in the pockets of children.
94
00:06:54,160 --> 00:06:59,280
And you know, hardcore pornography really is readily available, it's frequently violent,
95
00:06:59,280 --> 00:07:01,800
and it's poorly controlled.
96
00:07:01,800 --> 00:07:06,360
And I mean, you know, we all know if we think about the formative years and, you know, just
97
00:07:06,400 --> 00:07:13,160
the different developmental stages of children and how the brain develops, you know, the
98
00:07:13,160 --> 00:07:15,840
impact is significant.
99
00:07:15,840 --> 00:07:20,680
So you know, I think we should all be far, far more serious and indignant about these
100
00:07:20,680 --> 00:07:21,680
things.
101
00:07:21,680 --> 00:07:28,080
So these are very, very real existential risks and harms and threats that's, yeah, that's
102
00:07:28,080 --> 00:07:30,280
literally in our pockets.
103
00:07:30,280 --> 00:07:34,640
You know, again, as a parent of a teenager, are there parental controls that we can put
104
00:07:34,640 --> 00:07:35,640
on their devices?
105
00:07:35,920 --> 00:07:39,880
I know I Googled, you know, how to put on parental controls when my son got an iPhone,
106
00:07:39,880 --> 00:07:40,880
but they're smarter than us.
107
00:07:40,880 --> 00:07:41,880
We know that.
108
00:07:41,880 --> 00:07:42,880
They find their workarounds.
109
00:07:42,880 --> 00:07:46,320
Are there things that we can download or have our phones control theirs?
110
00:07:46,320 --> 00:07:53,600
Yeah, Chris, you know, to your point, it's, you know, there's several tools and applications
111
00:07:53,600 --> 00:07:57,360
out there that one can use, that are downloadable.
112
00:07:57,360 --> 00:08:03,640
Some of these are free, and some of them come at a cost.
113
00:08:03,640 --> 00:08:09,200
But the problem here is, unfortunately, this is where we should be pushing back.
114
00:08:09,200 --> 00:08:13,720
You know, this should not be primarily the parents' responsibility.
115
00:08:13,720 --> 00:08:22,280
To your point, you know, children grow up in this whole new world, and they are,
116
00:08:22,280 --> 00:08:26,560
you know, more often than not, they are more tech-savvy than parents.
117
00:08:26,560 --> 00:08:32,480
And often, you know, at the device and app level, sometimes
118
00:08:32,840 --> 00:08:37,880
they make it so difficult and so cumbersome for parents to navigate.
119
00:08:37,880 --> 00:08:42,520
I mean, there was once about two years ago, we did a little internal study, and we kind
120
00:08:42,520 --> 00:08:50,560
of counted about 14 or 15 steps that a parent had to take in order to make a device safe
121
00:08:50,560 --> 00:08:51,560
for the children.
122
00:08:51,560 --> 00:08:59,280
So parents feel overwhelmed, you know, and anxious just at the thought of their
123
00:08:59,280 --> 00:09:04,880
children, even though a child may not yet have a phone, but being among their peers
124
00:09:04,880 --> 00:09:07,720
where phones are so readily accessible.
125
00:09:07,720 --> 00:09:08,720
So it is a problem.
126
00:09:08,720 --> 00:09:14,560
But again, you know, this is again where we need to start holding accountable, you know,
127
00:09:14,560 --> 00:09:18,520
big tech, the technology companies out there.
128
00:09:18,520 --> 00:09:24,040
So that's the fight that we are in, but you're right, I mean, again, parents feeling
129
00:09:24,040 --> 00:09:25,600
overwhelmed out there.
130
00:09:26,120 --> 00:09:30,200
Well, parents also need to realize, you know, Johnny and Susie, just because they're cute
131
00:09:30,200 --> 00:09:32,440
little Johnny and Susie, they might be looking for this stuff too.
132
00:09:32,440 --> 00:09:35,520
And so they can't just assume, not my kid, somebody else's.
133
00:09:35,520 --> 00:09:36,520
Absolutely.
134
00:09:36,520 --> 00:09:41,440
And so as great as someone's kid can be, they can also be surfing for different things.
135
00:09:41,440 --> 00:09:42,440
Yeah.
136
00:09:42,440 --> 00:09:43,440
That's very true.
137
00:09:43,440 --> 00:09:47,080
I know you work on the international level, but you mentioned Pornhub specifically and
138
00:09:47,080 --> 00:09:48,560
some other sites like that.
139
00:09:48,560 --> 00:09:53,160
I know there are different state laws where you would need to register your age and I
140
00:09:53,160 --> 00:09:56,600
think sign up with your name or email or something like that, you know, to become verified
141
00:09:56,600 --> 00:09:58,960
if you're approved that you're 18.
142
00:09:58,960 --> 00:10:01,720
Are you working with states on that or should there be a federal policy for that?
143
00:10:01,720 --> 00:10:03,280
And again, this is just your opinion.
144
00:10:03,280 --> 00:10:04,280
Sure.
145
00:10:04,280 --> 00:10:12,400
I mean, you know, again, when we're talking about the issue of age verification, you know,
146
00:10:12,400 --> 00:10:20,360
on January 15th, you know, we acted as an amicus in a
147
00:10:20,360 --> 00:10:24,080
federal matter for age verification.
148
00:10:24,080 --> 00:10:30,600
So I mean, ultimately, you know, this should be something we address on a, you know, on
149
00:10:30,600 --> 00:10:31,600
a national level.
150
00:10:31,600 --> 00:10:36,880
But there's good things that are happening at state level.
151
00:10:36,880 --> 00:10:44,480
And you know, the current court challenge, the Supreme Court challenge, where we are,
152
00:10:44,480 --> 00:10:49,800
one of the parties that, you know, submitted the amicus brief, was in response to the Texas age verification
153
00:10:49,800 --> 00:10:50,800
law.
154
00:10:50,800 --> 00:10:54,200
So, you know, so it's common sense.
155
00:10:54,200 --> 00:10:56,840
Age verification should be common sense.
156
00:10:56,840 --> 00:11:01,920
And I think a lot of these big tech companies are pushing back and they are using all of
157
00:11:01,920 --> 00:11:06,960
these, you know, theoretical arguments, really theoretical arguments, about
158
00:11:06,960 --> 00:11:07,960
free speech.
159
00:11:07,960 --> 00:11:11,920
At the end of the day, when you create something, we think about a vehicle, a motor vehicle,
160
00:11:11,920 --> 00:11:12,920
a car.
161
00:11:12,920 --> 00:11:20,840
You know, if you know that that car is able to do harm, whether wittingly or unwittingly on
162
00:11:20,840 --> 00:11:26,640
the side of the driver or the engine, where some malfunction can occur, you know, that's
163
00:11:26,640 --> 00:11:27,840
why there are brakes.
164
00:11:27,840 --> 00:11:29,680
That's why there are safety belts.
165
00:11:29,680 --> 00:11:36,560
That's why there are these rigorous, rigorous tests and standards that any product for that
166
00:11:36,560 --> 00:11:41,080
matter goes through in order to make sure that the end user is safe.
167
00:11:41,240 --> 00:11:46,280
And really, age verification is just one of those, you know, and there's so many examples
168
00:11:46,280 --> 00:11:53,120
of the fact that it can become a reality and be non-invasive and not infringe on any of
169
00:11:53,120 --> 00:11:54,120
these rights.
170
00:11:54,120 --> 00:11:59,720
So a lot of these arguments are really purely distractions.
171
00:11:59,720 --> 00:12:07,600
And it's just a way to try and circumvent measures in place to avoid accountability
172
00:12:07,600 --> 00:12:13,880
because ultimately we know that we will succeed on the issue of age verification.
173
00:12:13,880 --> 00:12:18,400
And when that happens, technology companies, developers will be held accountable.
174
00:12:18,400 --> 00:12:22,600
And I think that is the long game that we are in.
175
00:12:22,600 --> 00:12:29,120
And, you know, those opposing these laws, you know, they also know that ultimately accountability
176
00:12:29,120 --> 00:12:35,360
is going to come and it is going to impact the bottom dollar.
177
00:12:35,360 --> 00:12:36,360
So yeah.
178
00:12:37,120 --> 00:12:40,640
And to your point about the technology companies and their algorithms, you know, it's baseball
179
00:12:40,640 --> 00:12:44,800
season here in the Northeast and my son's starting baseball and a few weeks ago we got
180
00:12:44,800 --> 00:12:46,560
him a new baseball bat.
181
00:12:46,560 --> 00:12:51,360
And the next day I logged into wallstreetjournal.com and the top banner was the actual baseball
182
00:12:51,360 --> 00:12:54,400
bat that I bought the day before.
183
00:12:54,400 --> 00:12:55,720
It's crazy, Chris.
184
00:12:55,720 --> 00:12:56,720
And you know what?
185
00:12:56,720 --> 00:13:00,320
I'm going to use one example, a similar example.
186
00:13:00,320 --> 00:13:07,160
About three months ago, I spoke to a family member in South Africa and we spoke about
187
00:13:07,160 --> 00:13:08,520
some random thing.
188
00:13:08,520 --> 00:13:15,680
The issue of a lack of sleep came up and we just had a conversation for about seven minutes.
189
00:13:15,680 --> 00:13:17,480
That was a Saturday morning.
190
00:13:17,480 --> 00:13:18,800
I was walking my dog.
191
00:13:18,800 --> 00:13:26,760
I went up the stairs into the condo and it was about 11 o'clock EST translated to South
192
00:13:26,760 --> 00:13:27,760
Africa.
193
00:13:27,760 --> 00:13:32,400
And it was about 5 p.m. in South African time and there was going to be rugby on.
194
00:13:32,400 --> 00:13:36,280
I was switching to the rugby through a YouTube channel. Exactly.
195
00:13:36,280 --> 00:13:37,280
That's what I got fed.
196
00:13:37,280 --> 00:13:41,240
You know, are you having problems sleeping, you know?
197
00:13:41,240 --> 00:13:47,440
Sleeping medication and doctors and prescriptions and all of these things coming in.
198
00:13:47,440 --> 00:13:49,000
So it's incredible.
199
00:13:49,000 --> 00:13:55,960
It's incredible how, you know, the landscape in which we are and it's deeply, deeply concerning
200
00:13:55,960 --> 00:13:56,960
and problematic.
201
00:13:57,160 --> 00:13:59,160
And no one's immune to it.
202
00:13:59,160 --> 00:14:00,160
Yeah.
203
00:14:00,160 --> 00:14:05,840
Back to Flanagan's article, she also writes, quote, you can't spend 15 minutes scrolling
204
00:14:05,840 --> 00:14:09,600
through a porn site without coming across a video in which a woman seems to be not performing
205
00:14:09,600 --> 00:14:13,280
fear or pain, but actually experiencing those things.
206
00:14:13,280 --> 00:14:16,920
If you're one of those people who enjoy watching coarse sex, you'll never be bored for a second
207
00:14:16,920 --> 00:14:18,320
of your life.
208
00:14:18,320 --> 00:14:23,040
As far as the moral equations of watching porn go, the one that matters is are you excited
209
00:14:23,040 --> 00:14:27,160
by the obvious abuse of women or have you learned to countenance that abuse as a necessary
210
00:14:27,160 --> 00:14:32,640
cost of your own pleasure and which of those is worse, end quote.
211
00:14:32,640 --> 00:14:36,360
Would you speak to the crux of that question and the minds of males who consume that material?
212
00:14:36,360 --> 00:14:39,360
And I say males, you know, men and boys as well.
213
00:14:39,360 --> 00:14:40,360
Yeah.
214
00:14:40,360 --> 00:14:41,360
Yeah.
215
00:14:41,360 --> 00:14:42,640
Well, it also comes down to a few things.
216
00:14:42,640 --> 00:14:49,600
I mean, one of those is habituation, you know, and, you know, again, there's
217
00:14:49,600 --> 00:14:51,760
plenty of research on this.
218
00:14:51,760 --> 00:14:59,200
You know, it starts off with, you know, there is this word, we don't often use it at NCOSE,
219
00:14:59,200 --> 00:15:03,000
but, you know, it's soft porn or what they call vanilla porn, which, you know, barely
220
00:15:03,000 --> 00:15:13,880
shows any nudity and the more consumption there is, the more, you know, the more explicit
221
00:15:13,880 --> 00:15:22,000
the material becomes and, you know, the more arousal, you know, happens in tandem with
222
00:15:22,000 --> 00:15:28,320
how explicit and violent the material becomes.
223
00:15:28,320 --> 00:15:32,840
And the brain, you know, obviously has these tendencies, habituation happens, and we also know about
224
00:15:32,840 --> 00:15:41,720
neuroplasticity and how the brain can be altered by things like pornography and what we view.
225
00:15:41,720 --> 00:15:47,800
And yeah, to your point, at the end of the day, there really is a demand for these
226
00:15:47,800 --> 00:15:50,600
kinds of material.
227
00:15:50,600 --> 00:15:58,400
And again, it never stops there, you know, you actually have people being aroused and
228
00:15:58,400 --> 00:16:04,520
this is very disturbing, but by rape scenes, you know, by things like urophilia and coprophilia
229
00:16:04,520 --> 00:16:12,400
where there are these extreme and obscene, you know, images and content being played
230
00:16:12,400 --> 00:16:16,240
out, even bestiality.
231
00:16:16,240 --> 00:16:18,960
And there's also obviously the whole thing with sexual aggression.
232
00:16:18,960 --> 00:16:27,000
So it's deeply troubling and that's obviously where, you know, problematic pornography use
233
00:16:27,000 --> 00:16:33,440
comes in, where there is this almost this, you know, this physical need that is being
234
00:16:33,440 --> 00:16:37,080
developed in order to watch this material.
235
00:16:37,080 --> 00:16:43,560
And again, it never stays there, there's always some form of acting out that does happen.
236
00:16:43,560 --> 00:16:47,160
So this is not benign.
237
00:16:47,160 --> 00:16:49,560
You know, marijuana has been described as a gateway drug.
238
00:16:49,560 --> 00:16:52,960
You know, we've been talking a little bit about sex offenders.
239
00:16:52,960 --> 00:16:56,440
Is there any evidence that pornography is a gateway drug to acting out antisocial or
240
00:16:56,440 --> 00:16:57,440
violent behavior?
241
00:16:57,440 --> 00:17:08,160
Well, there really is, there is really a lot of evidence that points to that, Chris.
242
00:17:08,160 --> 00:17:15,320
You know, we've, I mean, there's just one example entitled Male Peer Support and Sexual
243
00:17:15,320 --> 00:17:20,760
Assault: The Relationship Between High-Profile High School Sports Participation and Sexual
244
00:17:20,760 --> 00:17:22,880
Predatory Behavior.
245
00:17:22,880 --> 00:17:27,720
And just one of the things that they found in that study was, you know, men who more
246
00:17:27,720 --> 00:17:34,360
frequently consume pornography more readily expressed a greater desire to act out the
247
00:17:34,360 --> 00:17:41,120
sexual fantasies involving coerced, intoxicated, or forced sex and sexual assault,
248
00:17:41,120 --> 00:17:47,760
provided, obviously, that they were assured they would not be caught. You know, that's one of the
249
00:17:47,760 --> 00:17:49,080
studies.
250
00:17:49,080 --> 00:17:53,440
And then there's also, you know, I'm just giving you a little bit of a heads up here,
251
00:17:53,440 --> 00:18:01,680
but we've just concluded a study on five ways that pornography and the pornography
252
00:18:01,680 --> 00:18:05,000
industry contributes to sex trafficking.
253
00:18:05,000 --> 00:18:07,220
And I'm just going to list five of them for you.
254
00:18:07,220 --> 00:18:08,220
And I've seen this.
255
00:18:08,220 --> 00:18:09,600
I'll just quickly qualify this, Chris.
256
00:18:09,600 --> 00:18:16,120
I've done, you know, I've infiltrated criminal networks in my work as a former law enforcement
257
00:18:16,120 --> 00:18:22,600
officer and a lot of the work in one way, shape or form, it would involve a brothel,
258
00:18:22,600 --> 00:18:31,200
you know, and I cannot recall ever going into a brothel, you know, as a as a law enforcement
259
00:18:31,200 --> 00:18:39,680
officer where there was not a constant stream of pornography playing either in the reception
260
00:18:39,680 --> 00:18:43,960
area, the lobby area of the brothel or in the different rooms.
261
00:18:43,960 --> 00:18:50,920
So pornography is a consistent and a coherent thread in all cases related to brothels that
262
00:18:50,920 --> 00:18:55,840
I've investigated and responded to, but also sex trafficking, you know, and just some of
263
00:18:55,840 --> 00:19:01,680
the points that we've documented, and we'll release this soon as part
264
00:19:01,680 --> 00:19:02,760
of a fact sheet.
265
00:19:02,760 --> 00:19:07,840
But the five are: pornography is used to groom young victims of sex trafficking who
266
00:19:07,840 --> 00:19:09,680
lack sexual experience.
267
00:19:09,680 --> 00:19:10,680
That's the first one.
268
00:19:10,680 --> 00:19:15,400
The second one is that women, men, girls, and boys are sex trafficked for the production
269
00:19:15,400 --> 00:19:16,400
of pornography.
270
00:19:16,400 --> 00:19:23,880
Again, pornography is kind of in the middle of this entire ecosystem of abuse and sex trafficking.
271
00:19:23,880 --> 00:19:29,200
The third one, pornography is used as advertising for those sex trafficked within prostitution
272
00:19:29,200 --> 00:19:30,200
marketplaces.
273
00:19:30,200 --> 00:19:32,800
And then these two are very interesting.
274
00:19:32,800 --> 00:19:38,920
Pornography fuels the demand for prostitution and as a result, it fuels sex trafficking.
275
00:19:38,920 --> 00:19:44,160
And research has also shown that men who are frequent pornography users are
276
00:19:44,160 --> 00:19:49,000
three times more likely to purchase sex than other men.
277
00:19:49,000 --> 00:19:54,480
And then the fifth one, some men who purchase sexual access to people in the sex trade use
278
00:19:54,480 --> 00:20:04,400
pornography to build sexual excitement in advance and may act out what they see in pornography.
279
00:20:04,400 --> 00:20:12,440
And again, I've seen this in my work where, you know, where men would enter into brothels
280
00:20:12,440 --> 00:20:19,380
and they would inquire whether they are, you know, sometimes they ask very explicitly for
281
00:20:19,380 --> 00:20:26,520
somebody that's of a younger age, which inevitably denotes a child and somebody who's willing
282
00:20:26,520 --> 00:20:32,600
to do some extreme things, you know, and some of those I've mentioned, you know, are things
283
00:20:32,640 --> 00:20:38,880
like coprophilia, urophilia, and these are things they see in pornography, you know,
284
00:20:38,880 --> 00:20:44,760
where they are sexually aroused by being able to urinate on another person or able to defecate
285
00:20:44,760 --> 00:20:46,440
on another person.
286
00:20:46,440 --> 00:20:51,280
These are extreme things, but nobody is born with that kind of pathology and a lot of this
287
00:20:51,280 --> 00:20:57,120
is absolutely fueled and amplified by pornography.
288
00:20:57,120 --> 00:21:05,760
And some of these themes, when we think about strangulation and even race-related themes,
289
00:21:05,760 --> 00:21:10,200
I mean, serial rape, I mean, one of these things in, you know, I've investigated cases
290
00:21:10,200 --> 00:21:16,200
of serial rape, you know, when you have the same offender, well, you don't know yet it's
291
00:21:16,200 --> 00:21:21,200
the same offender, but we talk about modus operandi and we talk about a specific signature
292
00:21:21,200 --> 00:21:24,640
and a signature is that one thing that just literally stands out.
293
00:21:24,640 --> 00:21:30,680
And I remember this one matter that I was involved in is that the person would always
294
00:21:30,680 --> 00:21:35,800
ask, he would walk past a woman and would ask, you know, do you have $2 that you can
295
00:21:35,800 --> 00:21:37,160
borrow me?
296
00:21:37,160 --> 00:21:41,600
And then that would translate into a rape, there's always a strangulation, and there's
297
00:21:41,600 --> 00:21:48,760
always some reference to asking the person to be positioned in a specific way.
298
00:21:48,760 --> 00:21:51,200
And pornography usually fuels these things.
299
00:21:51,200 --> 00:21:54,560
Nobody's born with these ideas or concepts.
300
00:21:55,480 --> 00:22:00,920
So, yeah, it's pervasive and it has very, very real world impact on sexual abuse and
301
00:22:00,920 --> 00:22:04,760
exploitation, Chris.
302
00:22:04,760 --> 00:22:07,520
It's just sickening to listen to all this, and we've got another half an hour of the
303
00:22:07,520 --> 00:22:11,080
show left, so I'm hoping I can make it.
304
00:22:11,080 --> 00:22:15,320
The National Center on Sexual Exploitation has been around for a long, long time, but
305
00:22:15,320 --> 00:22:18,520
many people, and especially people in our audience who live in other countries, may
306
00:22:18,520 --> 00:22:19,520
not be aware of the center.
307
00:22:19,520 --> 00:22:21,640
Would you share its history with us?
308
00:22:21,640 --> 00:22:23,080
Sure, yeah.
309
00:22:23,120 --> 00:22:31,800
So, you know, I think it was in the fall of 1962, an anonymous individual placed sadomasochistic
310
00:22:31,800 --> 00:22:37,160
material on the corner, I think it was on the outside of St. Ignatius School on Manhattan's
311
00:22:37,160 --> 00:22:38,800
East Side.
312
00:22:38,800 --> 00:22:44,600
And what happened then, a 10-year-old child found this material and showed it to other
313
00:22:44,600 --> 00:22:46,200
school children.
314
00:22:46,200 --> 00:22:51,400
So a mother confiscated this material and she brought it to a meeting of mothers, and
315
00:22:51,440 --> 00:22:58,360
as a result, one of NCOSE's, you know, one of our first heroes here, Reverend
316
00:22:58,360 --> 00:23:04,480
Morton Hill, was asked by his superior to look into the matter, to see how widespread
317
00:23:04,480 --> 00:23:10,680
this issue of pornography is, and he then investigated and recruited other religious
318
00:23:10,680 --> 00:23:16,120
leaders to look into the issue of pornography.
319
00:23:16,120 --> 00:23:24,160
So over the years, you know, it started off initially known as Operation Yorkville in
320
00:23:24,160 --> 00:23:29,440
New York, and then, you know, the name changed to Morality in Media.
321
00:23:29,440 --> 00:23:35,560
And over the years, you know, the focus, at its
322
00:23:35,560 --> 00:23:38,080
core, was the issue of pornography.
323
00:23:38,080 --> 00:23:45,680
But I think our former leaders and members of NCOSE soon learned that pornography is
324
00:23:45,680 --> 00:23:51,020
just one cog in a web of sexual exploitation.
325
00:23:51,020 --> 00:23:58,160
So over these past six decades, you know, we've learned that none of these forms of sexual
326
00:23:58,160 --> 00:24:03,240
abuse or exploitation ever exists in its own private universe.
327
00:24:03,240 --> 00:24:09,000
Pornography doesn't, child-on-child sexual abuse doesn't, sex trafficking doesn't, prostitution
328
00:24:09,000 --> 00:24:10,000
doesn't.
329
00:24:10,000 --> 00:24:15,120
All of these in one way, shape, or form overlap and inform the other.
330
00:24:15,120 --> 00:24:20,800
So over a period of six decades plus, you know, that's been our mission: to unravel
331
00:24:20,800 --> 00:24:25,560
and, you know, expose this web of sexual exploitation.
332
00:24:25,560 --> 00:24:28,720
And this all started in 1962.
333
00:24:28,720 --> 00:24:35,920
And then we became the National Center on Sexual Exploitation in about 2015, and made
334
00:24:35,920 --> 00:24:41,440
it very, very clear that our focus is the web of sexual exploitation.
335
00:24:41,560 --> 00:24:46,040
We have a variety of tactics in which we address this issue.
336
00:24:46,040 --> 00:24:52,560
And that has really been informed by decades of research experience, and also grappling
337
00:24:52,560 --> 00:24:58,040
with the mercurial nature of these issues, and obviously the overlap of modern society
338
00:24:58,040 --> 00:24:59,800
and technology.
339
00:24:59,800 --> 00:25:04,280
So those were the humble beginnings, Chris.
340
00:25:04,280 --> 00:25:07,640
And how has sexual exploitation evolved over those six decades?
341
00:25:07,640 --> 00:25:11,360
And how has the center evolved accordingly to take on the sex industry?
342
00:25:11,360 --> 00:25:12,360
Great question.
343
00:25:12,360 --> 00:25:13,360
So how has it evolved?
344
00:25:13,360 --> 00:25:20,040
I think what we are seeing now is I often speak about the, you know, the globalization
345
00:25:20,040 --> 00:25:21,880
of indifference.
346
00:25:21,880 --> 00:25:28,320
And unfortunately, you know, there's definitely been this surge in indifference, I would argue,
347
00:25:28,320 --> 00:25:29,320
and we see that.
348
00:25:29,320 --> 00:25:34,640
You could definitely draw these inferences from research as well, when online pornography
349
00:25:34,640 --> 00:25:38,440
exploded in the early 2000s.
350
00:25:39,440 --> 00:25:46,120
Where, again, when we talk about how people act out sexual objectification, we often see
351
00:25:46,120 --> 00:25:54,120
this in the media, we see this in modern culture, lyrics in music.
352
00:25:54,120 --> 00:26:04,040
So what we began to see is just this fusion of all things sexual objectification, obscene,
353
00:26:04,040 --> 00:26:08,240
and indifference, you know, because again, to our earlier conversation about what pornography
354
00:26:09,040 --> 00:26:15,520
does to the brain and how that informs real world behaviors.
355
00:26:15,520 --> 00:26:24,880
And that just, you know, that just creates this conviction that we need to have a multi-pronged
356
00:26:24,880 --> 00:26:31,440
approach to do this, Chris, you know, so we combat the intersection of sexual exploitation
357
00:26:31,440 --> 00:26:33,520
through corporate advocacy.
358
00:26:33,520 --> 00:26:40,600
We do public policy, civil litigation, research, and then obviously also our parents' center,
359
00:26:40,600 --> 00:26:44,120
which we hope will launch later in this year.
360
00:26:44,120 --> 00:26:50,760
But again, you know, every one of those strategies, you know, the one informs the other, and,
361
00:26:50,760 --> 00:26:53,680
you know, we see that work.
362
00:26:53,680 --> 00:27:02,200
And ultimately, our main focus is the mass scale prevention of sexual exploitation.
363
00:27:02,200 --> 00:27:08,600
And it can be done by using these different prongs, you know, you can actually address
364
00:27:08,600 --> 00:27:14,120
the issue at all the different levels, and ultimately it comes down to a systemic addressing
365
00:27:14,120 --> 00:27:17,320
of the issue of sexual exploitation.
366
00:27:17,320 --> 00:27:20,480
You've mentioned some of the research the center does in the white paper.
367
00:27:20,480 --> 00:27:21,880
Where can people learn more?
368
00:27:21,880 --> 00:27:29,920
Oh, well, you know, all of this is on our website, endsexualexploitation.org.
369
00:27:29,920 --> 00:27:30,920
Please go to that website.
370
00:27:30,920 --> 00:27:35,200
I mean, all our research is there, our publications.
371
00:27:35,200 --> 00:27:40,040
We have some amazing publications coming down the pike.
372
00:27:40,040 --> 00:27:46,000
Just some of our recent publications, you know, revolved around issues like demand reduction,
373
00:27:46,000 --> 00:27:50,400
the role of men in purchasing sexual access to women and children, how to end that.
374
00:27:50,400 --> 00:27:52,000
That was a federal study.
375
00:27:52,000 --> 00:27:55,400
We did that on a National Institute of Justice grant.
376
00:27:55,400 --> 00:28:02,600
And then we have a big, big research project that's launching in the next 35 days.
377
00:28:02,600 --> 00:28:04,520
And that's called Beyond a Fantasy.
378
00:28:04,520 --> 00:28:08,880
And it's on the issue of pornography, but also image-based sexual abuse.
379
00:28:08,880 --> 00:28:15,480
And then we veer into the issue of artificial intelligence and, you know, synthetic materials.
380
00:28:15,480 --> 00:28:21,880
So all of that is available on our website, the National Center on Sexual Exploitation's
381
00:28:21,880 --> 00:28:24,640
website, endsexualexploitation.org.
382
00:28:24,640 --> 00:28:33,680
But our research also appears on academic platforms, Chris, like ResearchGate and academia.edu.
383
00:28:33,680 --> 00:28:34,680
So yeah.
384
00:28:34,680 --> 00:28:35,680
Great.
385
00:28:35,680 --> 00:28:36,680
Thank you for that.
386
00:28:36,680 --> 00:28:40,600
What are the biggest challenges the Center is currently tackling in the fight against
387
00:28:40,600 --> 00:28:45,120
sexual exploitation?
388
00:28:45,120 --> 00:28:50,360
I think some of the biggest challenges at the moment, I think I'm going to just touch
389
00:28:50,360 --> 00:28:57,960
on, again, on, you know, the globalization of indifference, you know, getting people
390
00:28:57,960 --> 00:29:05,080
to see what we've seen and observed and researched over the last six decades.
391
00:29:05,080 --> 00:29:14,600
And I think the media, mainstream media, popular culture often, and the message that people
392
00:29:14,600 --> 00:29:21,800
are bombarded with is totally antithetical to what we see every day in our work.
393
00:29:21,800 --> 00:29:29,400
And, you know, whether it's research, whether it is our civil lawsuits, you know, once you
394
00:29:29,400 --> 00:29:37,280
go into the rudiments of these cases, then you do realize that something like a brothel,
395
00:29:37,280 --> 00:29:42,600
which a lot of people globally, and there's also been a move locally in America to call
396
00:29:42,600 --> 00:29:48,640
for the decriminalization of prostitution, some segments, right?
397
00:29:48,640 --> 00:29:53,680
It's definitely not a widespread view that something like that needs to happen, but there's
398
00:29:53,680 --> 00:30:00,640
constant lobbying in every state for prostitution to become legal or decriminalized.
399
00:30:00,640 --> 00:30:07,600
And they often use this very kind of romantic, highfalutin language, and often,
400
00:30:07,600 --> 00:30:14,080
you know, the Pretty Woman movie scenario comes across where people can really do this
401
00:30:14,080 --> 00:30:19,920
and sell, you know, their bodies, and it comes at no harm, and that men are just kind of
402
00:30:19,920 --> 00:30:24,760
benign, you know, guys that are just doing what guys normally do.
403
00:30:24,760 --> 00:30:31,960
So there's all of these rape myths that is entirely, you know, there's a lot.
404
00:30:31,960 --> 00:30:37,920
So these two pieces just don't correlate or align.
405
00:30:37,920 --> 00:30:44,120
And so that is one of the biggest challenges: having people see it, moving people from
406
00:30:44,120 --> 00:30:49,320
a no to a maybe, and from a maybe to a yes, this is a problem.
407
00:30:49,320 --> 00:30:57,200
And yes, I have a responsibility as a parent, as a husband, you know, as a religious spiritual
408
00:30:57,200 --> 00:31:03,280
leader, as a teen, not to entertain, not to indulge in these things, because there are
409
00:31:03,280 --> 00:31:05,160
real world consequences.
410
00:31:05,160 --> 00:31:12,960
But you know, so we often find ourselves working against the stream of popular culture.
411
00:31:12,960 --> 00:31:22,600
And what is put out there, so it's a constant clash of these two irreconcilable ideological
412
00:31:22,600 --> 00:31:26,840
framings of what is good, what is right, and what is dignity.
413
00:31:27,000 --> 00:31:30,920
We are in the business of protecting human dignity at its core.
414
00:31:30,920 --> 00:31:39,000
And what the world and mainstream media often send out is entirely in conflict with that.
415
00:31:39,000 --> 00:31:42,400
You've mentioned how the National Center on Sexual Exploitation takes a multi-pronged
416
00:31:42,400 --> 00:31:47,160
approach to combating sexual exploitation through policy advocacy, legal action, corporate
417
00:31:47,160 --> 00:31:50,880
accountability, survivor support, and public awareness.
418
00:31:50,880 --> 00:31:53,480
I always like to take these one at a time, though I might not have them in the order
419
00:31:53,480 --> 00:31:54,480
of importance.
420
00:31:54,800 --> 00:31:59,200
So first, policy and legislative advocacy, the Take It Down Act.
421
00:31:59,200 --> 00:32:03,640
How does the center work with lawmakers and government agencies to shape policies?
422
00:32:03,640 --> 00:32:04,640
Great question.
423
00:32:04,640 --> 00:32:10,680
So again, we are a nonpartisan organization, so I think that's often, you know, we believe
424
00:32:10,680 --> 00:32:17,000
that has really worked well over the last, you know, 60 plus years, Chris, is the fact
425
00:32:17,000 --> 00:32:27,320
that, you know, we go out of our way to make sure that we, you know, pursue policies where
426
00:32:27,320 --> 00:32:29,440
there is widespread buy-in.
427
00:32:29,440 --> 00:32:34,440
And that's just on the issue of nonpartisanship.
428
00:32:34,440 --> 00:32:40,960
The issue of the Take It Down Act is just such a great example of unanimous kind of
429
00:32:40,960 --> 00:32:47,880
buy-in from people on all sides of the political spectrum.
430
00:32:47,880 --> 00:32:56,120
The same with CDA 230, the Communications Decency Act, which really gives immunity
431
00:32:56,120 --> 00:33:04,040
to big tech platforms, you know, hosting and perpetuating this content online.
432
00:33:04,040 --> 00:33:08,760
You know, everybody on that, I mean, we were in the Senate Judiciary hearing when these
433
00:33:08,760 --> 00:33:17,320
big technology CEOs, you know, Mark Zuckerberg, X, Twitter, TikTok, when the parents came
434
00:33:17,320 --> 00:33:23,720
into the room and everybody had the pictures of their children, you know, who are 16 forever
435
00:33:23,720 --> 00:33:28,280
or 17 forever, you know, many of them died by suicide.
436
00:33:28,280 --> 00:33:34,520
And that's been very powerful, seeing how we can make a difference by getting people
437
00:33:34,520 --> 00:33:41,240
together into the same room and really listening to parents, you know, listening to those very
438
00:33:41,240 --> 00:33:47,720
real world stories of a child losing their life.
439
00:33:47,720 --> 00:33:54,400
So we try our very best to follow this approach and it's really worked well.
440
00:33:54,400 --> 00:34:01,000
So ultimately, from a public policy perspective, it's fighting bad laws and making sure that
441
00:34:01,000 --> 00:34:03,600
we are able to pass good laws.
442
00:34:03,600 --> 00:34:11,640
And the Take It Down Act is just one of those that we are very, very excited about, yeah.
443
00:34:11,640 --> 00:34:15,760
And how about the new plague that we touched on briefly before about sextortion?
444
00:34:15,760 --> 00:34:16,760
Oh.
445
00:34:16,760 --> 00:34:20,560
I recently read an article, I forget where it was, I think it was People Magazine, about
446
00:34:20,560 --> 00:34:25,160
how teenage boys are the ones with the highest suicide rate right now because they're getting
447
00:34:25,160 --> 00:34:30,800
lured in by people who come off as being a friend and this one particular 14 year old
448
00:34:31,600 --> 00:34:35,800
boy in Wisconsin, it happened within a matter of hours of them luring him in and just him
449
00:34:35,800 --> 00:34:37,920
taking his life.
450
00:34:37,920 --> 00:34:39,360
It's very, very sad, Chris.
451
00:34:39,360 --> 00:34:48,400
So just in a year, I think it's between 2023 and 2024, there was a 150% increase in reports
452
00:34:48,400 --> 00:34:52,600
on the issue of sextortion.
453
00:34:52,600 --> 00:35:00,600
So yes, I mean, you know, what this fundamentally comes down to is that somebody, it's usually,
454
00:35:00,600 --> 00:35:06,840
let's say, a young girl befriending somebody on the other side of the line, which could
455
00:35:06,840 --> 00:35:09,520
be a teenage boy.
456
00:35:09,520 --> 00:35:15,480
But this girl isn't really a real person or a real girl, you know, it's a predator.
457
00:35:15,480 --> 00:35:19,920
And we've seen the research, and I've personally worked on cases involving some of
458
00:35:19,920 --> 00:35:27,560
these criminal networks, you know, Nigerian criminal networks on the African continent.
459
00:35:27,560 --> 00:35:32,560
But more of the recent cases have been directly connected to some of these criminal networks,
460
00:35:32,560 --> 00:35:41,040
you know, so you would have somebody reaching out purporting to be a 14-year-old girl.
461
00:35:41,040 --> 00:35:46,120
There's a relationship starting and then there's a request from the girl, which is obviously
462
00:35:46,120 --> 00:35:53,000
the predator, the perpetrator, for the boy to send explicit images.
463
00:35:53,000 --> 00:35:55,480
Those images are then passed along.
464
00:35:55,480 --> 00:36:02,120
And as soon as that image is submitted, and as soon as that image lands in the, you know,
465
00:36:02,120 --> 00:36:08,160
the proverbial hands of the perpetrator, he makes himself known or not.
466
00:36:08,160 --> 00:36:13,120
But what happens is that's where there is a request for money.
467
00:36:13,120 --> 00:36:20,360
And we're not talking about small amounts, you know, $10 or $20, I mean, $1,000, $3,000, $5,000.
468
00:36:20,400 --> 00:36:27,640
And if that money is not paid over, these images will then be released to their friends
469
00:36:27,640 --> 00:36:31,000
and their connections.
470
00:36:31,000 --> 00:36:36,640
And we've seen, I mean, Paul Raffile, he's an open source intelligence analyst, and he's
471
00:36:36,640 --> 00:36:40,120
done phenomenal work on the issue of sextortion.
472
00:36:40,120 --> 00:36:49,760
And there are several examples, Chris, where a connection with a user, a minor, let's say
473
00:36:49,840 --> 00:36:54,800
a teenage boy, 16 years old, on the one end of the phone, and let's
474
00:36:54,800 --> 00:36:58,880
say hypothetically at 11 o'clock, that connection was made.
475
00:36:58,880 --> 00:37:02,000
And at 11 o'clock, the rapport was built.
476
00:37:02,000 --> 00:37:05,200
Five minutes later, the image was sent.
477
00:37:05,200 --> 00:37:10,920
Literally within 20 or 30 minutes, that child is no more.
478
00:37:11,080 --> 00:37:20,040
So there's so many examples out there of a death by suicide that happens within 20, 30
479
00:37:20,040 --> 00:37:24,280
minutes, 40 minutes, and it's tragic.
480
00:37:24,280 --> 00:37:27,520
You touched earlier on the legal action you're part of with the amicus brief in the Supreme
481
00:37:27,520 --> 00:37:28,520
Court.
482
00:37:28,520 --> 00:37:33,800
You've just touched on corporate accountability, but how about survivor support?
483
00:37:33,800 --> 00:37:34,800
How do you get engaged with that?
484
00:37:34,800 --> 00:37:38,360
You know, how do you connect with the survivors of sex trafficking and pornography, and what
485
00:37:38,360 --> 00:37:40,880
support do you provide?
486
00:37:40,880 --> 00:37:41,880
Great question.
487
00:37:41,880 --> 00:37:47,920
I mean, at the end of the day, we truly believe survivors and their voices and their lived
488
00:37:47,920 --> 00:38:00,080
experiences are extremely important, indeed indispensable, when targeted and well-informed policy and
489
00:38:00,080 --> 00:38:03,880
new laws are created.
490
00:38:03,880 --> 00:38:07,360
We are not a frontline service provider.
491
00:38:07,600 --> 00:38:16,320
We don't provide frontline psychosocial services and support to survivors, but we
492
00:38:16,320 --> 00:38:21,120
do include survivors in every aspect of our work.
493
00:38:21,120 --> 00:38:29,440
From a policy perspective, at state level, testifying in support of or against a specific
494
00:38:29,440 --> 00:38:36,160
act, we always have survivors at the front end testifying, and we provide that support.
495
00:38:36,160 --> 00:38:44,520
We assist with the briefing and enable survivors to get out there and to get into those very
496
00:38:44,520 --> 00:38:48,960
important platforms and arenas where their voices are to be heard.
497
00:38:48,960 --> 00:38:53,520
And obviously, with our law center, when we talk about our civil litigation, you know,
498
00:38:53,520 --> 00:38:57,920
we are representing survivors, or Jane Does, as we refer to them.
499
00:38:57,920 --> 00:39:06,720
You know, in the XVideos case, we've got a case against X or Twitter that's still ongoing.
500
00:39:06,720 --> 00:39:13,160
We have the case against, you know, a brothel, two brothels in Nevada.
501
00:39:13,160 --> 00:39:18,720
We are involved with the Andrew and Tristan Tate matter, these two influencers, that's a separate
502
00:39:18,720 --> 00:39:21,440
conversation as well, very problematic.
503
00:39:21,440 --> 00:39:26,160
But in all of these cases, we have plaintiffs and we represent them.
504
00:39:26,240 --> 00:39:32,320
So that's how we work alongside survivors, representing them and making sure that they
505
00:39:32,320 --> 00:39:35,040
get justice that they deserve.
506
00:39:35,040 --> 00:39:39,600
And also in our research, you know, when we've completed a research study, you know, we would
507
00:39:39,600 --> 00:39:45,440
make sure to make that available to the survivor community to speak into that, you know, and
508
00:39:45,440 --> 00:39:48,160
to poke holes if holes need to be poked into it.
509
00:39:48,880 --> 00:39:53,120
So it's very important, you know, survivors have that lived experience and they, you
510
00:39:53,120 --> 00:39:59,200
know, they have that insider perspective that many researchers or advocates or policymakers
511
00:39:59,200 --> 00:40:00,000
simply don't have.
512
00:40:00,000 --> 00:40:04,240
And I always tell people, we all have blind spots and we all don't always know when we
513
00:40:04,240 --> 00:40:07,200
don't know, and survivors help us with that.
514
00:40:08,880 --> 00:40:10,320
And certainly there's public awareness.
515
00:40:11,040 --> 00:40:14,080
Appearing on Next Steps Forward raises public awareness, but we know that that's not enough.
516
00:40:14,800 --> 00:40:15,200
Yeah.
517
00:40:15,200 --> 00:40:17,600
What are your most effective public awareness channels?
518
00:40:17,600 --> 00:40:20,000
What messages do you want to stay with people?
519
00:40:20,960 --> 00:40:29,600
Well, the one that is forthcoming: on Thursday, we are launching our Dirty Dozen
520
00:40:29,600 --> 00:40:31,520
list, our annual Dirty Dozen list.
521
00:40:33,280 --> 00:40:39,520
And I think one of the first messages, I mean, at a meta level is to say that, man, a world
522
00:40:39,520 --> 00:40:43,520
free from sexual abuse and exploitation is possible.
523
00:40:43,520 --> 00:40:44,960
It's not a pipe dream.
524
00:40:45,360 --> 00:40:51,440
There are very real things that can be done and happen.
525
00:40:51,440 --> 00:40:56,800
And that's on all of these levels, legal, you know, we often talk about the PESTEL model.
526
00:40:56,800 --> 00:41:03,120
Now, PESTEL, it's P-E-S-T-E-L, that's usually the model that we use to do scenario planning.
527
00:41:03,120 --> 00:41:05,040
I've done scenario planning in my life.
528
00:41:05,040 --> 00:41:10,800
And basically what that denotes, you've got the political, the economic, the sociological,
529
00:41:10,800 --> 00:41:14,960
the technological, the environmental, and the legal environment.
530
00:41:15,760 --> 00:41:24,080
And in every one of these spheres, active citizenry is critical.
531
00:41:24,080 --> 00:41:28,400
And the issue of fighting sexual exploitation overlaps in all of these spheres.
532
00:41:28,400 --> 00:41:35,760
So by addressing all of these spheres through civil litigation, public policy,
533
00:41:35,760 --> 00:41:41,440
corporate advocacy, research, and our parent center, you can fundamentally move the
534
00:41:41,440 --> 00:41:47,600
needle, and on a mass scale, a world free from sexual abuse and exploitation is possible.
535
00:41:47,600 --> 00:41:48,640
So it's not a pipe dream.
536
00:41:48,640 --> 00:41:53,360
So the first thing is for people to know, you can actually do something about this.
537
00:41:55,040 --> 00:41:56,400
This can really happen.
538
00:41:57,040 --> 00:42:04,720
And the other one is that people, parents, have immense power in speaking into and holding
539
00:42:04,720 --> 00:42:07,680
accountable big technology platforms.
540
00:42:07,680 --> 00:42:16,240
We know that big technology and big tech, they don't fear law enforcement.
541
00:42:16,240 --> 00:42:18,480
There's very little that they fear.
542
00:42:19,200 --> 00:42:22,880
They don't self-police because nobody's really holding them accountable.
543
00:42:22,880 --> 00:42:29,680
But what they do fear, Chris, is when that day arrives, and it's going to come, where a single
544
00:42:29,680 --> 00:42:37,760
parent can step into a court and provide the evidence that platform A, B, C, or D was
545
00:42:38,480 --> 00:42:48,320
directly or indirectly complicit in a situation of harm that befell a minor and that, hypothetically
546
00:42:48,320 --> 00:42:52,000
or in reality in the case of sextortion, ultimately led to somebody's death.
547
00:42:52,880 --> 00:42:56,240
We are not talking about thousands or hundreds of thousands of dollars.
548
00:42:56,240 --> 00:42:59,600
These are going to be multi-million dollar claims.
549
00:42:59,600 --> 00:43:03,360
And I think that is something that big tech fears.
550
00:43:03,360 --> 00:43:05,280
And it's not just going to be one parent.
551
00:43:05,280 --> 00:43:09,520
There are going to be countless parents who are going to be empowered to do this.
552
00:43:09,520 --> 00:43:14,000
So the message there to parents is everybody can do something.
553
00:43:14,000 --> 00:43:21,280
You know, active citizenry, you know, be involved with signing these sign-ons and these petitions.
554
00:43:21,920 --> 00:43:28,560
And that's why we are there to hold parents' arms up in their respective journeys as they are
555
00:43:28,560 --> 00:43:36,240
grappling with navigating this very, very complex landscape of technology, policy, law,
556
00:43:36,240 --> 00:43:38,800
and again, the globalization of indifference.
557
00:43:38,800 --> 00:43:42,640
Because you need people who sincerely care and who are sincerely
558
00:43:42,640 --> 00:43:46,800
invested in bringing about this world that we truly believe in.
559
00:43:48,080 --> 00:43:50,880
You know, and to that point, in today's world, everyone's busy.
560
00:43:50,880 --> 00:43:52,720
Everyone's tugged in a thousand different directions,
561
00:43:53,360 --> 00:43:55,760
but you have to be involved in your children's lives.
562
00:43:56,960 --> 00:44:00,640
Check their phones, check their tablets, whether they're asleep or not,
563
00:44:00,640 --> 00:44:04,400
but make that part of the policy just so you know exactly what's going on to hopefully prevent
564
00:44:05,040 --> 00:44:07,600
some of these sextortion cases and other types of tragic events.
565
00:44:08,960 --> 00:44:10,960
A moment ago, you mentioned the Dirty Dozen list.
566
00:44:12,000 --> 00:44:13,600
Who is typically on that list?
567
00:44:13,600 --> 00:44:15,280
And are there repeat offenders on there?
568
00:44:15,840 --> 00:44:22,560
Yeah, I mean, you know, we've, again, we've got an entire page for the Dirty Dozen list,
569
00:44:22,560 --> 00:44:26,800
you know, and, you know, everybody's been on the list pretty much, you know,
570
00:44:26,800 --> 00:44:27,840
all technology companies.
571
00:44:27,840 --> 00:44:37,120
I mean, Meta has been on there more than once, you know, Apple, Spotify, TikTok, Cash App.
572
00:44:37,760 --> 00:44:46,080
So, you know, and these are all platforms that we show through actual evidence-based research,
573
00:44:46,960 --> 00:44:53,520
how they enable and perpetuate sexual exploitation on their platforms.
574
00:44:53,520 --> 00:45:00,000
I mean, you know, just one example, and that's also where the Take It Down Act comes in,
575
00:45:00,000 --> 00:45:08,800
you know, where an example would be where, you know, non-consensual image-based sexual abuse
576
00:45:08,800 --> 00:45:17,760
content has been uploaded by a third party on a platform and where a victim who, you know,
577
00:45:17,760 --> 00:45:25,280
who fell victim to that illegal content being uploaded on a platform has tried for months and
578
00:45:26,000 --> 00:45:28,400
even years on end to get that taken down, Chris.
579
00:45:29,120 --> 00:45:35,280
And with, you know, no solution, you know, and literally having to live
580
00:45:36,160 --> 00:45:41,200
knowing that. I've actually got a response from some of these platforms saying, well, it doesn't,
581
00:45:41,760 --> 00:45:51,360
you know, it's not against our, you know, our regulations and laws as a company.
582
00:45:51,440 --> 00:45:57,520
So those are just some examples, and you can imagine the harm, you know, and the trauma,
583
00:45:57,520 --> 00:46:02,240
it's like this digital tattoo that you carry, and that's where the Take It Down Act will now come in.
584
00:46:02,800 --> 00:46:09,040
And that's about, you know, first and foremost, you know, age verification plays into that,
585
00:46:09,040 --> 00:46:12,800
but also that this content must be removed within 48 hours.
586
00:46:14,560 --> 00:46:20,800
So, yes, but some of the successes we've had over the years, I mean,
587
00:46:20,800 --> 00:46:29,120
those include, we've had transformative victories with Google, Netflix, Hilton Worldwide, Verizon,
588
00:46:29,920 --> 00:46:34,240
Walmart, and even the US Department of Defense, you know.
589
00:46:34,240 --> 00:46:39,760
So, you know, the Dirty Dozen list has really had a phenomenal impact.
590
00:46:39,760 --> 00:46:46,800
And just to give you a heads up, this year, we are probably addressing one of the biggest,
591
00:46:46,800 --> 00:46:54,000
biggest giants that if this giant falls, you know, the echo and the reverberation will be felt
592
00:46:54,880 --> 00:46:57,040
globally, but it's a big one.
593
00:46:57,040 --> 00:47:02,480
So I hope some of the viewers and the listeners could join us on Thursday.
594
00:47:03,360 --> 00:47:05,360
And that's endsexualexploitation.org.
595
00:47:06,160 --> 00:47:10,720
That's endsexualexploitation.org, yeah.
596
00:47:10,720 --> 00:47:18,960
And, you know, you can just search NCOSE, N-C-O-S-E, Dirty Dozen List 2025, and
597
00:47:18,960 --> 00:47:22,320
you'll definitely be directed to the website.
598
00:47:23,360 --> 00:47:25,760
So I know that the Take It Down Act has passed the Senate.
599
00:47:26,320 --> 00:47:27,600
Now it's back before the House.
600
00:47:28,800 --> 00:47:30,080
Do we expect it to be passed?
601
00:47:30,080 --> 00:47:32,560
Is President Trump going to see this anytime soon?
602
00:47:32,560 --> 00:47:35,040
Who are the cheerleaders on the Hill in both chambers?
603
00:47:35,520 --> 00:47:44,960
Oh, I mean, we are just in awe of Senators Cruz and Klobuchar, who championed this.
604
00:47:45,840 --> 00:47:49,120
So massive kudos to them, Chris.
605
00:47:49,120 --> 00:47:55,680
But yeah, I was in the room about three weeks ago with the First Lady, Melania Trump. Me and
606
00:47:55,680 --> 00:48:03,600
our VP, Dr. Eleanor Gaetan, were invited to be in the room where she briefed the media.
607
00:48:03,600 --> 00:48:06,160
There was obviously more media in the room than anybody else.
608
00:48:06,160 --> 00:48:14,960
But we were there when she gave her vote of support for the Take It Down Act.
609
00:48:14,960 --> 00:48:25,120
And she did a phenomenal job in just laying out this harm and just the mean-spiritedness
610
00:48:25,120 --> 00:48:26,400
of what is out there.
611
00:48:26,960 --> 00:48:35,120
On both her left- and right-hand sides, she had 10 survivors of image-based sexual
612
00:48:35,120 --> 00:48:41,360
abuse who each, in a very, very courageous way, shared their own experience with this
613
00:48:41,360 --> 00:48:48,800
illicit content being non-consensually uploaded on these platforms and just the incredible
614
00:48:49,360 --> 00:48:51,040
trauma that goes along with that.
615
00:48:51,040 --> 00:48:53,840
So she almost took on this mother-figure role.
616
00:48:54,800 --> 00:49:00,720
She was incredibly articulate in how she laid out the problem, but at the same time, just
617
00:49:00,720 --> 00:49:07,440
her generosity of spirit just shone through in the way that she encouraged just these
618
00:49:07,440 --> 00:49:11,600
four young women, young teens, who were sitting next to her.
619
00:49:11,600 --> 00:49:13,440
So she's a hero in all of this.
620
00:49:14,480 --> 00:49:20,800
And I think Eleanor, one of our public policy people, mentioned she can't remember ever a
621
00:49:20,800 --> 00:49:30,160
first lady standing up and speaking and giving her support so strongly for a specific piece
622
00:49:30,160 --> 00:49:34,880
of legislation that we hope and believe will pass this year.
623
00:49:36,160 --> 00:49:39,520
We mentioned briefly before the show that people think this doesn't happen in their
624
00:49:39,520 --> 00:49:43,440
neighborhood, that it's only in the shadows, it's not going to affect them.
625
00:49:44,880 --> 00:49:47,840
But how do these people know where to find the sex traffickers right there in their own
626
00:49:47,840 --> 00:49:48,400
communities?
627
00:49:49,280 --> 00:49:51,680
Again, that's where the technology comes in, right?
628
00:49:51,680 --> 00:49:58,160
We know that sex buyers, people that purchase sexual access to women and children and to
629
00:49:58,160 --> 00:50:04,880
other men as well, the way they communicate, there's always an online platform through
630
00:50:04,880 --> 00:50:11,280
which they are able to reach traffickers, pimps, and we all have mobile phones as well.
631
00:50:12,240 --> 00:50:20,800
So it's a criminal ecosystem that allows these crimes to flourish.
632
00:50:20,800 --> 00:50:30,880
But again, just to your point, what is very clear is sex traffickers intentionally offer
633
00:50:30,880 --> 00:50:37,520
children and women to be raped at a price, and men, those who purchase sexual access,
634
00:50:37,520 --> 00:50:40,400
intentionally pay to rape
635
00:50:41,600 --> 00:50:47,840
women and children. They care precious little in terms of who and how they harm people.
636
00:50:49,600 --> 00:50:56,640
And usually there is an online connection and people talk, people self-organize and
637
00:50:56,640 --> 00:51:01,760
they create these criminal economies, which in their own way self-organize.
638
00:51:01,760 --> 00:51:09,600
But one of our biggest strategies here is addressing consumer-level demand.
639
00:51:09,600 --> 00:51:13,520
And we always say without sex buyers, sex traffickers cannot exist.
640
00:51:13,520 --> 00:51:20,880
It's sex buyers who create the market forces, and it's those market forces that create
641
00:51:20,880 --> 00:51:22,480
the role of traffickers.
642
00:51:22,480 --> 00:51:29,680
So without a man purchasing sexual access to a woman or child, without that person,
643
00:51:29,680 --> 00:51:31,920
a trafficker cannot exist.
644
00:51:32,000 --> 00:51:41,120
And every sex trafficking act has been paid for by a dollar from a male sex buyer.
645
00:51:41,120 --> 00:51:46,960
So I always say sex buyers are the critical missing link in this equation.
646
00:51:49,120 --> 00:51:51,040
Marcel, we only have a few minutes left.
647
00:51:51,040 --> 00:51:54,640
Take us to the end of the show with some words of wisdom and how we can help support your
648
00:51:54,640 --> 00:51:58,240
cause and end this devastating crisis.
649
00:51:58,960 --> 00:52:04,080
Well, I think firstly, I mean, please do follow us on social media.
650
00:52:05,840 --> 00:52:11,600
You know, we are on X, NCOSE is on LinkedIn as well.
651
00:52:11,600 --> 00:52:13,440
They can follow me on LinkedIn as well.
652
00:52:13,440 --> 00:52:19,840
I am on there and I post weekly about the work we do. And always take action.
653
00:52:19,840 --> 00:52:25,120
I mean, on our website, again, endsexualexploitation.org, you know, we always have an action page.
654
00:52:25,600 --> 00:52:31,840
Whatever the issues are, if you're going to click on the issue of prostitution or pornography
655
00:52:31,840 --> 00:52:35,760
or child-on-child sexual abuse, there's always an action page.
656
00:52:35,760 --> 00:52:41,520
It will take you literally 15 seconds to complete one of those actions.
657
00:52:41,520 --> 00:52:43,840
Please do take those actions, but also take hope.
658
00:52:43,840 --> 00:52:49,760
I truly believe, Chris, there's a sleeping giant of common sense that's waking up finally.
659
00:52:50,720 --> 00:52:55,920
And I think when that giant of common sense wakes up, it's going to dispel a lot of rape
660
00:52:55,920 --> 00:52:56,320
myths.
661
00:52:56,320 --> 00:53:02,480
It's going to dispel a lot of ideas in terms of what masculinity is, what fatherhood is,
662
00:53:02,480 --> 00:53:03,600
what leadership is.
663
00:53:04,160 --> 00:53:11,440
And it's also going to dispel this idea that women in pornography enjoy the sex that people
664
00:53:11,440 --> 00:53:14,720
see from the outside looking in, which is not the case.
665
00:53:14,720 --> 00:53:20,080
So I just think that's kind of the very broad framing that I'm using of common sense that I
666
00:53:20,080 --> 00:53:23,120
truly believe will eventually set in.
667
00:53:23,120 --> 00:53:30,240
And I think we could expedite the rousing of that sleeping giant by everybody literally
668
00:53:30,240 --> 00:53:33,680
taking action, you know, on our website, following our social media.
669
00:53:34,560 --> 00:53:36,560
So, yeah, there's hope.
670
00:53:36,560 --> 00:53:38,000
It's not a pipe dream.
671
00:53:38,000 --> 00:53:39,040
This can happen.
672
00:53:39,760 --> 00:53:41,120
Hear that, everybody listening?
673
00:53:41,120 --> 00:53:41,840
Get on the website.
674
00:53:41,840 --> 00:53:43,360
It's time to wake that sleeping giant.
675
00:53:43,600 --> 00:53:47,360
Dr. Marcel van der Watt, thank you so much for the important work that you do and really
676
00:53:47,360 --> 00:53:48,160
thank you for being here.
677
00:53:49,040 --> 00:53:49,760
Thank you, Chris.
678
00:53:49,760 --> 00:53:50,560
Thanks for having me.
679
00:53:51,280 --> 00:53:52,640
I'm Chris Meek. We're out of time.
680
00:53:52,640 --> 00:53:53,600
We'll see you next week.
681
00:53:53,600 --> 00:53:54,960
Same time, same place.
682
00:53:54,960 --> 00:53:58,320
Until then, stay safe and keep taking your next steps forward.
683
00:54:03,040 --> 00:54:06,080
Thanks for tuning in to Next Steps Forward.
684
00:54:06,080 --> 00:54:10,320
Be sure to join Chris Meek for another great show next Tuesday at 10 a.m.
685
00:54:11,280 --> 00:54:13,120
Pacific Time and 1 p.m.
686
00:54:13,120 --> 00:54:16,960
Eastern Time on the Voice America Empowerment Channel.
687
00:54:16,960 --> 00:54:21,040
This week, make things happen in your life.