🎂 Julie Cordua & Joe Sullivan Fireside Chat
Presented by: Joe Sullivan, Julie Cordua
Originally aired on March 29, 2022 @ 9:30 PM - 10:00 PM EDT
2021 marks Cloudflare’s 11th birthday, and each day this week we will announce new products and host fascinating discussions with guests including product experts, customers, and industry peers.
In this Cloudflare TV segment, we will have a fireside chat between Joe Sullivan (SVP, Chief Security Officer of Cloudflare) and Julie Cordua (CEO of Thorn).
Find all of our Birthday Week announcements and CFTV segments at the Birthday Week hub
Transcript (Beta)
Hi, everyone. Welcome to Cloudflare TV. My name is Joe Sullivan. I'm the Chief Security Officer here at Cloudflare, and I'm happy to be the host of this session.
It's Friday of our birthday week here at Cloudflare.
This is the week where we focus on celebrating, on giving back to the Internet, and on thinking about where the Internet should go and what role we all play in making it a better place.
Our mission at Cloudflare is to help make a better Internet.
I'm really happy to have as my guest here today, Julie Cordua, who's going to talk a little bit about one of the challenging areas in terms of the Internet and Internet safety.
Julie, why don't you take a moment to introduce yourself to the audience?
Yeah, thanks. Thanks for having me, Joe. I am the CEO of an organization called Thorn.
We build technology with one goal in mind: ending online child sexual abuse.
Our focus near term, and maybe we'll get into this, is really the elimination of child sexual abuse material, which in the law is called child pornography. Throughout this interview, though, you'll hear me not use that term, because it isn't pornography.
It's like the farthest thing from pornography.
It's actually the documentation of the rape and sexual assault of children.
Our goal is to eliminate that from the Internet, and then more broadly to create an Internet that doesn't have forms of child sexual exploitation in general, because we're seeing emerging crimes like online grooming leading to the self-production of CSAM, and other new crimes we can talk about.
We're a nonprofit that looks very much like a small software company. We build technology, some of which serves law enforcement and some of which serves industry, to try to mobilize those communities to do more and to use technology and data in a smart way to end this crime.
I've known about and worked a little bit with Thorn myself over the years, really since the inception of the organization, before it was even called Thorn.
I remember when you joined and took over the leadership and it's been amazing to see what you've done with this nonprofit since then.
Before we dive into the challenge of the problem, let's talk a little bit about you and your journey.
Did you expect any of what this has become when you started?
No, and I never would have expected this in my career either.
I wanted to be in technology, but when I started my career, that looked like wireless technology.
I worked at Motorola when they were at the top of the game for cell phones, which seems like eons ago.
I went to a startup wireless company after that, and then I got into social enterprise with the (RED) campaign, which looked at how you bring the private sector together in a meaningful way around an issue; it was about using the marketing prowess of companies to build products that would fund AIDS medicine in the developing world.
I did that for about five years and then met the founders of Thorn, Ashton Kutcher and Demi Moore.
I met them when they were just starting on their philanthropic journey, and they said, we want to hire someone to help us figure out what this is going to be.
Actually when I started they were looking at child sex trafficking and the role of technology in child sex trafficking.
I didn't know anything about child sex trafficking, and I definitely didn't know anything about where we're at right now, which is this broader world of online child sexual abuse material and exploitation. But I did know about technology, and I knew about public-private partnerships and how you mobilize the talent that is in the private sector to address a social issue.
That was about 10 years ago, when I chose to join them and help craft what their philanthropic investment would look like.
In our first few years, we honed in pretty quickly on the gap in the field, which was that technology was dramatically changing the landscape when it came to child sexual abuse, and there was no concentrated effort to use technology to be part of the good side of the solution. That's where we thought we would fit.
We really thought when we started that we would just be kind of an instigator, an innovator, a light shiner.
We'd research the issue and bring people together. Then, as we dove in deeper and deeper, we realized a couple of things. One, the issue was exploding exponentially, and just shining a light and trying to motivate people wouldn't work, because the incentive structures in this field were misaligned for anyone to take initiative and really build the technology systems that would be needed if we believed there should be an Internet without child sexual abuse. So about four years in, we pivoted to the direction we're headed now. But I had to learn the issue, I had to learn what it was like to work at a nonprofit, and all of that was hard.
I'm not quite sure which was more difficult: learning about a deep, dark issue that most of the world doesn't know about, or learning how to build a company.
Right, yeah. I remember the journey that you took, and one of the things that has impressed me about Thorn is that it's a tech organization.
You build product, and that's so different from most nonprofits, in that you're not just convening to think about issues; you're actually trying to do something technically.
How hard was it to make that decision and how do you actually get tech talent on your team?
As you were talking, it struck me why it wasn't hard to make that decision. We were doing a lot of research, trying to figure out what technology should be built and how technology could be used. We would build prototypes, and then we would go to either a law enforcement entity, if it was about identifying a victim, or to a company, if it was about taking the content down, and we'd say, here's an innovation you could use. And again, there wasn't an incentive structure to actually build that, right?
For a company, the job is generally to prioritize what's going to drive the business metrics, the profit, the growth, not what's going to take content down because it's illegal content or close accounts; that's not usually where the investment is made. And with law enforcement, there isn't a huge investment in new technologies in this space. There were a few cases early on that were brought to our attention. We don't work cases, but people would come to us and say, can you help us find this child? This girl has been sexually assaulted on video every week for a year; we see it, but we don't know where she is. Do you have any technology that could help? Once you know that, and you realize that you do have ideas that could help but there's no one to productize them, the answer becomes really clear.
We went to our board five years in and we said, we have all these ideas, but if we want to be an organization that actually delivers impact, we have to productize this stuff, because there's no one else who's willing to make the investment. So it was about five years ago, in 2015, that we consciously decided to become a product organization.
On the second part of your question, how we attract talent: the first few years were more difficult, because you try to attract talent out of the private sector and you say, we're a nonprofit. That's actually not the way to start. But we got a few people who said, I want to take a break, I want to come try to do something really challenging. Then, as soon as we had products that really demonstrated impact, and we also secured some significant funding from the Audacious Project, we had prototypes that we were able to get funded to scale over five years, and we started to build a reputation as a place engineers could come. The problems we're solving are really hard, and we know engineers love hard problems. When you can demonstrate that, you can build a pretty good brand as a place where people who want to do some of the most meaningful technical work of their lives can come and do it. Now we're almost 100 people, with more than half of our organization engineers and data scientists who are building our products.
That's amazing.
Let's take a minute to talk about why this matters, the extent of the problem. You've acknowledged a couple of times already in our conversation that most of the world that uses the Internet doesn't know about this dark, awful side of it.
I know about it because I started my career with the Department of Justice, and in the 1990s I actually used to prosecute some of these cases. Then, working at the different tech companies I've been at, I've had to deal with the fact that our platforms would get abused for purposes of facilitating this, because there are just people out there who are trying to push it. For me, there have been a couple of stories that have really stuck with me forever and motivated me to want to make a difference.
Are there any particular ones that come to mind for you?
Yeah, I think, well, the first part of the question was just: why does it matter, and even, what is it?
So I spoke a little earlier about this: when we talk about child sexual abuse material, most people think of it as child pornography. Before the Internet, people would trade it through the mail, and it was really hard to produce; you had to have a lab where you produced the images. But as soon as the Internet came along, you could share an image or a video with the push of a button, and with how quickly all of the technologies have developed to make that even easier, the market for this material, the spread of it, just exploded, for all the reasons that many other communities exploded online: all of a sudden you could connect with people all around the world who had interests like you did.
So you actually see today almost every platform in the world that has an upload button being used to upload child sexual abuse material. It is shared over email, it is shared over chat, there are groups that form and trade on open web platforms and also on dark web platforms. On literally every element of the Internet where you can trade an image or a video, you will most likely find this content.
What that has done is a couple of things. One, it has actually increased the market for images and videos. Think of the people who have access to children and who are abusing: if they belong to a group on some of these open web platforms with hundreds of members who say, hey, username XYZ, I haven't seen many pictures of your little girl lately, when are you going to drop a new series? That incentivizes a situation where more images and more videos are produced. And it normalizes the behavior, so someone who maybe felt alone and ostracized, and wouldn't have felt normalized, now has a whole community of people who tell them it's okay.
The second part of this is that when these children are identified and removed from harm, their content lives on for the rest of their lives. We know children today who are trying to grow up whose abuse was published with their name and their city, and they have since had to change their identity multiple times; their family has had to move homes multiple times. We know others who have been identified in public by someone who had seen their material. For some of these kids who are trying to live, their images and videos circulate on the platforms we use every day, again, anywhere with an upload button: Facebook, Twitter, Instagram, everywhere. They get reported to the National Center for Missing and Exploited Children; if a tech company finds an image, they have to report it. One child can have, over the course of their lifetime, their image reported upwards of 500,000 or 600,000 times, and that's only what has been found, and very few companies are actually looking.
I can get into more of the details around this, but the law in the U.S. is an interesting one, in that tech companies are not required to look for this material, but they're required to report it if they find it, and most companies don't proactively look for it. So even though we know that last year about 70 million images and videos were reported by U.S. tech companies to the National Center, that's a tiny little sliver, because only a handful of companies have actually deployed proactive detection mechanisms. It's a scary thought to think how much more is out there that we haven't even found.
Right. Yeah, it's funny, we talked about this a while back. When I was CSO at Facebook, we implemented PhotoDNA and we scanned every single photo that went through our system. As a result, Facebook statistically looks like the worst of the worst in terms of the NCMEC reports, because Facebook set out to proactively look for and detect this abuse. It seems wrong that the company gets called out negatively for the volume, when in fact the only reason there's awareness of the volume is because they chose to look for it. We don't want to punish companies that go look for it; we want to thank them for it.
Yeah, and that's part of what we're trying to do, just shift this culture. One, we should talk about this issue. It's a deep, dark issue that no one wants to talk about, because it's really painful and sad and hard. But every single place there's an upload button, it will be used for child sexual abuse material. So if you're building something, just know that. It's not your fault; you didn't create a place that attracts abusers, there are abusers everywhere. Though I do believe it is your responsibility, then, to know that and take action to ensure that your platform isn't being abused in this way. So we're trying to shift this culture to say good citizenship as a tech company is to proactively detect and report.
The questions that we're asking right now, though, are, for instance, you just brought up Facebook. Facebook is currently the largest reporter of child sexual abuse material of anyone, but that's because they look for it; they have put some of the best detection mechanisms in place. But they have also committed to encrypting Messenger, and when they do that, the line of sight on the bulk of child sexual abuse material that's reported today will go away. We won't see it, and those kids will not be identified.
Which brings squarely into focus the dialogue that Apple has faced in the last couple of months. They announced that they were going to implement a solution to try and detect images being sent through the phones that they produce, and they have since announced that they're going to delay any kind of rollout of that, because they got a lot of pushback from the privacy community. It shows that there are a lot of challenges and competing interests here, all of them, in this case, arguably for the benefit of consumers. So these aren't easy questions. I know you've probably been called into quite a few forums to talk about the Apple situation; we'd love to hear your take on it.
Yeah. One of the reasons I enjoy speaking with you is because you do understand the nuance of what we're talking about, and we try to as well as an organization. Privacy is incredibly important; it is important to everyone. We have also, you can say fortunately or unfortunately, seen child sexual abuse online; some of us have actually seen the images in an investigation, and most people have not. So we will also say that we believe the privacy rights of that child, the one who was raped on video that was distributed for the rest of the world to enjoy, that child actually has privacy rights as well. We have to have a pretty nuanced conversation about how we are going to value both the right for us to have privacy online and the right of that child whose worst days of their life were documented and now spread across the Internet.
That was the line that Apple was trying to hold, and these are decisions that companies have to make all the time: what type of platform, what principles are they going to run their company by. Apple has held more of the traditional privacy definition for quite a while, and then they introduced a few solutions, and I just want to clarify what solutions they introduced. They had one solution in iMessage that is not about child sexual abuse material detection; it was a kind of nudity detection. It was limited to an opt-in for a family plan, where a parent could turn it on for a child under 13, so that if the child receives an image that is believed to have some form of nudity, not necessarily child sexual abuse material, the child can take a minute to decide whether they want to see it or not; and if they do decide to receive the image, or to send one, the parent would be notified. That is trying to address the issue of grooming, where a perpetrator may be grooming someone to send naked selfies or something to that effect.
The detection of child sexual abuse material was happening specifically where Apple made a call: they said, we do not want to be a company that hosts child sexual abuse material in our iCloud photo service. It's actually already in their terms of service, probably from day one, that it's illegal for you to do that and that they have every right to search for it and close your account if you do. But they hadn't actually implemented any system to adhere to that policy. So the technology they put in place would do an operation on the device, but then not actually do anything with that information until you try to move your content onto the platform that Apple owns, which is iPhoto cloud or iCloud Photos, I get those wrong sometimes. If you then try to go store illegal child sexual abuse material on their platform, they will detect it. That was what they were trying to do.
And there was the whole argument that that's a step too far: you're not looking at my images right now, and I don't want you looking at them ever. We would argue that no, if a company is giving you a service, they can preserve your privacy to a certain degree; this is one of the most privacy-forward detection mechanisms ever introduced.
Sorry, that was my reaction too when I first heard it. I thought, wow, Apple is going way further than all the other companies that have implemented detection so far, because I think they intentionally wanted the analysis to be done on the device rather than in their cloud, so that it would be more privacy-preserving for the vast majority of cases where nothing is going to be detected before an upload.
Right. And the process ensures that the only thing that will be detected is actually known child sexual abuse material. It's not making a prediction on your imagery. There is a list of known child sexual abuse material where the children have been identified, it is definitely illegal, most of these kids are already in recovery, and we as a society have said that we don't believe the documentation of their rape and assault should spread all over the Internet. Companies have access to this list; you implemented this at Facebook. All Apple is saying is, we're going to make sure that any image you're trying to upload doesn't match that list. It's quite precise.
But it was a conscious decision, right? This is a step by one of the largest companies out there to say, we're going to value the safety of children and the privacy of child victims as much as we value the privacy of all of our users. There will be a little bit of trade-off in there; it's not black and white, this is definitely a gray area. But it's worth the conversation to figure out how we hold true the privacy of these children whose imagery and abuse circulates forever, for the rest of their lives.
Right, yeah. And for the benefit of people listening who aren't as familiar with the technology the way we are: for companies that try to do this screening, the way it works is you get a list of hashes, which is basically a series of numbers. Then, if you run the right algorithm against all of the photos, you create a hash of each photo, and then it's just a matching. If the hash of a known exploitation image is detected, it can be blocked or reported. The technology is not actually looking at all your photos doing facial recognition or anything like that; it's actually a lot more primitive, if you will.
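For readers who want to see what that matching looks like in code, here is a minimal sketch in Python using the open-source Pillow and imagehash libraries. It is only an illustration of the general hash-and-match idea described above, not Apple's, Facebook's, or Thorn's actual implementation: real deployments use robust hashes such as PhotoDNA and vetted hash lists distributed by organizations like NCMEC, and the hash values, distance threshold, and file name below are made-up assumptions for the example.

```python
# Minimal sketch of hash-list matching: fingerprint an uploaded image and check
# it against a list of hashes of known, verified abuse imagery. Illustrative only.
from PIL import Image   # pip install pillow
import imagehash        # pip install imagehash

# Hypothetical hash list; real lists come from vetted sources such as NCMEC.
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1c4b0a89e3f5720"),
    imagehash.hex_to_hash("ffe0c1829a4d3b16"),
}

MAX_DISTANCE = 4  # assumed Hamming-distance tolerance for near-duplicate images


def matches_known_material(path: str) -> bool:
    """Hash the image and compare its fingerprint against the known-hash list.

    Nothing semantic is "seen": the check only asks whether this image's
    fingerprint matches a fingerprint already known to be illegal material.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)


# Example: block or report the upload on a match, otherwise accept it.
if matches_known_material("upload.jpg"):
    print("match against known material: block and report")
else:
    print("no match: accept upload")
```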
Right. And what Apple introduced is quite basic: it was image hashing against known child sexual abuse material. But it's a great first step. We're going to have to figure out, as a society, if and then how we will detect child sexual abuse material as more and more platforms go to private environments, and this was a good first step toward seeing how we can do that. But when we look at open platforms today, most of them don't do even the basics. That's why we introduced one of our biggest products right now, Safer, which makes it turnkey for these companies: you can hash and match images and video, and we've also introduced a classifier to detect possibly never-before-seen child sexual abuse material. That is more easily deployed in open environments where there is not an assumption of this encrypted privacy, and there's still a ton of work to be done there. Most companies haven't deployed that type of detection yet, for sure.
Yeah, and we've also tried to put out a little product that our customers can use at Cloudflare to do something similar, or basically exactly the same thing.
And from a public dialogue standpoint, how do we get a better debate about these issues? Because it doesn't seem like most people understand the scope of the problem or how tragic it is.
One of the reasons most people don't understand is because, with other atrocities or other crimes like this, usually the victims, those who are impacted, come forward. They will share what has happened to them, whether it is people who've escaped an oppressive regime or a journalist who's been silenced; at some point they come out, and we hear the story, we hear the implications. For a child survivor of sexual abuse, the last thing they want to do is put their face up in front of Congress or on a national television show. They have unwillingly had the worst days of their life broadcast to the world. So first, we have to figure out how to draw attention to the issue without re-victimizing these kids.
The other thing is, we're only 20 years into this, and the majority of children who've been identified were under the age of 12 when they were abused; we're talking babies and toddlers. So they're actually just getting to that place: many of these kids may still be teenagers, and the earliest children, who were abused 20 years ago, are young adults. It's not like we've had generations of people abused; we have people who have been abused, whose content is still in circulation, and who want to just try to go live a normal life. So figuring out, and I don't know that we've solved it, how we bring this issue and the harm it continues to deliver to light without re-victimizing.
I also think that when innovators are building, the next generation of entrepreneurs and builders and engineers, this should be something that is built into the DNA of how we build, instead of having to kind of clean up later after we've built platforms and not thought about the worst way they could be abused.
Right. That was a great kind of wrap-up. We're almost out of time, so let me just finish by saying thank you for coming on to talk about this, thank you for being a champion for this issue, and thank you for Thorn being a tech organization that's doing something with technology. It's only through the tech companies having partners like Thorn, who can push us from a technology standpoint to think about actual solutions, that this will get better. So I'm really grateful that you and Thorn are out there championing these issues for us. For people who want to learn more, I recommend they check out your website, but also the TED talk that you gave a couple of years ago. There's a lot more people can learn on this topic. So thank you so much for your time.
Yeah, well, I want to thank you, because you were one of the first people I met 10 years ago, when I did not know what I was getting myself into, and over the years you've been a good person for me to call. You've challenged me each step of the way, which is what we need if we're going to find the right solutions in this space. And thanks for giving us the platform today.