As technology advances, so do the risks faced by children, making it essential to prioritize their protection. Online platforms provide predators with unprecedented access to potential victims, allowing them to exploit, groom, and harm children from the safety of their screens.

It is essential to continuously raise awareness, educate children and adults about online risks, and promote responsible digital citizenship. By strengthening collaboration between governments, law enforcement agencies, and technology companies to develop and enforce robust policies, laws, and regulations, we can create an environment where children can explore the digital realm without the risk of falling victim to abuse.

John Robb
Welcome, everyone, and thank you for joining us on our podcast Inside the Sweeps. This week, I’m joined by Eleanor Linsell and Alessandra Tranquilli of WeProtect Global Alliance. Now that you’re with us, can you tell us a little bit about the important work that your organization does and what your roles are in helping it achieve its goals?

Alessandra Tranquilli
Hi and thanks for inviting us to share the work that we do. So, we all know that the Internet has changed the way we communicate, but it’s also important to know that it wasn’t created with child safety in mind. There’s a lot of child sexual exploitation and abuse online, which basically means any form of sexual exploitation and abuse that is facilitated by technology and the digital world. This is becoming one of the most urgent and defining issues of our generation, especially since the COVID pandemic, which meant a lot of people spent more and more time online.

So WeProtect Global Alliance exists to address this specific problem. We want to see a digital world that is designed to protect children from sexual exploitation and abuse online. So, who are we? We are a membership organization, and we work by bringing together key stakeholders who all collaborate to solve this issue. We work with 101 governments, 65 private sector companies, including tech companies, which obviously play a big role here, 91 civil society organizations, and intergovernmental organizations such as the UN and Interpol. All of these members work together to develop policies and solutions that will better protect kids online. We work mainly around four areas: knowledge, empowerment, advocacy, and collaboration. Eleanor and I work on advocacy and knowledge respectively. Knowledge looks at gathering all the information and the latest understanding of the issues, while advocacy looks at making the case to decision makers for more money and more action. So basically, my job is to give Eleanor all the relevant facts and evidence so that she can go to meetings and boardrooms and make sure that her arguments are backed up by facts, in a nutshell.

John Robb
So your job is to collect the information. And Eleanor, what is your job?

Eleanor Linsell
I go out and talk about all the information that Alex collects. My job is to raise the issue up the agenda in terms of public understanding and political understanding. We work very closely with governments, with policymakers, with other NGOs, with the private sector, with tech companies. And so, my job is to advocate on behalf of this cause.

John Robb
And this advocacy is something that’s important now.

In the early days of the Internet, child exploitation and abuse were primarily carried out through chat rooms and message boards. However, with the rise of social media and messaging apps, predators have found new ways to contact and exploit children. Efforts to tackle these problems are ongoing; however, the issue remains a major concern. Can you give us some sense of what exactly online child exploitation and abuse is and how it has changed over time with the Internet?

Eleanor Linsell
Yeah, of course.

Child sexual abuse in all of its forms is unfortunately not a new crime. It is a serious violation of the fundamental rights of children and young people, and it can often result in long-term psychological, physical, or social harm for the children and young people who survive the crime.

Just going back to the very basics, and then I’ll build up to the technological side. Child sexual abuse is the involvement of a child, so that’s anyone under 18, in sexual activity that they cannot fully comprehend, that they’re unable to give consent to, or that they’re not developmentally prepared enough to give consent to.

And so that’s the definition of child sexual abuse for us. Exploitation is a form of abuse that involves any actual or attempted abuse of a position of vulnerability, a differential of power, or trust. So that’s how we see the crime itself. Building on that, when it comes to online forms of abuse, there are lots of different categories, and it’s also constantly evolving, but the main ones are producing, viewing, and sharing child sexual abuse material.

We also have online grooming, which is a very big problem here because social media platforms provide those avenues of connection between adults and children. Then you also have the live streaming of child sexual exploitation and abuse, which is incredibly difficult to tackle as it happens in real time. And then one of the other things that Alex can potentially talk about later is so-called self-generated sexual material, which can either be coerced by people who are grooming children and young people, or it can be produced in a consensual, age-appropriate relationship, and it’s a lot more complex to tackle. So, those are the main buckets and categories when it comes to online child sexual abuse.

John Robb
It is definitely a complicated problem to solve, and I’m sure we’ll get to some of those complications. One of the things that I want to talk about today is that this is a global issue, and it seems to have been on the rise. As was mentioned earlier, the COVID pandemic played a significant role: with children spending more time online, predators have greater opportunities to groom and exploit. According to a report by the UK’s Internet Watch Foundation, there was a 77% increase in the amount of child sexual abuse material detected globally in 2020 compared to the previous year. This is a staggering number. Are you seeing similar data at WeProtect Global Alliance?

Alessandra Tranquilli
Oh, absolutely. In fact, the Internet Watch Foundation recently published their annual report for 2022, and in there they say that they assessed a web page every one and a half minutes, and every two minutes that web page showed a child being sexually abused. There are some really scary stats there. The National Center for Missing and Exploited Children also reported that in 2022, 99.5% of the reports received by their CyberTipline concerned incidents of suspected child sexual abuse material. That’s 99.5%, almost all of them. They also report that they escalated 49,000 reports to police involving children considered to be in imminent danger. So yes, I think all the data that we have at the moment really highlights that the scale of this problem is huge and that it’s increasing year by year.

We are also seeing this at the alliance. Every other year we develop a Global Threat Assessment that analyzes the trends around these issues, and what we saw in 2021 was a diversification of production methods: not only the self-generated sexual material that we’ll talk about later, but also an increase in commercial drivers for abuse.

For most individuals committing child sexual abuse, sexual gratification is the primary motivation. That said, there’s more and more evidence emerging of monetization of the content and, perhaps most worryingly, an increase in self-generated sexual images that are exchanged for payment, which could be linked to the rise of subscriber platforms. So yes, it’s a bit of a bleak picture, but something that we’re working to address.

John Robb
So we see a rise in the evidence suggesting that there’s both more material being produced and that, instead of just the gratification piece, it’s moving into a commercial context. And maybe some of these platforms are unaware that they’re enabling this? Or are aware that they’re enabling this? As we see this increase, what is the alliance doing to identify and respond to the problem?

Alessandra Tranquilli
So we work in different buckets. One of the things that we do is develop tools and frameworks that guide the response. One framework that I would like to mention here is the Model National Response, which we developed with experts and government professionals. That framework basically shows that the problem cannot be addressed in isolation; a government that wants to tackle this needs to have a range of capabilities in place. For example, you not only need to work on your policies and legislation, but you also need a strong criminal justice system that has access to the right databases and has management processes. Governments need better support for victims and survivors, through helplines for example. They need to work on shifting harmful social norms in society, through education programs for instance, and to work more on prevention; obviously response matters as well, but a lot more needs to be done around prevention. We need to listen more to the voices of survivors and involve those with lived experience in developing the solutions, as well as work with the private sector and tech companies, because they obviously have a very critical role to play here. And obviously, we need to devote more money to research. The framework really details all these different capabilities that governments need in order to tackle this issue sustainably. I’m not going to go into too much detail now, but it’s on our website, so I encourage listeners to go and check it out if you want a more granular understanding of it. I’m going to hand over to Eleanor to talk about the other things that we do.

Eleanor Linsell
Thank you, Alessandra. Yes, in addition to the Model National Response, which I think is our flagship step-by-step guide to tackling this in every area, as Alessandra said, we also have a Global Strategic Response. This is a borderless crime. The Internet is borderless, and borderless crimes require borderless solutions. So, we also have a Global Framework that people can look at to work out how to tackle this at the international level.

It builds on the same buckets Alessandra was talking about in the Model National Response tool: you’ve got the criminal justice element, policy and legislation, victims and survivors, shifting societal norms (which is obviously that big, complex meta-level problem we need to fix), private sector involvement, industry, and research. The other thing worth mentioning, when it comes to what we’re doing and what we’re bringing to this, is that I always like to think the superpower of the alliance is our ability to get everyone around the table: everyone who’s got a role to play in tackling child sexual abuse, whether it’s tech companies, safety tech, governments, law enforcement, or child safety organizations. By convening all these different stakeholders, facilitating connections, providing a place for constructive conversations, sharing what’s technically possible and what’s legally possible, and breaking down very complex thematic discussions, we not only better understand the state of play and what’s happening with this threat, but also where we should be putting our collective resources and what’s actually being done to make progress. And I think that’s actually quite motivating, because you can often feel quite stuck with something that’s so big socially. It’s a very motivating place to be when you’re sharing best practice. It’s not all doom and gloom; there are so many possibilities and solutions out there, and it’s about elevating the ones that are really good and effective and driving a big global response. By sharing all of that information, I think we can get there. So that’s one of our roles: convening and amplifying what’s happening.

John Robb
Would it be fair to say that part of what the alliance does is act as a lens to focus the global attention on the need to deal with this issue?

Eleanor Linsell
I’d definitely say that we do. It’s about prioritizing: online child safety is huge, and this is a very specific issue within that. I think we’re really an organizing, convening force that can connect all the different players who don’t necessarily always see eye to eye and don’t always get around a table. That’s where we bring everyone together and really focus on one specific element of child safety online, and I think that’s really our niche within all of this. Obviously we’re very supportive of the greater debate around child safety online, but we focus on this specific issue because it requires a very nuanced, targeted, and specific response.

John Robb
You also mentioned something that is very much aligned with where we see things, having been in the internet safety space for 23 years. We see it as a global issue. This isn’t specific to one country or another; it’s a global issue, but not all countries view it the same way. And we’re seeing some governments across the globe, including France, the United Kingdom, Canada, and others, in various stages of implementing legislation that will require age verification for accessing online pornography.

With child sexual exploitation and abuse online being a growing problem, what are your thoughts on this kind of legislation? And do you think this is something that should be implemented in order to protect children and, ideally, make the Internet a safer space?

Eleanor Linsell
Yeah, it’s such an interesting one. I think that’s probably why I work in policy, to be honest. We find ourselves in a very unique moment right now. Tech regulation has been on the up; it’s had this big moment, a big explosion of legislation throughout the last few years, as legislators have tried to keep up with all the developments happening in the technological space. It’s actually a very unique moment of opportunity right now in the policy world, because legislation can be a real catalyst for change, especially when it’s done well, in a robust and thoughtful manner.

In recent years, we’ve seen Online Safety Acts going through in Australia and Singapore. Here in Brussels we have the Digital Services Act, and we also have a new proposal to explicitly tackle child sexual abuse and exploitation online that’s going through the Council and the Parliament at the moment. In the UK, as you said, there’s the Online Safety Bill, which is going through the House of Lords and slowly getting there, as with a lot of these regulations. And even in the US, we’re seeing President Biden put a lot of emphasis on child safety online in the State of the Union address, and there’s a lot of focus on the EARN IT Act, kids’ online safety, and the debate around Section 230. So right now, there’s a lot of noise. When it comes to age assurance specifically, we understand that this is an incredibly complex debate. We tend to talk about age assurance because age verification is just one element of age assurance; there are lots of different ways of doing it.

There’s a very strong argument that just as children can’t walk into a nightclub or buy a liter of vodka, things you need your driving license or ID card to do, the same protections could be implemented online. Age assurance, as I said, is just one of the tools that is fundamental in creating digital products that are safe by design, and safety by design is a really big pillar in the response to child sexual abuse online.

You can also understand why. I don’t know if you read it, but we just heard this week from the Children’s Commissioner for England, who released a report on the impact of pornography on young people. In the UK they’re seeing lots of violent porn acts being carried out among under-18s, with children as young as eight affected. So, I think there is space for this debate to be had. And Alessandra has done a lot of work on age assurance technologies and can probably tell you a bit more about how those technologies can be used to verify and assure age on platforms.

John Robb
Before we come to Alessandra on that, because I think that is an important consideration, you mentioned a phrase that I think we need to get into the vernacular, which is safety by design. So, when we’re building things, we build them with safety in mind. How would we do that? Not an easy answer; this is technology, and there’s a lot at play, certainly no easy answer. But I think the notion of safety by design is an area of focus, especially for technology companies like us, where we have the ability to build some type of safety into the things that we’re doing or enabling. So, Alessandra, given your experience with age assurance, what would you like to contribute here?

Alessandra Tranquilli
Well, I think one of the controversial things about age assurance is that a lot of people say, well, what about privacy, and so on. And there are different ways of doing it. For example, in an intelligence brief we published recently, we talked about AI-based facial age estimation, which for me is a good example. The AI is trained on lots of different faces, each with a month and year of birth, from all over the world.

So, when it sees a new face, it does a pixel-level analysis and issues an estimated age. To the technology, the image is simply a pattern of pixels, and the pixels are numbers, so it doesn’t recognize anyone. It hasn’t been trained on named photos; it just learns that this pattern looks like a 16-year-old and that pattern looks like a 70-year-old. This makes for a privacy-friendly approach, as it doesn’t require any personal details or ID documents. All the images are instantly deleted once someone receives their estimated age, and nothing is ever viewed by a human. So you can use technology to support a safer Internet while maintaining privacy. And once you have facial age estimation, once you know you’re dealing with a child, you can adopt different strategies to make sure that children have a safe experience: for example, not allowing geolocation tracking, turning off age-inappropriate advertising or profiling, turning off late-night notifications, or disallowing children from being contacted by over-18s. Once you have that information, age verification and age estimation can be used to protect children, and I think these are tools that need to be used.
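For readers who want to picture how such a pipeline fits together, here is a minimal sketch in Python. It assumes PyTorch and torchvision are available and that a pretrained regression network is supplied; the model, the 224×224 input size, the under-18 threshold, and the account safety hooks are all illustrative assumptions, not any vendor’s actual implementation.

```python
# A minimal sketch of AI-based facial age estimation, assuming PyTorch and
# torchvision. The model weights, 224x224 input size, and the safety hooks
# below are hypothetical illustrations, not a real product API.
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # assumed input size for the network
    transforms.ToTensor(),           # the image becomes a tensor of numbers
])

def estimate_age(image_path: str, model: torch.nn.Module) -> float:
    """Return an age estimate for one face photo, keeping no copy of it."""
    image = Image.open(image_path).convert("RGB")
    pixels = preprocess(image).unsqueeze(0)  # to the model, just pixel values
    with torch.no_grad():
        age = model(pixels).item()           # single scalar: estimated age
    del image, pixels                        # discard the image immediately
    return age

class ChildSafetyDefaults:
    """Hypothetical per-account settings, gated on the age estimate alone."""

    def apply(self, account) -> None:
        account.geolocation_tracking = False      # no location tracking
        account.profiling_ads = False             # no profiling or targeted ads
        account.late_night_notifications = False  # quiet hours enforced
        account.contact_from_adults = False       # block contact by over-18s

# Usage sketch: tighten defaults when the estimate falls below the threshold.
# if estimate_age("selfie.jpg", age_model) < 18:
#     ChildSafetyDefaults().apply(user_account)
```

The key privacy property mirrors what Alessandra describes: only a number comes out, nothing identifying is retained, and the image is dropped as soon as the estimate exists.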

John Robb
We need to find some way to raise the bar, so to speak, on access to this kind of material. The science, I think, is pretty clear that there are negative effects of exposure to this material before people are mature enough to understand what it is they’re looking at, and it can pose a challenge down the road. There are no easy answers to any of this, and I don’t want to leave any room to think that we have all the answers; I don’t think anybody has all the answers. It’s a complicated problem, and we’re just looking for the ways that we can all contribute and make it better. And when I say that we’re contributing: we’ve been blocking CSAM for a long, long time. We know that it’s a serious problem that affects many kids.

There are long-lasting consequences for the victims. Listening to those victims, understanding the problem, and shining a light on it are, I think, very important. That’s the work that your alliance is helping with.

We also see the CSAM material itself becoming part of other forms of exploitation: grooming, trafficking. We do our best to block CSAM, find it, and report it to organizations including the Internet Watch Foundation and the Canadian Centre for Child Protection, and we work with organizations like yours, the WeProtect Global Alliance.

What can you do to help raise awareness? How do you think partnerships with NGOs and other stakeholders help raise that awareness and implement best practices when combating CSAM?

Eleanor Linsell
The bedrock of our alliance is that we’re multisectoral. Our alliance is comprised of governments, tech companies, NGOs, and children’s rights groups. Tackling child sexual exploitation and abuse is incredibly complex, as you say; I think you hit the nail on the head that no one has a silver bullet, as it were. Each of the actors within our alliance holds a key to the puzzle. We can’t solve this without groundbreaking technology, we can’t solve this without regulation, we can’t solve this without well-resourced law enforcement, we can’t solve this without specialists, and we definitely can’t solve this without the voices of survivors. When it comes to NGOs and civil society specifically: governments and technology companies can’t tackle this alone, and the attention won’t be there without society-wide support. And I think we’re starting to see that. You know, this is a topic that people don’t like talking about.

It’s very hard to get it up the agenda, and civil society plays a vital role in giving space to the topic and amplifying it; when you talk to people about it, they care, but they don’t want to talk about it. Some of these NGOs are providing vital support to victims and survivors, some are advocating for better legislation, and others are creating tools or processes that can help track illegal and abusive material. The experience, knowledge, and insights that NGOs and civil society provide are of paramount importance in raising awareness and informing the discussion, because when it’s a topic no one likes to talk about, creating and adding information, value, research, and insights is absolutely vital. Amplifying the experiences, opinions, and feelings of young people and survivors is also down to NGOs and civil society, and I think they’ve done an amazing job of making sure that this topic has moved up the political agenda, the social agenda, and the public agenda over the last 20 years. Well, there were no podcasts 20 years ago, but this is not something we’d have been discussing this openly 20 years ago, and I think a lot of that is down to the amazing amplification efforts of civil society in this space.

John Robb
I think that, for me personally, the fact that people don’t want to talk about it is one of the challenges, because how do you get people to recognize the scale and scope of the problem if nobody wants to talk about it? One of the things that we did was include the WeProtect Global Alliance Transparency Report in the default reports in our product, so that our telco customers have easy access to a transparency report they can run on our system, even if they just run it internally, so they can see what’s happening in their network, because we believe that raising awareness of the issue will lead to better solutions. If nobody’s talking about it and nobody sees that it’s a problem, how do you get them to work on it? But you can point out: you don’t think it’s an issue here? Let me show you the stats. And Alessandra, I’m sure you see lots of stats and those kinds of things. So, as this awareness goes up and we work together to develop better strategies and tools, detection and response definitely require a collaborative approach, as you say, involving law enforcement, policymakers, technology companies, NGO stakeholders, the whole gamut. What are some of the signs that the alliance is making a difference? How do you measure that?

Alessandra Tranquilli
Well, first of all, the impact of an intervention takes a long time to measure, and if you want to do it in a more scientific way, there are no control groups, as you know, because the technology is everywhere. So for me personally, what motivates me to go to work every day, and where I see the impact of my work, is when I see the Model National Response being used by governments, for example. Just a couple of months ago we did a webinar with the Kenyan government, who explained to other African countries how they’ve been using the Model National Response to build their own national plan of action to tackle online child sexual exploitation and abuse. We’ve also seen the regional plan of action for the Southeast Asian region draw on the Model National Response. So for me, it’s really when I see the tools that I’m building, that we are building together as an alliance, being used by governments; that shows we’re moving in the right direction. The same goes for when I see experts drawing their data from our Global Threat Assessment: it means the data we’re producing is being used and is reaching the right ears. Again, for me this is a case in point that what we’re doing is having the desired impact. It’s a new area, so it will take some time before we can measure the impact scientifically over the years. But the alliance is also building an impact framework at the moment, because we really do recognize that we need to be able to show impact to get more money, right? That’s also one of the reasons we need more funding: to make sure that the work on this continues, that light is shed on it, and that people don’t shy away from talking about it, basically.

John Robb
Yeah, it’s definitely a tough area. There’s work that has to be done, and it’s sometimes hard to measure, especially over the long term. But as you say, when you see things being implemented, your Model National Response being used, the people you’re talking to and the governments using this information, the stats being used in other areas, you see that it’s raising awareness and making it harder for people to ignore the problem.

Which I think is maybe part of what we’re trying to achieve: make it hard for people to ignore this problem, because if you can’t ignore it anymore, then we’re going to get something done about it. The old adage is that the squeaky wheel gets the oil; maybe that’s not quite appropriate here, but the idea is to shake the tree and let people know this is a real, ongoing issue. Because it is kind of hidden, right? It’s something that’s done in places you don’t see, that are hard to identify, and especially when it’s done online, it’s really hard to see. So, I think this is something we need to shed light on, and we’re happy to participate in that.

Alessandra Tranquilli
One of the things I also think it’s very important to shed light on is the set of issues around self-generated material. It is a very complex issue; there are different types, and even the term itself is hotly debated at the moment. Can we really talk about “self-generated” material when some of it is coerced? There are different categories, and that’s one of the complexities. Some of it is non-sexual material that is self-generated but then misappropriated. There is voluntary self-generated material that is exchanged between adolescent peers in a consensual manner, where the harm comes when an image is reshared against another person’s wishes. And then there is coerced self-generated material, where people are groomed and the material is circulated definitely against their wishes. The complexity here is that you cannot put them all in the same bucket and treat them equally, because they’re very different things. For example, you don’t want to criminalize teenagers for exploring their sexuality and sending sexual images of each other when it’s consensual, but you do want these teenagers to be informed of the risks associated with this behavior. So, just to flag that we’re soon going to publish a report based on interviews with more than 200 kids aged 13 to 17 from Ghana, Thailand, and Ireland, asking them what they understand self-generated material to be and what they think should be done to improve the response.

I think it’s very important to listen to what kids and young people want to see happening around this and to involve them. So, just flagging that this is coming up, and if listeners are interested in reading more about it, I encourage you to keep an eye out for it.

John Robb
We’re definitely going to come back to how people can help and where resources are available to them. But speaking directly about teenagers: I have two daughters, and they’ve passed their teenage years now. It’s a minefield for teenagers to navigate all of those issues, moving from childhood to adulthood; that in-between stage is tough, and finding a way through is a challenge. I don’t have any answers on that at all. As a parent, I know that it’s not easy on the outside looking in, and it’s equally hard or more so for the teenager looking out, so finding a way to deal with this is tough.

Sometimes it just boils down to nuance, right? There’s no easy answer. We have to look at all of the answers and all of the levers we can influence and see where we net out. So, before we get to closing remarks, how do people find the alliance and what can they do to help?

Eleanor Linsell
Well, we have a wonderful website, so if you just type WeProtect Global Alliance into your preferred search engine, we’re very easy to find; WeProtect is all one word, so we’re right at the top there. And to get involved: if you’re listening to this podcast and you’re a tech company, or a startup, or you’re working in the child protection space, we’re always expanding the alliance and we’ve got an open-door policy for new members. So, you can find us there.

John Robb
Excellent. And what would you like to leave with the people listening? What thought would you like to leave them with as we conclude here today?

Alessandra Tranquilli
It’s important to recognize that technology is here to stay, and there are a lot of good things happening thanks to technology, so resisting it or seeing it as an evil thing is really counterproductive. We need to make sure that we and our children make informed decisions and use technology intelligently. To do that, I would like to leave by saying that we really need to push governments and organizations to give more funding, both financial resources and human resources, for research and data, so that the interventions that are done are based on evidence. And obviously, technology moves extremely fast, so you really need to keep on top of new trends; a lot of effort needs to be allocated there, I think.

Eleanor Linsell
Yeah, I would second all of that. I think I’ve said this about 500 times in the last week, but it isn’t an easy topic, and everyone who’s made it to the end of this podcast and not given up because of all the data and the nature of the topic is part of the solution. In the real world, people often don’t want to fathom this, so I think talking about it in forums like this is a great start. John, you mentioned that it’s hard to think about when it comes to your own family, and I’d really encourage everyone to have open conversations on this topic. It’s hard, but creating a space of trust and non-judgment is really important in all of this; understanding that it’s out there, but that there’s a supportive environment, is a huge part of the solution. Having open conversations about this, and broader conversations, is always really welcome. On that note, I’d actually really like to thank Netsweeper for opening this subject up to your audience. It’s great to see that you’re engaged in this, and we really appreciate that you’ve given so much time to elevating this issue up your podcast agenda. The solution lies in open collaboration across sectors.

We can’t do this without all the pieces of the puzzle, without all the bits of the machine. I’m actually quite optimistic that we’re getting to a place of organization, that we’re moving forward, and that something can be done about this. We just really need to focus our energy and our resources on getting it done.

John Robb
Well, thank you both very much for attending; we really appreciate it. It is an important issue, and we do want to amplify it and bring it to the fore so that we can have those open conversations with our own families, with our own communities, and with our own governments, and, as global citizens, raise awareness of the importance of this issue. So, on behalf of Netsweeper, thank you very much for taking the time. We really appreciate it, and we will include links in the description of this podcast to the WeProtect Global Alliance website so you’ll be able to find it nice and easily, and be able to contribute if that’s something you can do. So, thanks once again; we really appreciate it.

Eleanor Linsell
Thank you.