Many students struggle to manage their feelings of stress, anxiety, and depression. These mental health challenges can affect students’ behaviour and academic outcomes. Without proper monitoring and intervention, students can be left vulnerable to cyberbullying, violence, self-harm, and even suicide.

Student safeguarding is essential to their development and learning. Netsweeper’s onGuard solution provides tools to identify and quickly intervene when students are at risk.

On our most recent episode of Inside the Sweeps, Nick Finch from Co-op Academies Trust discusses the challenges their trust was facing and how Netsweeper’s onGuard safeguarding solution has helped them overcome them.

[John Robb]
Hi and thank you for joining us on our Inside the Sweeps podcast. My name is John Robb, I’m the vice president of marketing at Netsweeper, and joining us today is Nick Finch of Co-op Academies Trust. Nick, can you give us a little background on yourself and tell us about your role as regional lead with responsibility for safeguarding? 

[Nick Finch]
Yeah, sure, hi, and thanks for having me on. Great to be here. So yeah, I’ve got 18 years’ experience working in the education sector as a teacher and safeguarding lead, and I’m currently one of the regional leads for Co-op Academies Trust. I work alongside two other regionally based colleagues, and we support all of our DSLs and head teachers with all safeguarding matters across the trust. We’re currently 30 academies, but we’re growing, and we offer support and advice to our academies and DSLs on how to, quite simply, keep the pupils in our care safe. 

[John Robb]
Well, I can imagine that if you’ve been involved in this for over 18 years, you have seen a change and a recognition that managing or helping with mental health is an important consideration, especially as technology shifts and more of pupils’ day-to-day happens in a virtual or computer environment. So what kind of problems has Co-op Academies Trust identified and been trying to solve using a safeguarding solution like onGuard? 

[Nick Finch]
Sure. I mean, the COVID lockdowns accelerated everything, but learning from home and accessing learning materials for homework and assignments has been an upward trend in schools for a significant length of time. Remote learning is fantastic, but it also presents significant safeguarding challenges. It’s things like: how do we know who’s accessing a device, a school device particularly? How do we know they’re accessing appropriate material? And you touched on mental health there, and that’s something that most schools, teachers, and parents worry about. 

Especially with social media, which can spread worry and negative views probably more quickly than it spreads positive ones and positive experiences. So as a trust we needed a solution that could provide us with a really fast and efficient way of monitoring our pupil devices. It couldn’t be intrusive to the pupil experience, because we don’t want to slow them down in their learning, but it would alert us to potential safeguarding concerns so that our trained DSLs and support staff can offer support and protection in cases where it’s needed. 

We needed a system that was 24/7, 365. We needed a system that could be configured to our needs and the workflows we have internally. We needed a system that could weed out false positives, so that we end up with just a kernel of verified or potential safeguarding concerns, and our DSLs can review those without spending unnecessary time sorting real problems from false positives. 

And lastly, we needed a solution that gave us human monitoring and human alerting as well, because AI is great, it’s amazing, and it reduces our workloads significantly. But we don’t want to rely on it completely. We want human intervention to make those critical calls, especially when it comes to potentially immediate safeguarding concerns. 

[John Robb]
So we’re going to come back to AI and human review a little bit later, because I think it’s an important consideration as you’re looking to scale. You’re currently at 30 schools, and as you scale up, the problem grows exponentially as you add more schools, so we’ll come to that a little bit later. 

During your evaluation, you established a set of criteria, the things you needed it to do: human review, ease of use, a dashboard. And as a result, you ended up selecting Netsweeper. 

Did you look at other solutions and just find that there were some gaps there, or was it the fact that there might be some customization required? What were the driving factors that directed you towards Netsweeper versus another solution? 

[Nick Finch]
Yeah, sure. I mean, you’re right. It was easy to customize, or at least to get the system to fit our processes as closely as possible, so it wasn’t purely something off the shelf. From our point of view, we were looking at it from our DSLs’ perspective, because we wanted their workflows to be as simple as possible: we want to facilitate them protecting our children rather than put any barriers in the way. So we had to have a system that was effective, and we wanted the experience for DSLs to be intuitive and easy to navigate. That was something that, from day one, it was really clear onGuard could do. The dashboard is really simple to understand, but at the same time, through just a few clicks of a button, there’s a whole wealth of information you can drill down into. Even logging in is really simple: we’re a Google environment, so our DSLs can log in with one click of a button. 

And also, our pupils use a range of devices, whether that’s workstations, tablets, Chromebooks, or laptops, all working across different operating systems and applications. So we needed a complete solution that could go across multiple platforms and was easy to use, and onGuard gave us that. 

[John Robb]
I know that in a previous conversation with one of your colleagues, Jim Fessey, he talked about that partnership and how important it was to the ongoing improvement of your solution, because even from the very beginning, your idea was: how do we make this work in our environment? You take an off-the-shelf product and then you have to make it work in your environment based on your unique needs, and because we were a partner on this project, we were able to help guide some of those integrations to make them more meaningful in your implementation. Now, something you just said there that I thought was interesting: in the beginning, when you would get an alert, there was perhaps a notion of, is it really an alert, and do we need to go and deal with it? That’s sort of a function of an AI-based alert, where you’re not always sure it’s genuine. When you add in the human review, you’re at a point where you’re saying, “if we get it, we need to act on it.” That’s what I’m hearing as I listen. Do you want to expand on that a little and on how you’re perceiving it? 

[Nick Finch]
Yeah, definitely. I think working with the human review team has been crucial, both in terms of reducing those false positives, using the intelligence that we have and that they have to understand what’s a real concern and what’s not. But we’ve moved on a stage from that now. The essential ingredient for us, and I may mention this again a bit later on, is having the human review team there, because we know that anything that’s potentially an immediate safeguarding concern is picked up straight away. Our DSLs and support staff are really busy, as we all are in schools. DSLs are rarely at their desks because they tend to be dealing with incidents and supporting children and families. So it is possible that an immediate safeguarding concern that comes through as an alert isn’t picked up straight away if they’re not at their workstation. 

But with the human review team now, that’s something that can’t happen. You’ve got our cell phone numbers, the DSL contact numbers, emails; you’ve even got my details and my regional colleagues’ details. And you’ll call and call and keep calling until you get a response. So it’s moved on from training the AI, so to speak, to this invaluable service whereby, if we need to know something and it’s a real red flag, we are going to get that message within minutes. 

Certainly within an hour or so, and we can take action. So it’s literally helping to protect vulnerable children. I can think of a couple of cases that I’m personally aware of, and I know it’ll be replicated in other regions as well, where we’ve had these sorts of alerts and the human review team has given us a call about them. And actually, these individuals weren’t particularly on our radar and weren’t known to support services. So we’ve got that demonstrable evidence of safeguarding in action, and even more exciting, it’s been at a preventive stage: we’ve stopped something from happening as opposed to having to react to it after it has taken place. 

[John Robb]
So there are a couple of things there that we might want to talk about. One, you’re seeing the human review not just as the review part, but as an extension of your safeguarding team, giving you that added layer of coverage. If somebody isn’t at their machine, they’re going to get a phone call if it’s a really high priority, and that’s helping you address those situations faster. 

And you mentioned early detection and early prevention as part of that solution. I would imagine that if you can get to pupils sooner, it’s before the crisis happens, which is much harder to deal with. You want to know when it’s a bullying concern, say, and then you can address it: OK, hang on, maybe we need to find different language, or however you manage that. But that’s what I’m hearing you say about how you’re incorporating this into your day-to-day. Do I have that about right? 

[Nick Finch]
Yeah. I think it’s about having that service, as you say, having that relationship with the human review team. Not relying on them as such, but knowing that if that call comes through, we need to act on it, and it’s invaluable to know it’s there and that it works. And in conjunction with that, having the different categories and priorities, 1, 2, 3, means DSLs can plan their day and their week around responding to those alerts. Obviously we’d like to respond to every single alert immediately, but that’s not practical in reality. So what DSLs can do is respond to the most immediate concerns with urgency and then schedule their day or their week to go through and action or delegate the other tasks. 

[John Robb]
Prioritization is obviously key. Some alerts are more timely, maybe not more important, but more timely: there’s more or less time to take action on a given alert based on its priority level. And with this early detection and early prevention, part of your process is to engage with the relevant people around the pupil, whether it’s teachers, staff, or their parents. 

Have you engaged with parents and had some feedback from them? What has that response been like? 

[Nick Finch]
Yeah, absolutely. So, I can think of a couple of examples, which I touched on before. They were immediate safeguarding concerns related to poor mental health, potential suicide, or suicidal ideation. This was with regards to a couple of pupils who hadn’t come to the attention of any of our staff previously, but were going through a tough time and were therefore looking for information on the Internet, for support services, or making the searches which in some way alerted us to the fact that there could be a concern there. So that was followed up in the correct way. 

And indeed, there was an underlying concern and reasons for it. So, to answer your question, working with parents, or notifying parents, was really crucial for us, because clearly we spoke to the child first, but then the parents need to know. So, as soon as possible, parents were alerted, and their reaction is why we do this job. They wouldn’t have known either. They were so appreciative of the support and the information they received. 

The two academies involved worked with those children and families to put support in place. So that’s been a happy ending for both, and I don’t think you can really put a better perspective on things than that. The parents have just been over the moon, as have we, that the system worked. 

[John Robb]
That’s why we do this, right? We’re doing this to make sure that everybody gets the education they need and the help that they need. And here’s a concrete example where there were some issues: they were clearly identified, the proper people were engaged, and we went from a potential crisis to the pupils receiving the help they needed, with the parents included as part of that process. And you touched on something there very briefly that I think is important to remember. 

So much of the day-to-day for pupils now is online, and online is essentially quiet. You don’t hear it. When kids are out on the playground and they’re bullying each other, you can see it, you can hear it, it’s easy to identify, and you can do something about it. When those behaviors happen online, it’s really hard to identify them. And you said the parents didn’t know. Not because they weren’t being good parents; that stuff happens in a place they don’t see. 

[Nick Finch]
Absolutely. Yeah, and for a reason, because the children themselves didn’t want the parents to know. They weren’t in a good place and weren’t necessarily thinking in the best way to keep themselves safe. And that’s no criticism of those pupils or the parents either, as you say. But these things, online behaviour, can go unnoticed, can’t they? And that’s sometimes the attraction of it. I think the other thing that’s really interesting to point out is that online activity can be monitored and tracked, particularly using something like the onGuard dashboard. It brings out themes for us that we would probably have known about anyway, but now we can actually put numbers to them. So, I’m thinking of a recent trend, probably a global trend, but certainly something that’s been trending in the UK recently, around misogyny and the various misogynistic websites and influencers who may be around. And that is something onGuard is picking up when pupils have perhaps signed up to an email newsletter we wouldn’t necessarily have known about otherwise. 

Or they’re searching for certain types of information related to this activity on the Internet, and very often it can be out of academic curiosity as opposed to anything untoward. But nonetheless it gives us a picture of how widespread these things can be. So, where in the past we might have known or suspected it was going on, now we can actually see how prevalent it is. And therefore the academies affected by it the most can put things in place to support their pupils, and put education programs in place to think about these things in the right way. That’s been really interesting and really useful. 

[John Robb]
Something I hear you talking about is the fact that there are sometimes online behaviors that, in isolation, may not be perceived as something you need to deal with in a broader sense. A visit, or a view, or an activity is different from a series of visits and views spread across multiple people that lead you to believe, “Ah, there is something happening here.” A trend, as you might call it.  

And you identified misogyny earlier: there’s something happening, you’re not quite sure exactly what it is, but the dashboard is now allowing you to see some of it in a broader sense, so that while you’re not necessarily going to take action on an individual pupil, you may choose messaging inside your school or academies to address it in a way that’s more helpful and meaningful. 

[Nick Finch]
Yeah, absolutely. As you say, if you notice there’s a pattern there, it’s about doing something about it. It’s an efficient use of time as well, because, I’m going to use the phrase, “we’re not going to use a sledgehammer to crack a nut.” There might be certain groups of pupils where actually a conversation with them is what’s needed. There might be other situations where we need to put something more into the curriculum to discuss these sorts of matters. And there could be certain examples where it’s even more serious, and we actually need to think about other support mechanisms to intervene. Again, that’s where onGuard is becoming really useful, because the contextual information contained in an alert, whether it comes through from the actual web page or from the screen capture, allows you to make an informed decision about what sort of search this has been. Is it a relatively low-level concern we can deal with in a certain way, maybe as a group situation or a whole-school situation, or is it something that needs a bit more intensive support? So, yeah, it’s been really useful for that as well. 

[John Robb]
I’m very happy to hear that it’s being useful. That’s why we make the solution: to have that positive impact. Overall, is there anything you want to say about your experience with onGuard that you think somebody listening to this would appreciate hearing? 

[Nick Finch]
Yeah. I mean, I guess a couple of things, well, more than a couple. From my own regional lead perspective, I like that I can see all of my onGuard data across the trust in one dashboard, so I’ve got oversight of all those academies if I want it. More importantly, on a daily basis myself and my colleagues can actually offer support to DSLs, as we’re seeing the same information they are. That facilitates working together so much more. 

We can even provide cover for each other. So, in the school holidays, for example, someone like myself, a regional lead, can be the point of contact for onGuard and the human review team. That’s been really useful. 

And from a teacher’s perspective, teachers have a duty of care and a responsibility to safeguard their pupils; that stands to reason. But teachers want to teach. They want to impart knowledge and learning to their pupils from the start of the day to when they go home in the afternoon. And I think knowing that onGuard is in place, along with the associated web filtering platform that we use, means the teachers have one less thing to worry about. They don’t have to worry as much about child protection.

They know that onGuard’s got their backs and the DSLs have got their backs, and they can concentrate on what they’re there to do and want to do, which is to teach. And I think that’s really important. 

From a trust perspective as well, it’s been a pleasure to work with onGuard from day one. You’ve been entirely responsive to our needs, from scoping out initially what we wanted, through the implementation side of things, to the business-as-usual model that we’ve got now. 

Technical Support has always been on hand. I’m glad to say we’ve not needed Technical Support too often, but when they have been needed, they’ve been really quick to respond, and possibly the friendliest bunch of people in North America and Europe to deal with. The same goes for the human review team: dedicated, approachable, and really professional. Some of the guys there support the wrong soccer teams, but we’ll gloss over that. It’s just been really good to deal with them, and a really positive experience. 

[John Robb]
As we wrap up, this information has been super helpful, and I really appreciate it. Thank you very much. Is there a recommendation, a single thing, that you would say to somebody considering onGuard that they should be thinking about? 

[Nick Finch]
I think it’s about knowing what works for your academy, for your trust. What are those needs? For us, we needed something that was dependable and robust, and on top of that, and this has been essential, we’ve talked about it a few times today, having that human review team in place to really get the system working for us, giving us the confidence that the alerts coming through are ones we need to deal with. And for the most serious potential alerts, getting those phone calls and getting people to take action straight away has been amazing. There are at least two, but plenty more, examples where we’ve actually made a real, demonstrable impact on children’s safeguarding and safety, and that’s what it’s all about. So, get a system in place that does those things simply and robustly, and that’s exactly what onGuard does. 

[John Robb]
Well, that’s really great, and I will add in something we learned in another conversation we had around mental health: when people do find themselves in those situations, there’s frequently an invitation to help. And onGuard provides access to that invitation, via a visit to a certain website or something they’re writing online. That becomes the invitation to help, and then your team takes over and gets the pupil the help they need. 

And that’s definitely a good thing. I’m glad that Co-op Academies Trust has people like you on board to help their pupils get the help they need, because we know it’s not always easy, and finding a way to make it easier is probably a good thing.  

So, thank you very much Nick! I really appreciate it and let’s keep helping the pupils have a better chance at a good education. 

[Nick Finch]
Absolutely, I greatly appreciate your time and yeah, thanks for having me. 

[John Robb]
Cheers. 

[Nick Finch]
Cheers.