
How is AI changing the scale and scope of online enticement?
Rob Chitham & Dr Shaunagh Downing
29 January, 2026

Investigating crimes against children, including those related to child sexual abuse material (CSAM), can have profound effects on law enforcement officers and staff. In fact, research by the University of Portsmouth, University of Southampton, and the International Policing and Public Protection Research Institute (IPPPRI) found that 27.7% of the CSAM investigators they studied may qualify for ‘major depression’.
Recently, we asked Engagement Manager and former Detective Inspector Rob Chitham to share his perspective, based on more than 30 years of law enforcement service. Here, he discusses why analysing harmful material is such an important part of the role, the psychological effects it may have, and how tech innovation could pave a way forward.
Analysing seized media files to identify child sexual abuse material is an incredibly important part of investigating crimes against children. It's what enables investigators to determine whether the material in their possession is illegal, and therefore whether they've identified a crime.
In cases where investigators do detect illegal material, they can then grade the severity of the images (from Category A to C) and write a description of the media. They can also gather intelligence around the type of crime committed. For instance, is the perpetrator responsible for possessing or distributing indecent material, or is there evidence to suggest they've committed contact abuse?
All of this evidence helps to build a case and ultimately hold the perpetrators responsible for the crimes they’ve committed.
It’s hard to tell because everyone deals with trauma differently. You might argue that someone experienced in grading CSAM is more psychologically prepared than a uniformed officer who doesn’t expect to seize indecent imagery – but this isn’t always the case. Investigators may also prepare themselves to analyse media featuring children, but will then be unexpectedly exposed to another form of cruelty or type of victim. This shock can be very upsetting, no matter how experienced you are.
As someone who has spent years investigating CSAM cases, I also know that reminders of your home life can be disturbing and damaging. You might view something that you didn’t necessarily think would affect you, but which you related to closely. For instance, investigators might need to read conversations between adults and children who are a similar age to someone they love. These ties to your personal life can make it difficult to distance yourself from what you’re seeing.
My view is that exposure to harmful material will affect absolutely everyone over time. It’s just a question of how it manifests itself and how quickly. It’s only now that I look back over 30 years of law enforcement service that I realise how this part of the role affected me.
It can be deeply frustrating trying to stay on top of new perpetrator tactics in law enforcement, and of course this includes the exploitation of technologies like AI. Now that perpetrators are using AI to generate realistic abuse material, there can be a lot of uncertainty as to whether you’re analysing images of a real child or not.
There can be a nagging doubt in the back of your mind: is this a fictional child? Am I prioritising my caseloads correctly? Have I spent time trying to protect a child who doesn’t even exist, when there are real children out there who need my focus? These questions can certainly add to the stress of the role.
Luckily, we’re seeing better provisions being made for officers’ mental health and wellbeing compared to when I first entered the service. Now there are solutions like screen pop-ups that remind you to take a break from grading, and even computer games that are designed to take your mind off what you’ve been doing.
But of course, huge workloads mean that on many occasions, officers will switch off these reminders and carry on with the task at hand. There’s a lot of work and a lot of pressure, meaning that realistically, these provisions aren’t adhered to nearly as often as they should be.
My view is that investigators need a robust supervisory system making sure they take a break, but that the real solution lies in technology.
The silver bullet will be using AI tools to grade the severity of indecent images, and I’ve no doubt that will come. This ought to be a huge priority in my view, as it’ll help us take away the need for officers to look at harmful images and videos in the first place.
Of course, developing these tools is going to take time. Determining whether an image has hit the threshold for Category A, B or C can be challenging; it’s not a simple concept that AI can automate just yet. But I do believe it’s coming, and I think it’ll be incredibly helpful.
After all, there are always ways to limit exposure, but it doesn't matter how many measures you put in place, how many signs, how many games, how many breakout areas. You're never going to have a truly significant impact until you remove the need for in-depth viewing. To me, that's what we need to be striving for.
Working to protect children can also be an intensely satisfying job. It can be challenging, but you are really making a difference. You can go out and you can save a child. That changes the child’s life forever.
On a personal note, I think the resilience and professionalism of the officers who help to protect children against abuse and exploitation is quite remarkable. I travel around the country delivering presentations to law enforcement agencies, and one of the reasons I like doing this is that I meet an incredible bunch of people.
These officers really are exceptional; they really care, and they're doing the right thing. That's very special.
At CameraForensics, we’re committed to empowering investigators to combat crimes against children. A core part of this is developing tools and capabilities that help to accelerate investigations and reduce burden upon law enforcement officers. We look forward to sharing more about this when we can.
In the meantime, you might find value in our newsletter, The Source. Each month, we’ll send you insights from our team of experts around the latest threats to children’s safety, law enforcement challenges, and how tech innovation can help.