
Child Safety Online Report
CameraForensics
11 March 2025
Freddie Lichtenstein
Any tool can become outdated, and image forensics tools are no exception. As digital environments grow in scope and scale, so must the tools for investigators safeguarding children.
This is particularly important for law enforcement agencies (LEAs) using image forensics to investigate the generation and spread of child sexual abuse material (CSAM).
To help investigators adapt to the changing landscape of online exploitation, technology providers need to stay abreast of the techniques that offenders are using, and the challenges that LEAs are facing.
Offenders continue to use emerging techniques to target and exploit children, meaning that new tools for image forensics are needed. To remain effective, providers like us must continually integrate new technology.
For example, according to a 2024 report from Protect Children, CSAM offenders now use a wide range of social media sites, online gaming platforms, and messaging apps to contact victims. These vary from well-known platforms like Facebook and Snapchat, to the lesser-known Wickr Me, Discord, and Viber.
With this proliferation of digital environments, investigators need tools that allow them to gather intelligence and follow leads. For instance, tools that let them trace usernames across sites and uncover new intelligence about an offender, or find duplicates of known CSAM instead of manually searching through hundreds of digital platforms to issue takedown notices.
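Duplicate detection of this kind is typically built on perceptual hashing: an image is reduced to a compact fingerprint that survives resizing and recompression, and candidates are compared against a database of hashes of known material. Below is a minimal sketch of the idea using a simple average hash; the function names are illustrative, and production systems use far more robust algorithms such as PhotoDNA or PDQ.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    `pixels` is an 8x8 list of lists of brightness values (0-255),
    assumed to have been produced by downscaling the original image.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each pixel contributes one bit: brighter than average or not.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def is_likely_duplicate(h1, h2, threshold=5):
    """Near-identical images hash to within a few bits of each other."""
    return hamming_distance(h1, h2) <= threshold
```

In practice, an investigator's tool would hash each newly crawled image and query it against the hash list, flagging close matches for review rather than requiring a manual search.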
Image forensics tools that don’t respond to new use cases are at risk of becoming obsolete, while those that adapt to the changing threat landscape will continue to add value to the investigator workflow.
For a real-world example of this in action, read "Supporting investigators with an agile digital forensics tool". Here, we share how we built a tool to crawl websites and detect illicit imagery, based on investigators' changing needs.
It’s not just the emergence of new technology that poses challenges for image forensics investigators. It’s also the speed at which that technology is developed and adopted, by potential victims and offenders alike. Image forensics platforms must be able to identify and respond to these threats as quickly as possible, so that investigators can stay proactive as offenders find new ways to exploit emerging tools.
Historically, the required changes were simple. For example, imagine that a new camera model was released, and investigators needed to be able to capture its serial number. If the new model wrote files in a different format, image forensics platforms could update their applications to support the new format, and push it live with the next planned release. Today, however, technology is moving so quickly that this approach is no longer adequate.
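The incremental-update model described above can be sketched as a parser registry: each container format maps to its own metadata extractor, so supporting a new camera format means registering one more parser rather than reworking the platform. Everything below is illustrative; the format names, field names, and the simplification of raw file data to a dictionary are assumptions for the sketch.

```python
# Hypothetical registry mapping file formats to metadata extractors.
METADATA_PARSERS = {}

def register_parser(file_format):
    """Decorator that registers an extractor for a given format."""
    def wrap(fn):
        METADATA_PARSERS[file_format] = fn
        return fn
    return wrap

@register_parser("jpeg")
def parse_jpeg(raw):
    # Illustrative only: real EXIF parsing is far more involved.
    return {"serial_number": raw.get("exif_serial")}

@register_parser("newformat")  # hypothetical new camera format
def parse_newformat(raw):
    return {"serial_number": raw.get("maker_note_serial")}

def extract_metadata(file_format, raw):
    """Dispatch to the right parser; unknown formats fail gracefully."""
    parser = METADATA_PARSERS.get(file_format)
    if parser is None:
        return {"error": f"unsupported format: {file_format}"}
    return parser(raw)
```

The design choice here is that unknown formats degrade gracefully instead of crashing the pipeline, which matters when evidence arrives faster than release cycles.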
The use of artificial intelligence (AI) to generate CSAM is a particularly prominent example here. It has taken less than a decade for AI-generated CSAM to evolve from crude attempts to images that are nearly indistinguishable from real abuse material. What’s more, users can run some AI models locally – such as Stable Diffusion 3.5 – and train them on their own datasets. This means that investigators aren’t just keeping up with updates from companies, but with individual users too.
Image forensics platforms have a real opportunity to support investigators as they navigate the challenges of AI. This may mean developing AI-detection capabilities, for instance, to help investigators identify where images have been generated or modified.
To continue learning about the risks of AI-generated CSAM, you can find our guide for investigators here.
Although technological advancements can pose new threats, they also create opportunities for LEAs to safeguard victims.
Generative AI is a great example of this. In the hands of offenders, AI may be used to generate harmful imagery. In the hands of investigators, however, AI can be used to analyse large datasets quickly, freeing up resources for more nuanced tasks. By identifying use cases like these, image forensics tools can help equip investigators with the functionalities they need to stay ahead of offenders, and safeguard more victims.
This may mean adding AI-powered features, or enabling integrations with other investigative tools, so that investigators have access to a broader range of functions.
Whatever the route, it’s essential that image forensics platforms respond to new threats, challenges, and opportunities. By being reactive, image forensics tools can help investigators stay proactive – and ultimately create a safer digital world.
We know that the more we understand investigators’ requirements, the better prepared we are to meet them. That’s why we partner with LEAs and governments worldwide to build novel image forensics tools that overcome unique challenges.
If you’d like to expand your investigative technologies, please do get in touch to discuss our R&D capabilities.