CameraForensics wins Queen’s Award for Enterprise: Innovation
Technological innovation is crucial to the prevention of online child sexual exploitation (CSE).
Abusers are leveraging encryption, anonymisation software and emerging content mediums like live streaming to evade law enforcement. This means the authorities must use purpose-built tools which harness the power of open source intelligence, image analysis software and AI to combat the child sexual abuse material (CSAM) epidemic.
However, questions remain over how law enforcement can leverage technological innovation appropriately. Recently, this was highlighted by public criticism of police usage of the advanced facial recognition software platform, Clearview.
Despite the platform’s powerful victim identification capability, security advocates raised questions about the company’s “untested” database of images. Now, it is crucial that child protection stakeholders come together to address this issue.
Because if the integrity of other compliant technology platforms is brought into question, investigators may no longer feel comfortable using them. As a result, they will lose out on the intelligence and efficiency gains they bring. That means giving child abusers the upper hand.
Clearly, this is a scenario that stakeholders want to avoid at all costs. But there is no escaping the fact that negative public relations are costly and could force law enforcement into needless concessions.
On technology, then, we must all commit to proactively asking questions about developments in the field. Answering these questions should always be done collaboratively.
That’s why, in this piece, we’re engaging with some of the most difficult conversations in online child protection. By doing so, we hope to encourage debate with our friends and colleagues in the space. We also want to help accelerate the establishment of future-proof standards and legislation to secure the role of technology in online CSE investigations.
Echoing the sentiments of CameraForensics founder, Matt Burns, in The New York Times, we empathise with the position investigators of CSAM are in. It is only natural that tools which give them an advantage over abusers will be embraced and used as soon as it is possible to do so.
However, as Matt mentions, there are legitimate concerns about facial recognition software being used beyond its intended purposes. Moreover, the legal implications of its usage are hugely complex. To address these concerns effectively, we believe that key stakeholders must consider both sides of the coin.
That means exploring how – and indeed if – it is possible to develop standards that restrict the usage of facial recognition software and other cutting-edge technologies for pre-agreed purposes. Checks and balances, after all, are crucial to public trust.
The moral debate surrounding end-to-end encryption is often framed as an us vs them battle between law enforcement and big tech providers. This was recently brought into focus when US Senators introduced the EARN IT Act, designed to combat online CSE.
The Act’s supporters argue that it makes big tech providers more accountable for the actions of people using their platforms, helping law enforcement to safeguard more children. Detractors, meanwhile, believe it represents a wider attack on online privacy.
What is clear, however, is that players on both sides of the fence have to work together to come to a conclusion – and likely a compromise – on the issue.
A large body of painstaking research has been conducted into online child protection, including:
International organisations who are also working tirelessly to develop global standards include:
These reports and organisations have made many clear and sensible recommendations for fostering global, interdisciplinary stakeholder collaboration to help safeguard more victims of CSE. On establishing due process and legislation surrounding technological innovation, we feel that we can add to the conversation.
What could help in the development of these standards is an appropriate degree of public transparency about the techniques law enforcement is using, and plans to use, to apprehend child abusers. Of course, this must be achieved without compromising covert techniques and investigations.
Because of the speed of technological innovation, the online child protection landscape is constantly mutating. So, when due process and legislation are established, both must reflect this.
Regular reviews conducted by key stakeholders would ensure that standards remain pertinent to the realities of CSE prevention.
This is only possible if stakeholders continue to build on existing collaborations to identify, report and combat emerging trends effectively.
Technology that could help law enforcement is pointless if it doesn’t align with industry workflows. That’s why CameraForensics has always worked with law enforcement colleagues to develop the functionalities they need to save children’s lives.
However, while our professional network is tight-knit, it is impossible for us to cultivate personal relationships with every global user of our platform. The clear downside of this is that many investigators of CSAM may have access to our technology without knowing how to use it.
That’s why we recently developed a training programme to ensure that law enforcement is properly trained to use features which can generate intelligence and support online investigations.