5 Key Recommendations from the Child Dignity Alliance Report

21 December, 2018

CameraForensics

The Child Dignity Alliance Report, released in November last year, focused on the importance of global collaboration and the standardisation of Child Sexual Exploitation (CSE) legislation. It also urged tech and social media companies to do more to combat the scourge of online crimes against children.

Though comprehensive legislation does exist, harmful silos and unclear definitions persist, hampering law enforcement agencies (LEAs) in their battle to safeguard children and prosecute perpetrators of CSE across borders. Here are some of the key recommendations from the report, which – if implemented – should help LEAs to optimise their resources and garner even better results:

Embracing innovation

“As a global community, we need to embrace innovation, safety and agility as core tenets of technology design.”

LEAs working in child protection are receptive to cutting-edge technologies that help them work smarter. However, there remain justifiable concerns about data sharing and the potential impact of a compromised database. The report suggests that well-vetted members of industry should not be refused access to training data, and that granting such access will allow LEAs to leverage the full force of technological advancements in the fight against CSE.

Complying with legislation

“Governments, through the Child Dignity Alliance and the WePROTECT Global Alliance, should immediately commence work on ensuring that their domestic legislative frameworks comply with the International Centre for Missing and Exploited Children’s (ICMEC) Model Legislation.”

Without a global standard, perpetrators of CSE will be subject to different – and perhaps more lenient – legislation, which could result in the prolonging of abuse and the further circulation of abuse material. The ninth edition of the ICMEC’s global review, for example, found that while the simple possession of Child Sexual Abuse Material (CSAM) is criminalised in 140 countries and 125 countries have a definition for CSAM, only 32 require ISP reporting of suspected CSAM.

Treating technical data as global assets

“Technical data of any kind, including hash data sets, relating to child sexual exploitation and abuse imagery should be defined and treated as ‘global assets’.”

Victim identification is one of the greatest challenges facing LEAs investigating CSE. The report suggests that sharing data across borders with trusted members of industry would make innovation easier and allow “new identification/classification technologies” to be developed.

Internationally standardising the classification of CSE imagery

“There should be efforts made at the supra-national level, through appropriate multilateral working groups, to implement a single standard framework for the classification of child abuse imagery.”

The report recommends that the ‘Five Eyes’ countries (Australia, the United Kingdom, Canada, New Zealand and the United States) take the lead in supporting the “development of tools allowing for the ‘translation’ of categories from one jurisdiction to another”. With classifications that align with the Luxembourg Guidelines, global standards can be established and a truly global response to these heinous crimes can be coordinated.

Building bridges between major tech companies and LEAs

“Facebook, Twitter, Google, Snap, Microsoft and others should continue to support the efforts of law enforcement, government entities and not-for-profit hotlines, including through sharing enhanced key technical and operational data, for example hash data.”

To enshrine good practice in the internal infrastructures of major tech companies, the report suggests that they should be strongly encouraged, if not legally bound, to adopt digital forensics tools that help detect CSE imagery and report those responsible. This would be part of a wider move to collect better data about their users in order to prevent CSE.
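To illustrate what sharing “hash data” can enable in practice, below is a minimal sketch of hash-set matching: files are hashed and compared against a list of known hashes supplied by another organisation. The file names, function names and hash-list format are hypothetical, and the example uses plain Python with cryptographic (SHA-256) hashes for simplicity; it does not represent any specific tool referenced in the report.

```python
import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_known_hashes(hash_list_path: Path) -> set[str]:
    """Load a shared hash list: one lowercase hex digest per line (hypothetical format)."""
    return {
        line.strip().lower()
        for line in hash_list_path.read_text().splitlines()
        if line.strip()
    }


def find_matches(directory: Path, known_hashes: set[str]) -> list[Path]:
    """Return files under `directory` whose hashes appear in the shared set."""
    return [
        path
        for path in directory.rglob("*")
        if path.is_file() and sha256_of_file(path) in known_hashes
    ]


if __name__ == "__main__":
    known = load_known_hashes(Path("shared_hash_list.txt"))  # hypothetical shared list
    for match in find_matches(Path("./uploads"), known):     # hypothetical folder
        print(f"Known material detected: {match}")
```

In practice, companies and hotlines typically exchange perceptual hashes (such as PhotoDNA), which remain stable when an image is resized or re-encoded, whereas the exact cryptographic hashes used above only match byte-identical files.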

Conclusions

Should the recommendations from the report be accepted and implemented across borders and industries, the fight against CSE could truly be taken to the next level. By shining a light into the shadows in which abusers operate, these measures could safeguard an unprecedented number of children, benefitting not only them, but society as a whole.

The full report can be found here.

