
Demos launches technology briefing on child sexual abuse imagery

Last night I attended the Westminster launch of a new ‘Technology Briefing Series’ from the cross-party think-tank Demos. The first paper in the series was a joint effort with the Internet Watch Foundation (IWF) and covered the topic of online child sexual abuse imagery (CSAI). The briefing comes as the Government considers responses to its recent green paper on the Internet Safety Strategy, which contained a number of proposals for social media companies in particular “to do more”. The event included a panel discussion with Jamie Bartlett, MPs Yvette Cooper and Vicky Ford, Karim Palant from Facebook and Andrew Puddephatt, the new IWF Chairperson.

The event and the report celebrated the fact that the IWF has been instrumental in ensuring that less than 0.1 percent of CSAI content is now hosted in the UK, down from 18 percent in 1996! Alex Krasodomski-Jones from Demos said: “Technology policy is challenging: it tests our ability as a society and democracy to grapple with difficult problems and find sensible solutions. Demos is committed to improving the public conversation around these issues, to bringing expert voices to the debate, and to help inform difficult decisions. In partnership with the IWF, we are calling for a better dialogue between politicians, experts, the media and the public around technology, its impact on our lives and our democracy. In doing so, we hope to encourage good solutions to complicated issues.”

But as Jamie Bartlett pointed out, whilst it is a hard truth to accept, “the problem is not going to go away”. You can view the full report here; it is definitely worth a read. It tells how the fight against online child sexual abuse content is being won in the UK, but the global threat remains as big as ever.


The speakers at the event yesterday highlighted how the fight against CSAI is very different to the fight against radical or extremist text, images or videos, largely because of the lack of clear legal frameworks and definitions in the latter area. Yet at the same time, it was pointed out that there are clear lessons to be learned from this area, for example on how processes and technologies developed by experts at the IWF and Microsoft ensure that an image or video, once flagged, cannot resurface (Yvette Cooper MP pointed out that this is currently the case for certain far-right material).
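For anyone curious about how a flagged image can be kept from resurfacing, the underlying idea is hash matching: each flagged image is reduced to a digital fingerprint, and new uploads are checked against a shared list of those fingerprints. The sketch below is only a simplified illustration; the data and names are hypothetical, and the real systems (for example Microsoft's PhotoDNA used alongside the IWF hash list) rely on perceptual hashing, which also catches copies that have been slightly altered.

```python
import hashlib

# Purely illustrative: a minimal hash-list check using an exact
# cryptographic hash (SHA-256). Production systems use *perceptual*
# hashes, which also match images that have been resized or edited.

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint (digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Reject an upload whose fingerprint matches a known flagged image."""
    return fingerprint(image_bytes) in blocklist

if __name__ == "__main__":
    # Hypothetical data standing in for a previously flagged image.
    flagged_image = b"...bytes of a previously flagged image..."
    blocklist = {fingerprint(flagged_image)}

    # An identical copy of the flagged image is caught; unrelated content is not.
    print(should_block(flagged_image, blocklist))      # True  -> blocked
    print(should_block(b"unrelated image", blocklist)) # False -> allowed
```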

There was plenty of food for thought from the panellists that I will be digesting over the coming months in relation to hate speech and radicalisation. For example, Yvette Cooper pointed out that, due to automatic ‘learning’ from our online searches, “In some cases the algorithms are doing the radicalising”. Andrew Puddephatt added: “Algorithms don’t do context, it’s important a human analyst makes the decisions about what should or should not be removed. Those analysts should be supported and accountable.”

And as Jamie Bartlett pointed out, “It’s not always radical content that radicalises people; it can be very ordinary things”.


Enough from me now – head over to read the Demos report here!
