User-led insights guide design of tech aimed at preventing the viewing of child sexual abuse images and videos

February 22, 2024

By Prof Sam Lundrigan (Director of the Policing Institute for the Eastern Region) and Dr Deanna Davy (Senior Researcher, Policing Institute for the Eastern Region), Anglia Ruskin University.

How do we encourage people to use an app designed to help stop them viewing child sexual abuse images and videos? This is one of many questions at the core of the Protech project, which aims to tackle the proliferation of child sexual abuse material (CSAM) on the internet, and the demand that lies behind it.

The intended users for the app are those people who’ve acknowledged they’re at risk of viewing this criminal content, whether it be on their mobile phone, tablet or personal computer.

The app is called Salus, a cutting-edge piece of technology developed to identify sexual images of children and stop them from reaching the screen of the user’s device before the user can even see them. A message will then appear to let users know that content has been blocked.
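
To make that flow concrete, here is a minimal sketch of how an on-device filter of this kind might be structured. Salus’s internals have not been published, so every function name below is an assumption, and the classifier itself is left as a placeholder:

```python
# Assumption-laden sketch of an on-device blocking flow; Salus's real
# internals are not published, and every name here is hypothetical.

def is_csam(image_bytes: bytes) -> bool:
    """Stand-in for the app's on-device classifier (e.g. an ML model)."""
    raise NotImplementedError("placeholder for the real classifier")

def show_block_message() -> None:
    """Stand-in for the alert telling the user content was blocked."""
    print("This content has been blocked.")

def render(image_bytes: bytes) -> None:
    """Stand-in for the device's normal display path."""
    print(f"Displaying image ({len(image_bytes)} bytes)")

def display(image_bytes: bytes) -> None:
    """Intercept each image before it reaches the screen."""
    if is_csam(image_bytes):
        show_block_message()  # the user never sees the content itself
    else:
        render(image_bytes)
```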

For Salus to be effective, it is essential that we understand, from the individuals who want help to stop viewing this material, what the app should look like and how it should function.

Our recent briefing paper for Protech details how we gathered information from people in Germany, Belgium, the UK and Ireland who have self-reported that they are at risk of viewing child sexual abuse images and videos. We also spoke to service providers of treatment programmes – therapists and managers who work to help people deal with their desire to view this type of content.

Our findings show that, given the nature of the content that Salus aims to block, privacy is extremely important to potential Salus users. For many, the possibility that the app might be discovered and identified on their device, particularly by intimate partners and other family members, was of great concern.

Equally concerning was the possibility that the app might be used against the user, whether through the sharing of background data with third parties, as the result of a data breach, or through its discovery by police if they inspected the device.

To bring potential users on board, the Protech project will need to address these concerns within the design of Salus. It will be vital to provide a concise and clear privacy policy for users, outlining how the app’s data will be handled and stored, and emphasising that no legal consequences will arise from use of the app.

Once a user has access to Salus, how do we ensure that it works for them in the ‘right’ way? Our interviews revealed that what content should be blocked by the app, and how, depends on the user.

Most interview participants called for the app also to offer the option of blocking material that is not directly related to child sexual abuse; that is, to allow adult pornographic content to be blocked as well. Many felt that adult content desensitised them to images and videos of child sexual abuse and made it easier for them to begin offending.

Opinions differed on the tone of the message displayed to users once content had been blocked on their device. The blocking message could be seen as helpful, or not, depending on whether it was vague (not revealing Salus’s purpose), shocking and deterrent (emphasising the possible negative consequences of offending), or positive and supportive (pointing users towards further resources, for example). At a minimum, if a standardised blocking message is used, it should be non-threatening.

These findings highlight that the ability to customise blocking messages within the app, and to choose which additional content is blocked beyond sexual images of children, will be essential if the app is to form part of a user’s treatment programme. The Salus prototype, or future versions of it, will therefore need an optional function to block adult content as well. The appropriate blocking settings could be discussed and agreed between the user and their service provider before the app is used, as sketched below.
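
Purely as an illustration of what such agreed settings might involve, the sketch below models them as a small per-user configuration. The field names, tone options and default messages are assumptions drawn from the findings above, not the actual Salus design:

```python
from dataclasses import dataclass
from enum import Enum

class MessageTone(Enum):
    """The three tones interviewees described for the blocking message."""
    VAGUE = "vague"            # does not reveal the app's purpose
    DETERRENT = "deterrent"    # stresses consequences of offending
    SUPPORTIVE = "supportive"  # points the user towards support

@dataclass
class BlockingSettings:
    """Hypothetical per-user settings, agreed with the service provider."""
    block_adult_content: bool = True     # optional blocking beyond CSAM
    message_tone: MessageTone = MessageTone.SUPPORTIVE
    custom_message: str | None = None    # user-customised message text

    def blocking_message(self) -> str:
        """Return the text to display when content is blocked."""
        if self.custom_message is not None:
            return self.custom_message
        return {
            MessageTone.VAGUE: "This content is unavailable.",
            MessageTone.DETERRENT: "Blocked: viewing this material is a criminal offence.",
            MessageTone.SUPPORTIVE: "Blocked. Support is available through your service provider.",
        }[self.message_tone]
```

On this sketch, settings like these would be agreed once during onboarding rather than toggled by the user alone.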

Two other aspects of the app’s development were also discussed in the interviews: interactivity and deployment.

Interactivity options for a future version of Salus, which were broadly welcomed as having potential benefits for users, could include access to a journal (for reflecting on how the app is benefiting them, for example) or user statistics (such as how many times they received the blocking alert).
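
Again purely as an assumption-only sketch, these two features imply a small record of the kind below, which (in line with the privacy findings above) would presumably be kept on the device rather than shared:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class JournalEntry:
    """A private reflection written by the user."""
    written_at: datetime
    text: str

@dataclass
class UserRecord:
    """Hypothetical on-device record behind the statistics and journal."""
    blocks_triggered: int = 0
    journal: list[JournalEntry] = field(default_factory=list)

    def record_block(self) -> None:
        """Called each time the blocking alert fires."""
        self.blocks_triggered += 1

    def add_entry(self, text: str) -> None:
        """Append a new journal entry, timestamped on the device."""
        self.journal.append(JournalEntry(datetime.now(), text))
```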

How the app was deployed, so that the user could access it, also threw up differing opinions and concerns over privacy. How, for example, could a user download the app without jeopardising their anonymity?

This concern is a priority for the Protech team, and various options will need to be considered to ensure the most secure method of deploying Salus, such as providing a secure link to download the app or offering an offline alternative, via a memory stick for example.

The data collected through the interviews are invaluable for guiding and informing the design and functioning of the Salus prototype, ensuring that the app is truly based on a participatory, user-centred design approach. The app will soon be ready for testing and is intended to be rolled out as part of a pilot across intervention settings in five European countries in early 2024.

Tackling CSAM is challenging and complex; there is no single solution to its consumption and dissemination. However, user-centred technologies such as Salus, which harness artificial intelligence and machine learning to block sexual images of children, could be instrumental in supporting individuals to stop viewing this type of material.