
Data Scoring as a Service: CSAM text and link detection

by Vistalworks

Risk-scoring data service that detects patterns in datasets associated with CSAM text and link sharing

Disturbing illegal content, including CSAM, is more available on the open web than ever before. But the criminal distributors know that sharing images and videos directly is high-risk and that innovative tools exist to catch them. So they’ve changed their behaviour. They’ve started sharing text signposts and text links to illegal content instead, exploiting mainstream web platforms, marketplaces, and social media services to distribute it.

This has serious reputational and legal implications for organisations that do not identify and tackle CSAM link sharing on their digital services. New regulations such as the UK Online Safety Act and the EU Digital Services Act (DSA) have introduced fines of 6%-10% of global annual revenue for platforms that do not effectively protect the public from harmful and illegal content.

Vistalworks has spent the last year working on the UK Government’s Safety Tech Challenge to develop technology to disrupt the sharing of text links to CSAM, in association with enforcement and intelligence agencies such as GCHQ and the Home Office. We’ve identified new and rapidly evolving ways that offenders are exploiting the open web to increase their reach, pulling new vulnerable people into criminality and exposing mainstream platforms and digital service providers to serious legal risk.

Existing image matching and basic keyword or URL block lists are no longer enough to prevent distribution of illegal materials. Offenders are constantly adapting, using a range of covert and hard-to-detect tactics to distribute links to CSAM across the open web.

Vistalworks’ solution detects and risk-profiles non-image-based indicators of CSAM distribution in client datasets. The subscription service is contextual, continually updated, and informed by expert behavioural research and specialist law enforcement input.

Our innovative solution, Data Scoring as a Service for CSAM text and text link detection, is particularly relevant to:

  • Public sector, including online safety regulators and specialist enforcement

  • NGOs and specialist ecosystem service providers with a CSAM-specific remit

  • Search engines, platforms, marketplaces and similar online services indexing and/or storing text and/or serving auto-generated prompts and recommendations

  • Discussion forums, social media platforms, communities, and similar web-publishing services with a text component

  • Platforms, digital services, marketplaces and online communities whose end-users are vulnerable to targeting by offenders associated with child exploitation and CSAM.


Key Features

  • Reduces the risk of inadvertently publishing and distributing illegal material by detecting evasive and evolving characteristics of CSAM text and link sharing in client datasets

  • Contextual to reduce false positives, with algorithms continually updated and adaptive to mitigate offender responses to removal

  • Informed and updated by expert behavioural researchers and offender profiling, with specialist legal and law enforcement input

  • Accurate in detecting high-risk and context-dependent terms, phrases, behaviours and links in small, sparse and large datasets - including search engine indexes, chat and comment threads, marketplace listings, and generative AI outputs

  • Available as a secure bulk upload/download service, allowing data owners to transfer CSV files (or similar) securely through Azure Cloud for rapid automated scoring by Vistalworks (see the sketch after this list)

  • API and custom case-management system integrations are also available

  • Full outsourcing of the end-to-end process is available, with consultancy-style, findings-only reporting if required
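
As noted in the bulk upload/download feature above, data owners exchange CSV files with the scoring service through Azure Cloud. The following is a minimal, hypothetical sketch of that exchange using the Azure Blob Storage Python SDK; the container and blob names are illustrative placeholders rather than part of the documented service, and the actual transfer locations and credentials are agreed with Vistalworks.

    # Minimal sketch of the bulk upload/download exchange (assumed container
    # and blob names; real locations and credentials are agreed per client).
    # Requires: pip install azure-storage-blob
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection-string>")

    # 1. Upload the dataset to be scored as a CSV blob.
    inbox = service.get_container_client("scoring-inbox")            # assumed name
    with open("dataset_to_score.csv", "rb") as data:
        inbox.upload_blob(name="dataset_to_score.csv", data=data, overwrite=True)

    # 2. Once scoring is complete, collect the scored results.
    outbox = service.get_container_client("scoring-outbox")          # assumed name
    results = outbox.download_blob("dataset_to_score.scored.csv")    # assumed name
    with open("dataset_scored.csv", "wb") as out:
        out.write(results.readall())

The same upload-and-collect pattern applies whether the transfer is performed with the SDK, AzCopy, or the Azure portal; the API integration route mentioned above replaces the file exchange with direct request/response calls.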

The underpinning risk-analysis models retain a ‘human in the loop’ and use direct input from subject matter specialists. This means the service is not classified as an AI System under EU Regulations and is therefore eligible for use by the public sector in an enforcement and investigation capacity.


This is a text and text-link analysis service, so it does not involve the viewing or transfer of high-risk images. However, reports do contain upsetting data related to extreme criminal activity and should be handled in accordance with local laws, security requirements, and staff well-being best practices. Vistalworks can help if you do not have the internal expertise or processes to manage this.
