Cyacomb, Internet Watch Foundation (IWF) and Blipfoto have gone live with a pilot demonstrating new technology that allows small platforms to block known Child Sexual Abuse Material.
Preventing the spread of known Child Sexual Abuse Material (CSAM) is vital, but smaller platforms often find even basic measures impractical to implement. Cyacomb Safety allows platforms to match the content their users upload or share against IWF data while protecting their users’ privacy. Integration is straightforward, and the approach is highly scalable.
“It’s hugely reassuring to know that offenders can’t misuse the private spaces our community offers to share Child Sexual Abuse Material, and it gives us confidence that we are meeting our obligations to protect children.”
Annie Andrews, Director, Blipfoto
Blocking Child Sexual Abuse Material (CSAM) is impractical for small platforms.
Blocking the spread of known Child Sexual Abuse Material (CSAM) is vital: it reduces the harm to survivors caused by the continued circulation of images and videos of their abuse, and it reduces the harms that arise from unintended viewing, desensitisation and demand creation.
The Internet Watch Foundation supplies data in the form of a hash list to many organisations to allow them to block known CSAM. However, many smaller platforms remain completely unprotected.
A good example of such a platform is Blipfoto, a community-owned, non-profit photo journal website run by volunteers. Blipfoto has private spaces that could, in theory, be used by offenders to exchange CSAM, although there is no evidence this has happened to date. Blipfoto lacks the cybersecurity expertise to protect the IWF hash list, so it would not be allowed access to it; and even if this were not the case, the traditional IWF membership model would be unaffordable without a significant and unsustainable increase in Blipfoto's pricing. While there are cloud services that allow CSAM detection, these require user data to be sent to a third party, which gives rise to privacy concerns and data protection complexity. As a result, Blipfoto has, until recently, been unprotected.
In its published illegal harms consultation, Ofcom recognises that smaller sites play an important role in the proliferation of CSAM: offenders identify a vulnerable site and target it until they are discovered, then move on to the next. However, it also recognises the challenges for organisations like the IWF in scaling to serve many thousands of smaller sites, and the potential implementation costs to those sites. The initial Illegal Harms code is therefore expected to exclude all but the largest sites from the obligation to block known CSAM, in spite of the significant harms that arise on smaller platforms.
Cyacomb Safety is a simple solution for small platforms to block CSAM.
Cyacomb Safety has been created to allow platforms to detect CSAM through an extremely simple integration that avoids the need for complex relationships or new levels of cybersecurity and governance. Cyacomb supplies an integration SDK; the platform only needs to obtain an access key and add a handful of lines of code to implement detection. Cyacomb provides a matching SaaS in the cloud, which the SDK connects to as part of the matching process.
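Cyacomb has not published its SDK interface, so the Python sketch below is purely illustrative: the `SafetyClient` class, its `check` method, and the use of SHA-256 are all invented stand-ins, stubbed locally so the example is self-contained. It shows only the *shape* of the "handful of lines" integration described above, in which an upload handler makes a single match call and acts on the result.

```python
import hashlib

# Hypothetical stand-in for a Cyacomb Safety SDK client. The real
# interface is not public; this stub simulates the cloud matching
# service with a local set of digests so the example can run.
class SafetyClient:
    def __init__(self, access_key: str, blocked_digests):
        self.access_key = access_key          # obtained from Cyacomb
        self._blocked = set(blocked_digests)  # stands in for the cloud match

    def check(self, content: bytes) -> bool:
        """Return True if the content matches the simulated block list."""
        digest = hashlib.sha256(content).hexdigest()
        return digest in self._blocked

# Platform-side upload handler: the integration amounts to one check call.
def handle_upload(client: SafetyClient, content: bytes) -> str:
    if client.check(content):
        return "blocked"   # take appropriate action (remove, report, etc.)
    return "accepted"
```

In the real service the match is performed against IWF data via Cyacomb's cloud without user content leaving the platform; the plain SHA-256 lookup here is only a placeholder for that process.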
Protection of user data is guaranteed through Cyacomb's privacy-by-design approach, which ensures that no user data ever leaves the platform. This ensures no third party can gain any knowledge of what a user is posting to a platform, or track or match that content as it moves around the internet. Even the result of the matching process is known only to the platform, where appropriate action can be taken.
Internet Watch Foundation data is also protected in our cloud service, where it is held only in the form of a secure-by-design Contraband Filter, ensuring that it can't be misused even if the conventional cybersecurity applied to it were breached.
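The Contraband Filter itself is proprietary and its design is not described here, but the security property claimed above (a structure that can answer "is this hash on the list?" without the list itself being recoverable) can be illustrated with a generic Bloom filter. The sketch below is that generic illustration only, not Cyacomb's actual design: the filter is one-way, so even an attacker who steals the bit array cannot enumerate the hashes it was built from.

```python
import hashlib

# Generic Bloom filter: a one-way membership structure. It can report
# whether an item was probably added, but the original items cannot be
# read back out of the bit array. (Illustrative only; this is NOT
# Cyacomb's Contraband Filter design.)
class BloomFilter:
    def __init__(self, size_bits: int = 1 << 20, num_hashes: int = 7):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item: bytes) -> None:
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item: bytes) -> bool:
        # False positives are possible at a tunable rate; false
        # negatives are not, so a "no" answer is definitive.
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

The design choice this illustrates is the one the article relies on: a breach of the filter leaks far less than a breach of the hash list itself would, because the filter cannot be inverted into the list.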
Empowering Regulators with New Evidence
Ofcom’s Illegal Harms Consultation estimated the cost to platforms of implementing CSAM detection as between £16,000 and £319,000, incorporating 2 to 18 months of full-time work for a software engineer as well as input from product managers and lawyers. This pilot demonstrates that for a small platform, technical integration of Cyacomb Safety can cost well under £5,000 and take less than two weeks of full-time work. As a result of the privacy-by-design and security-by-design approach, no formal legal input is needed.
Ofcom also stated that:
“In principle, we provisionally consider that, even where they are very small, it could be justified to recommend that services which are high-risk deploy these technologies.”
However, it went on to say that smaller services would not be included in regulation at this point:
“… because to implement hash matching and URL detection, services will need access to third party databases with records of known CSAM images and lists of URLs associated with CSAM. There are only a limited number of providers of these databases, and they only have the capacity to serve a finite number of clients. Setting the user-number thresholds we have proposed should ensure that the database providers have the capacity to serve all services in
the scope of the measure. Should the capacity of database providers expand over time, we will look to review whether the proposed threshold remains appropriate.”
This pilot project has demonstrated that where security and privacy are designed into a solution, many of the practical and governance barriers to scaling are removed, effectively increasing the capacity available to serve the needs of platforms.
“It is a strategic priority for the IWF to find ways to deliver its services at a greater scale and to ensure that offenders are denied the use of as many platforms as possible. This pilot project has helped us look differently at how we can scale, addressing many of the legal, governance and technical barriers that we face today.”
Dan Sexton, CTO, Internet Watch Foundation