Suicide, drug abuse, eating disorders and self-harm—the consequences of childhood sexual abuse stretch far into adulthood, but the online distribution of abuse imagery is an immediate turn of the screw for survivors.
Re-victimisation is the process by which survivors of abuse re-live their trauma. Many cannot find closure while images of their abuse circulate online. And those images are no longer circulated only among small networks of abusers – digital platforms have turned the phenomenon into a pandemic.
Between 2014 and 2016, the number of websites hosting sexualised imagery of children rose by 417%. A third of those images depicted rape or sexual torture. The problem is not confined to any one nation: in the USA, the National Center for Missing & Exploited Children (NCMEC) reviews over 25 million images each year. That’s over 480,000 images per week.
The volume of images is only half the problem, however. The other is geographical. Online paedophile networks crisscross the globe; national governments cannot tackle the problem alone.
Now a Canadian NGO has built a web crawler that identifies images of child sexual abuse and issues automated notices to take them down. Engineered by the Canadian Centre for Child Protection, Arachnid has already analysed more images in six months than the Centre’s Tipline has in the 15 years of its existence. More than a billion URLs have been processed, resulting in over 130,000 takedown notices, almost all of which have been successful.
The belly of the beast
For years, the Centre’s Tipline depended primarily on reports from Canadian citizens and law enforcement.
“Every report felt like a failure,” said Lloyd Richardson, the Centre’s Director of IT and a driving force behind Arachnid’s development.
“We try to stop abuse imagery from becoming what we call ‘popularly traded series.’ When we get reports from members of the public, it means these images already have some degree of traction.”
The thinking behind Arachnid was simple: if technology helps the proliferation of paedophile networks, then technology can become part of the solution.
Arachnid can’t see an image as human eyes would—alone, it cannot determine whether a given image depicts child sexual abuse. Instead, it works using hash values, strings of characters produced by running an image through hashing software—a kind of digital fingerprint.
As Arachnid crawls the darker crevices of the web, it scans images to produce hashes before checking them against a master list of known sexual abuse images.
The master list is “triple-vetted”—seen by three human analysts who confirm that an image depicts illegal sexual contact with a minor. This list is a subset of a much longer list of hash values, many of which were shared by law enforcement services and contain images that are not necessarily illegal or abusive.
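The triage the article describes can be sketched in a few lines of Python. Everything here is illustrative: the function names and list contents are hypothetical, and a plain cryptographic hash merely stands in for the Centre's actual fingerprinting. The logic, though, mirrors the two tiers described above: a match against the triple-vetted list triggers automated action, while a match against the longer, not-yet-vetted list is routed to human analysts.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Hash an image's bytes into a fixed-length 'digital fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical lists: the triple-vetted subset triggers automatic action,
# while the longer, not-yet-vetted list is referred to human analysts.
vetted = {fingerprint(b"<vetted example bytes>")}
unvetted = {fingerprint(b"<unvetted example bytes>")}

def triage(image_bytes: bytes) -> str:
    """Decide what happens to a crawled image, per the article's description."""
    h = fingerprint(image_bytes)
    if h in vetted:
        return "automated takedown notice"
    if h in unvetted:
        return "refer to analyst for assessment"
    return "no match"
```

The key design point is that the crawler never stores or "sees" the imagery itself at matching time: a set-membership test on fingerprints is all it needs.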
To avoid the moral and political minefield of ambiguously underage imagery, or images generated consensually by under-18s, Arachnid only sends automated takedown notices to a small subset of the images it scans.
“We wanted to avoid any grey areas. Arachnid only deals with what we call ‘the worst of the worst’, mostly pre-pubescent child sexual abuse imagery involving sexual assault,” Richardson explained.
Traditionally, abusers could evade automated detection through minor alterations to an image: adding a filter, cropping, or changing even a single pixel alters the image’s hash value and prevents a match. Arachnid, however, works with Microsoft’s PhotoDNA software, which computes a ‘perceptual’ hash that stays stable under such alterations, leaving would-be abusers nowhere to hide.
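PhotoDNA itself is proprietary, but the principle can be shown with a toy "average hash" on a tiny synthetic image: a cryptographic digest changes completely when one pixel is tweaked, while a perceptual hash, which encodes only whether each region is brighter than the image's mean, does not. This is a deliberately simplified stand-in, not the Centre's actual method.

```python
import hashlib

def crypto_hash(pixels: list) -> str:
    """Conventional cryptographic hash: any changed byte changes the digest."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels: list) -> str:
    """Toy perceptual hash (aHash): bit i is 1 if pixel i is brighter than
    the mean brightness. Small edits rarely flip any bit. (PhotoDNA is far
    more sophisticated; this only illustrates the idea.)"""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# A 4x4 greyscale "image", flattened to 16 pixel values.
original = [10, 10, 200, 200, 10, 200, 10, 200,
            200, 10, 200, 10, 200, 200, 10, 10]
altered = original.copy()
altered[2] = 190  # the kind of single-pixel tweak that once evaded detection

# crypto_hash(original) != crypto_hash(altered), yet
# average_hash(original) == average_hash(altered)
```

The altered pixel stays on the bright side of the mean, so every bit of the perceptual hash survives, and the image still matches the list.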
“I can’t overstate how difficult a job it is”
A second function of Arachnid is now in pilot: websites that allow users to upload content are trialling it to stop abuse images from being uploaded in the first place.
“On any site that allows users to upload content, it’s really a question of when—not if—this material will end up on their site,” said Richardson. As such, the Centre is working with image-hosting sites to scan all user uploads through Arachnid’s database. (Due to the sensitivity of the work, the Centre refused to divulge the names of its industry partners.)
“While it’s only in pilot, the partners who are using our software are using it heavily.” In under six months, some 16.4 million images have been scanned through this API service.
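The upload-time pilot can be sketched as a gate that fingerprints each upload and checks it against a copy of the hash list before anything is published. The interface below is hypothetical; in practice the partner sites query the Centre's API service rather than a local set.

```python
import hashlib

def make_upload_gate(known_hashes: set):
    """Build an upload checker for a hosting site.

    Hypothetical sketch: the site holds a synced copy of the hash list
    (in practice supplied via an API service like the one in pilot) and
    consults it before any user upload goes live.
    """
    stats = {"scanned": 0, "blocked": 0}

    def check(file_bytes: bytes) -> bool:
        stats["scanned"] += 1
        if hashlib.sha256(file_bytes).hexdigest() in known_hashes:
            stats["blocked"] += 1
            return False  # rejected: the image never appears on the site
        return True       # published normally
    return check, stats
```

The design choice matters: blocking at upload time prevents an image from ever gaining the "traction" Richardson describes, rather than chasing it with takedown notices afterwards.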
The reports Arachnid generates then filter through the Canadian Centre’s established mechanisms for child protection: where appropriate, images are referred to law enforcement agencies or child welfare services.
Man and machine
The efficiency of Arachnid is a double-edged sword. While images that have been triple-vetted trigger automated takedown notices, images on the longer list of hash values that have not yet undergone quality assurance are referred to the Centre’s team of analysts for assessment.
“Arachnid is very good at stopping our analysts from seeing the same content over and over,” explains Richardson, “but the volume of content that still requires analyst assessment has been unprecedented. I can’t overstate how difficult a job it is.”
Since Arachnid’s inception, analysts have cast over 290,000 votes on images.
‘Megan’, who spoke to Apolitical under the condition that we omit her surname, is a Senior Analyst involved in examining images referred by Arachnid. She has worked at the Tipline for seven years.
“On a daily basis, you are dealing with very sensitive (and often extremely graphic) material. What’s particularly difficult is that you are witnessing a child’s trauma. Quite often, the perpetrator is someone known to the child, someone they trust and, even more troubling, possibly someone they love.”
“In some ways, Arachnid makes our work harder,” admits Megan. “The sheer quantity of images is like nothing we’ve worked with before, and because it only shows us the worst of the worst, the psychological toll is different.”
“Before Arachnid, we would see images that were graphic, but maybe they showed adults or just different kinds of content. Now, because it’s just child abuse, it can become relentless.”
Arachnid has increased pressures on the Centre’s staff, but Richardson is unequivocal on putting his team before results. “We have a range of safeguarding measures in place. People take breaks, there are shifts, and we have counsellors on hand if anything is particularly traumatic,” he explained.
According to Megan, everyone develops his or her own coping mechanisms.
“It’s graphic, disturbing, heinous, and infuriating. But you take that anger, that rage, and all the other emotions you are feeling, and use it to fuel your drive.”
Members of the Tipline who spoke to Apolitical were clear that technology alone was not a quick fix. Automation only solves part of the problem.
Digital distribution allows child abusers to operate beyond national boundaries as never before. In the 15 years the Tipline has been in operation, only 15% of the reports forwarded to law enforcement agencies have stayed within Canada.
An international problem demands an international response, and Arachnid’s success will depend on the willingness of different sectors to collaborate.
To date, the Canadian Centre has received significant funding from a range of different organisations combatting child abuse online. Google provided over $100,000 while the UK Home Office provided over $800,000 to fund the Centre’s work.
More than funding, however, the future of Arachnid depends on data-sharing. The list of hash-values that feed Arachnid includes images registered by Canadian law enforcement, industry partners, Interpol, and the Canadian Centre’s American counterpart, NCMEC.
“The National Centre came on board very quickly when we heard about Arachnid,” explained Lindsey Olson, Executive Director of NCMEC’s Exploited Children Division. “There are a range of organisations working on these issues. There’s law enforcement—and a huge amount of international collaboration between forces—NGOs, tiplines, and industry.”
While the fight against online child abuse has rallied organisations across sectors to the cause, barriers remain in Arachnid’s mission to protect survivors.
“Different places have different laws around the sharing of hash values,” said Richardson. “In the EU, there are different data laws that mean these resources can’t be shared so easily.”
The hope is that Arachnid can unite law enforcement agencies, governments, child protection services and industry partners worldwide, expanding the hash-value list and tackling a global phenomenon with a global solution.
As ever, its success depends on partnerships across sectors and borders.
“We’re still very much in the infancy of this project,” Richardson cautions, “but it only has room to grow.”
Picture credit: Pexels and Canadian Centre for Child Protection