
New data-driven report shows depth of internet companies’ failure to curb spread of child sexual abuse material

Analysis of 5.4 million images of CSAM reveals disturbing trends, points to need for government action, states Canadian Centre for Child Protection


For Immediate Release

Winnipeg, Canada — The Canadian Centre for Child Protection (C3P) has released a first‑of‑its‑kind report providing unprecedented insight into the availability of child sexual abuse material (CSAM) and the role vast networks of lesser-known internet companies play in facilitating its spread.

The report’s findings suggest many electronic service providers (ESPs) are failing to use sufficient resources — such as widely available image blocking technology for known CSAM and human moderation — to prohibit and limit CSAM on their platforms.

The Project Arachnid: Online Availability of Child Sexual Abuse Material report draws on data collected by C3P’s CSAM detection technology, Project Arachnid.

Within the report’s three‑year timeframe (2018 to 2020), Project Arachnid detected more than 5.4 million verified images of CSAM or otherwise harmful‑abusive content. These records relate to more than 760 ESPs worldwide.

Key findings of the report include:

  • Nearly half of all media detections (48%) are linked to a file‑hosting service operated by the publicly traded French telecommunications company Free. Project Arachnid has issued notices on more than 18,000 archive files, collectively containing nearly 1.1 million verified image or video files assessed as CSAM or content harmful or abusive to minors. Project Arachnid routinely encounters multiple access points to these archive files across the web, bringing total media detections to more than 2.7 million.
  • The vast majority (97%) of CSAM detected by Project Arachnid is physically hosted on the clear web. However, the dark web plays a disproportionately large role in directing individuals to CSAM hosted on the clear web.
  • Excessive delays in image removal: in 10% of cases, it took 42+ days before content became inaccessible.
  • Nearly half (48%) of all media on which Project Arachnid has issued a removal notice had previously been flagged to the service provider. Certain ESPs have image recidivism rates in excess of 80%, meaning offending images repeatedly resurface on their systems.
  • Overall, images depicting older adolescents (post-pubescent) take much longer to be removed compared to images with younger victims (pre‑pubescent) and have higher rates of recidivism.

“The findings in our report support what those who work on the frontlines of child protection have intuitively known for a long time — relying on internet companies to voluntarily take action to stop these abuses is not working. This urgently points to the need for policies and regulation that impose accountability requirements on these companies, especially those that allow user-generated content. Children and survivors are paying the price for our collective failure to prioritize their protection and put guardrails around the internet,” says Lianna McDonald, Executive Director for C3P.

The report analyzed records on over five million verified images of CSAM. However, this only represents content Project Arachnid uncovered in specific areas of the clear web, meaning that the true scale of CSAM availability on the internet is far greater than the report’s numbers suggest.

“Despite having nearly two decades to get their houses in order, too many ESPs refuse to deploy proven and practical technological safeguards designed to protect the most vulnerable amongst us. And, although ESPs are quick to innovate to improve their profits, they appear lethargic when it comes to innovating to protect children,” says Dr. Hany Farid, co‑developer of PhotoDNA, and professor at the University of California, Berkeley.

The Project Arachnid: Online Availability of Child Sexual Abuse Material report also provides a series of recommendations for governments interested in establishing effective regulation in the digital space.

Survivors of CSAM are also calling for swift government regulation and industry accountability to ensure more children don’t fall victim to the same traumas.

“Project Arachnid gives us such profound hope that one day the sharing of CSAM can be eliminated and future generations of victims and survivors will not have to endure the trauma we have, but this report is worrisome. 42+ days to remove content is 42+ days these ESPs are enabling crimes against children, and 42+ days that these children will suffer again and again as their abuse continues. The Phoenix 11 stand together again to urge governments to hold ESPs accountable, to ensure prompt and permanent removal of CSAM on their platforms upon first notice,” says the Phoenix 11, a group of survivors of CSAM.

Detailed information on the key findings, case studies and recommendations can be found in the full report: protectchildren.ca/PAReport