July 2 – Concerns over the detection and management of child sexual abuse material (CSAM) on OnlyFans have surfaced, prompting scrutiny from investigators and experts familiar with the platform.
OnlyFans, based in the UK, voluntarily reports suspected CSAM cases to the National Center for Missing & Exploited Children (NCMEC), despite not being legally required to do so under US law. In 2023, the platform reported 347 instances to NCMEC’s CyberTipline, a figure the company cites as evidence of its safety protocols and prompt removal of offending content.
However, verifying the extent of CSAM on OnlyFans remains challenging. The platform acknowledges that many reported cases turn out to be duplicates or non-CSAM content. Compounding this issue are the individual paywalls maintained by its 3.2 million creators, which block access to content unless a user subscribes.
Trey Amick, director of forensic consultants at Magnet Forensics Inc, emphasized the difficulty: “It’s not just one paywall. It’s a paywall for each and every contributor.” This fragmented access complicates efforts by law enforcement to independently verify potentially illicit material.
OnlyFans asserts that once law enforcement engages in a case, it provides comprehensive cooperation, including account details and content, without requiring a subscription. Even so, the operational challenges posed by paywalls remain a significant hurdle in effectively addressing concerns about CSAM on the platform.
