UK Charity Criticizes Apple for Low Reporting of Child Sexual Abuse Material

ICARO Media Group
22/07/2024 18h15

A child protection charity in the UK has accused Apple of lagging behind its peers in addressing child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC) revealed that Apple reported just 267 suspected CSAM cases worldwide to the US National Center for Missing & Exploited Children (NCMEC) last year. By stark contrast, Google reported 1.47 million potential cases, and Meta a staggering 30.6 million.

Other platforms, including TikTok, X, Snapchat, Xbox, and PlayStation/Sony Interactive Entertainment, also each reported more potential CSAM cases than Apple in 2023, with figures ranging from 3,974 to 590,376. The numbers underline a significant disparity between Apple's reporting of suspected CSAM and that of its peers.

In a concerning revelation, the NSPCC highlighted that Apple was implicated in more CSAM cases in England and Wales alone (337, between April 2022 and March 2023) than it reported worldwide in an entire year. The charity obtained this data through freedom of information requests to local police forces.

Apple's encryption features, such as end-to-end encryption in iMessage and FaceTime and optional end-to-end encryption for iCloud, prevent the company from viewing the content users share. However, other services with end-to-end encryption still report far more suspected CSAM to NCMEC: WhatsApp alone reported around 1.4 million cases in 2023.

Richard Collard, the NSPCC's head of child safety online policy, expressed concern over the gap between the number of child abuse image crimes occurring on Apple's services in the UK and the company's limited reports to authorities worldwide. Collard emphasized that Apple is falling behind its peers in tackling child sexual abuse and urged all tech firms to invest in safety and to prepare for the implementation of the UK's Online Safety Act.

The Online Safety Act, passed into law in 2023 and now being phased into enforcement, aims to strengthen online safety measures and regulations, particularly concerning the spread of harmful content and child sexual abuse material. As compliance deadlines approach, tech companies are expected to take active steps to prevent, detect, and report CSAM to the appropriate law enforcement agencies.

Apple has not yet responded to the NSPCC's claims. The tech giant faces mounting scrutiny and pressure to address CSAM on its platforms more robustly and to ensure the safety of users online, particularly children.

The views expressed in this article do not reflect the opinion of ICARO, or any of its affiliates.
