Apple delays plans for iCloud scanning of CSAM

Last month, Apple announced a new capability for iCloud through which it would compute hashes of uploaded images and compare them against the hashes of known child sexual abuse material (CSAM). If a match was found, authorities would be notified. The system combined automation with human moderation. Following backlash, Apple has now decided to delay its rollout.


In an updated statement posted to Apple's documentation for its child safety features, the company notes:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Although Apple's NeuralHash algorithm for CSAM detection computes hashes of images and scans those against a database of known CSAM hashes, rather than comparing the images themselves, it drew considerable backlash from privacy advocates, who cited its potential for abuse, especially at global scale. Several researchers reverse-engineered the Cupertino tech giant's solution and demonstrated that hash collisions are quite possible between visually different images.
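To see why hash-based matching can collide, consider a toy "average hash" (a deliberately simplified stand-in, not Apple's NeuralHash): each pixel becomes a 1 if it is brighter than the image's mean, else a 0, and two images "match" when their bit strings agree. All image values below are made up for illustration.

```python
# Toy perceptual-hash sketch (illustrative only; NOT Apple's NeuralHash).
# An "average hash" maps an image to bits: 1 where a pixel exceeds the
# image's mean brightness, 0 otherwise. Matching compares hashes, never
# raw images -- but visually different images can still collide.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255). Returns a bit tuple."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits; 0 means the hashes match."""
    return sum(a != b for a, b in zip(h1, h2))

known     = average_hash([10, 10, 200, 200])   # stand-in database entry
candidate = average_hash([12, 11, 190, 210])   # slightly edited copy
unrelated = average_hash([0, 5, 250, 255])     # different image, same bright/dark layout

print(hamming(known, candidate))  # 0: the near-duplicate matches, as intended
print(hamming(known, unrelated))  # 0: a collision despite different content
```

The last line is the crux of the researchers' criticism: because the hash only captures a coarse summary of the image, unrelated images can produce identical hashes, so matches require human review before any action is taken.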

At the time, Apple privately remarked that it understood people's concerns but attributed them largely to misunderstandings. It even released a six-page document explaining the entire solution, but that clearly was not enough for privacy advocates. As a result, Apple has now decided to delay the rollout of this feature by at least a few months.
