Apple has removed references to CSAM from its Child Safety site

Apple has stripped all references to CSAM detection from its Child Safety webpage, suggesting it may be planning to abandon the controversial feature, which would detect images of child sexual abuse on users’ iPhones and iPads.

Announced in August, Apple’s planned CSAM capabilities included client-side (i.e. on-device) scanning of users’ iCloud photo libraries for known child sexual abuse material (CSAM), so that the appropriate authorities could be notified. Critics argued that such a system could give governments around the world a window into users’ iPhones and open the door to surveillance of law-abiding citizens. Apple strongly rejected that possibility, saying it would refuse any such government request, but security researchers remained unconvinced.

Despite these reassurances, criticism continued, and Apple has now removed references to the CSAM technology from its Child Safety website. Meanwhile, iOS 15.2 added a feature that allows the Messages app to detect nudity in images sent to children, but the release makes no mention of CSAM detection.

With this latest move, it is unclear whether Apple has merely delayed the feature’s launch or decided to shelve the initiative altogether.
