Digital rights groups say Apple’s new child protection policy weakens user privacy

Apple on Thursday announced a new set of child-safety measures for its devices, leading several digital rights organisations, including the European Digital Rights network (EDRi) and the Electronic Frontier Foundation (EFF), to raise concerns over the privacy and security of the company’s large global customer base.

Under the new policy, Apple will scan messages sent or received by a minor’s account to warn children and their parents about sexually explicit photos. When a child receives such a photo, it will be blurred, and the child will be reassured that it is all right not to view it and will be pointed to helpful resources. A child attempting to send such a photo will receive a similar warning, and parents can also be notified if their child chooses to view or send sexually explicit photos.
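
As a rough sketch of the flow just described (not Apple’s implementation; the classifier, the account fields, and the return values are all hypothetical stand-ins), the receiving side might be pictured like this:

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    is_child: bool = True
    parental_notifications_enabled: bool = True

def looks_sexually_explicit(photo_bytes: bytes) -> bool:
    """Hypothetical stand-in for the on-device image classifier."""
    return False  # placeholder: no real classification is performed here

def handle_received_photo(photo_bytes: bytes, account: ChildAccount,
                          child_chooses_to_view: bool = False) -> dict:
    """Sketch of the receiving-side flow: blur, warn, optionally notify parents."""
    if not account.is_child or not looks_sexually_explicit(photo_bytes):
        return {"blurred": False, "warned": False, "parents_notified": False}
    return {
        "blurred": True,   # the photo is blurred
        "warned": True,    # the child is warned and offered resources
        "parents_notified": account.parental_notifications_enabled
                            and child_chooses_to_view,
    }
```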

Additionally, Apple will scan photos that users upload to iCloud Photos to identify Child Sexual Abuse Material (CSAM) and report these instances to the National Center for Missing and Exploited Children (NCMEC). Apple will match this content against an unreadable database of known CSAM image hashes, provided by child safety organisations and stored in the operating system of users’ devices.
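
As a rough illustration of the matching step, the sketch below checks an image digest against a set of known hashes. It is a simplification: Apple’s system derives a perceptual hash of the image rather than a cryptographic hash of the file bytes, and the on-device database is blinded so neither the device nor its user can read it; the names and placeholder values here are hypothetical.

```python
import hashlib

# Hypothetical stand-in for the on-device database of known CSAM hashes
# supplied by child safety organisations (placeholder values, not real hashes).
KNOWN_CSAM_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

def image_digest(image_bytes: bytes) -> str:
    """Simplified digest: Apple's design uses a perceptual image hash,
    not a byte-level cryptographic hash such as SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """On-device check against the known-hash set before upload to iCloud Photos."""
    return image_digest(image_bytes) in KNOWN_CSAM_HASHES
```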

Where a match is found, a cryptographic safety voucher encoding the result will be uploaded to iCloud Photos along with the image. Apple will not be able to interpret the contents of these safety vouchers unless a user’s account exceeds a match threshold preset by the company, after which Apple will manually review each match, disable the user’s account, and send a report to NCMEC.
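
The threshold mechanism can be pictured, in highly simplified form, as a per-account tally of match vouchers that only triggers human review once a preset limit is exceeded. Apple’s actual design is stronger, using cryptography so its servers learn nothing about individual vouchers below the threshold; the class names and threshold value below are illustrative assumptions, not figures from Apple.

```python
from dataclasses import dataclass, field

# Illustrative threshold only; the policy says the limit is preset by Apple
# but does not state a number.
MATCH_THRESHOLD = 30

@dataclass
class SafetyVoucher:
    """Stands in for the encrypted voucher uploaded alongside a matched image;
    in Apple's design its contents are unreadable below the threshold."""
    image_id: str
    encrypted_match_payload: bytes

@dataclass
class AccountMatches:
    vouchers: list = field(default_factory=list)

    def record(self, voucher: SafetyVoucher) -> None:
        self.vouchers.append(voucher)

    def review_required(self) -> bool:
        # Only once the preset threshold is breached can Apple open the
        # vouchers, manually review each match, disable the account,
        # and file a report with NCMEC.
        return len(self.vouchers) > MATCH_THRESHOLD
```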

EDRi and the EFF recognise the serious problem posed by online child exploitation, but argue that the changes “build a backdoor into [Apple’s] data storage system and its messaging system” and that it is “impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children.” Rather, they say, Apple is compromising the end-to-end encryption that protects citizens against state surveillance: with only modest tweaks or an expansion of the machine-learning parameters, the same system could be used to scan all users’ devices, opening the floodgates to misuse by authoritarian regimes.

They also highlighted that machine-learning classifiers, when used without human oversight, routinely misclassify content, including sexually explicit content, and that employing such tools to scan users’ iCloud Photos will have a “chilling effect.”

The new policy has also drawn criticism from experts such as Edward Snowden, Matthew Green, and Kendra Albert. WhatsApp head Will Cathcart has said his company will not implement this policy. A consortium of legal experts, cryptographers, researchers, professors, and Apple customers has also written an open letter asking the company to halt the deployment of the new policy and to reaffirm its commitment to end-to-end encryption and user privacy.