WASHINGTON (AFP): Apple’s announcement that it would scan encrypted messages for evidence of child sexual abuse has revived debate on online encryption and privacy, raising fears the same technology could be used for government surveillance.
The iPhone maker said its initiative would help protect children from predators who use communication tools to recruit and exploit them, and would limit the spread of child sexual abuse material.
The move represents a major shift for Apple, which has until recently resisted efforts to weaken its encryption, which prevents third parties from seeing private messages. Apple argued in a technical paper that the technology, developed by cryptographic experts, is secure and expressly designed to preserve user privacy.
The company said it will have only limited access to the violating images, which would be flagged to the National Center for Missing and Exploited Children, a nonprofit organization.
Nonetheless, encryption and privacy specialists warned the tool could be exploited for other purposes, potentially opening a door to mass surveillance. “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?” said a tweet from Matthew Green, a cryptographer at Johns Hopkins University.
Others warned that the move could be a first step toward weakening encryption and opening back doors which could be exploited by hackers or governments. “There’s going to be enormous pressure on Apple from governments around the world to expand this capability to detect other kinds of ‘bad’ content, and significant interest by attackers across the spectrum in finding ways to exploit it,” tweeted Matt Blaze, a Georgetown University computer scientist and cryptography researcher.
Blaze said the implementation is potentially very risky because Apple has moved from scanning data on its services to scanning on the phone itself, where it has potential access to all of a user’s local data.
The new image-monitoring feature is part of a series of tools heading to Apple mobile devices, according to the company.
Apple’s texting app, Messages, will use machine learning to recognize sexually explicit photos and warn children and their parents when such photos are received or sent, the company said in the statement.