08/31/2021 / By Mary Villareal
Apple plans to push an intrusive surveillance system to the more than one billion iPhones it has sold over the years. The company is expected to roll out the new feature with the launch of iOS 15 around mid-September. Devices belonging to its U.S. user base are the initial targets, with users in other countries spared for now.
The new surveillance system is meant to keep Apple's cloud servers from storing digital contraband such as child pornography and other illicit images uploaded by its customers. Under the new design, the phones themselves will perform searches on Apple's behalf, so law enforcement can be notified about "forbidden content" before photos or videos ever reach the iCloud servers.
This means that Apple plans to erase the boundary between devices that work for you, the owner and user, and devices that work for the company.
This is especially important because once the precedent is set that it is fit and proper for a pro-privacy company like Apple to make products that can betray their owners, Apple itself will lose control over how that capability is applied, and experts will begin probing its technical weaknesses. The system can be abused in many ways, beginning with the parameters of Apple's own design. (Related: Apple engineered surveillance back door into 600 million iPhones.)
The most contentious component of Apple's child sexual abuse material (CSAM) detection system is that devices must match images against a list of known CSAM image hashes, provided by the U.S. National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, before those images can be stored in iCloud.
Apple noted, “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.”
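To make the mechanism concrete, here is a minimal, hypothetical sketch of what "match on device before upload" amounts to: derive a fingerprint from each photo and test it against a locally held list. The OnDeviceMatcher type, the SHA-256 fingerprint and the plain string set are illustrative stand-ins of ours, not Apple's implementation; the real system uses a perceptual hash (NeuralHash) and a blinded database combined through private set intersection, so the device never sees the raw hash list or learns the match result.

```swift
import Foundation
import CryptoKit

// Hypothetical, heavily simplified illustration of "match on device before upload."
// Apple's real pipeline uses a perceptual hash (NeuralHash) and a blinded database
// combined via private set intersection, so the device never holds plain CSAM hashes
// and never learns the match result; none of that is reproduced here.
struct OnDeviceMatcher {
    // Stand-in for the hash list shipped inside the operating system.
    let knownHashes: Set<String>

    // Derive a fingerprint for the photo. SHA-256 keeps the sketch self-contained,
    // but a cryptographic hash would not survive resizing or re-encoding the way
    // a perceptual hash is designed to.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // True if the photo's fingerprint appears in the local list.
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(fingerprint(of: imageData))
    }
}
```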
The device will also create a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. Once a threshold of matches is reached, the company will manually review the vouchers and their metadata. If reviewers determine the content is CSAM, the account will be disabled and a report will be sent to NCMEC.
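A rough sketch of that threshold logic, under our own simplified assumptions, might look like the following. In Apple's actual design the gate is cryptographic (threshold secret sharing), meaning the server cannot decrypt any voucher until enough matching vouchers exist; here the gate is just a counter.

```swift
import Foundation

// Hypothetical sketch of the voucher threshold described above. In Apple's actual
// design the gate is cryptographic (threshold secret sharing), so the server is
// unable to open any voucher until enough matching vouchers exist; here the gate
// is simply a counter.
struct SafetyVoucher {
    let imageID: UUID
    let matched: Bool
    let encryptedMetadata: Data   // opaque payload carried alongside the match result
}

final class VoucherReviewQueue {
    private let threshold: Int
    private var matchingVouchers: [SafetyVoucher] = []

    init(threshold: Int) {
        self.threshold = threshold
    }

    // Accept a voucher; nothing is surfaced until the threshold is crossed.
    // Returns the accumulated batch for manual review once enough matches exist.
    func submit(_ voucher: SafetyVoucher) -> [SafetyVoucher]? {
        guard voucher.matched else { return nil }
        matchingVouchers.append(voucher)
        return matchingVouchers.count >= threshold ? matchingVouchers : nil
    }
}
```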
There is, however, a way to exempt a phone from the scans: simply disable iCloud Photos, which bypasses the search system entirely. This raises the question of whether the company designed the system to protect children or to protect its brand.
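The opt-out works because the matching step sits on the upload path. Here is a hypothetical sketch of that gating; the prepareForUpload function and its parameters are our own illustration, not Apple's API.

```swift
import Foundation

// Hypothetical gating: the match step sits on the iCloud Photos upload path, so
// switching uploads off means no photo is ever fingerprinted or matched.
// Returns nil when the user has opted out, otherwise the match result.
func prepareForUpload(_ imageData: Data,
                      iCloudPhotosEnabled: Bool,
                      isKnownCSAM: (Data) -> Bool) -> Bool? {
    guard iCloudPhotosEnabled else { return nil }   // opted out: no scan runs at all
    return isKnownCSAM(imageData)                   // scan happens only ahead of an upload
}
```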
Further, abuse cases are easy to imagine: governments that outlaw homosexuality could require the company to flag LGBTQ+ content, and an authoritarian regime could demand that the system spot satirical images or protest flyers.
WhatsApp head Will Cathcart said the Facebook-owned platform would not adopt Apple's approach and would instead rely on users reporting material. "This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable," he said.
Former Facebook chief security officer Alex Stamos also questioned Apple's approach, saying the company seems desperate to avoid building a real trust and safety function for its communications products. Apple's platforms offer no mechanism to report spam, death threats, hate speech or other kinds of abuse on iMessage.
Instead of Apple's "non-consensual scanning of local photos," Stamos said it would have been preferable if Apple had built robust reporting into iMessage, staffed a child-safety team to investigate reports and rolled out client-side machine learning gradually.
Follow further developments on this story at TechGiants.news.