Apple iOS15 May Add Pre-Loaded Software to Scan Your Devices for CSAM

The Economist, August 14th 2021, p. 23 | United States | Finding abused children | "Hashing ambiguous" | "A 38-year-old charity will be integrated into Apple's newest operating system"


Read The Economist for all the details.

Summary offered by 2244

A 38-year-old charity, The National Center for Missing & Exploited Children (NCMEC), was established and funded by "an act of Congress in 1983 but it is not part of the American government." NCMEC, pronounced "nic-mic," "operates as a clearing house for information about abducted children." With the increasing use of the internet, NCMEC added a focus on "harm caused through online activity, specifically the trade of imagery depicting the sexual abuse of children." It has built a database of CSAM (Child Sexual Abuse Material), encoding each image as "a long, unique string of letters and numbers known as a hash...then sharing those hashes with companies that wanted to scan their services for CSAM." Companies apply the same hashing process to images on their own services, compare the results against the database of known CSAM hashes, and report any matches back to NCMEC. NCMEC and its collaborators are shielded from liability when sharing these images with law enforcement and the hashes with companies.
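The database-matching workflow described above can be sketched in a few lines of Python. To be clear, this is only an illustrative sketch: Apple's actual system reportedly uses a perceptual hash ("NeuralHash") that tolerates minor image edits, whereas the exact-match SHA-256 approach below, with its hypothetical file names and stand-in hash set, simply shows the "hash each file and look it up in a database" idea.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes (stand-in values, not real data).
# In a real deployment this set would come from NCMEC's shared hash database.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def scan(directory: Path) -> list[Path]:
    """Flag every file under `directory` whose hash appears in the database."""
    return [
        p
        for p in directory.rglob("*")
        if p.is_file() and file_hash(p) in KNOWN_HASHES
    ]
```

Note the one-way nature of the scheme: the database holds only hashes, so a company holding it never sees the original images; it can only recognize a file it already has a hash for.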

Since companies began collaborating in this hash-based scanning, the number of tips to NCMEC has grown from 220,000 in 2011 to 21.7 million in 2020. Making news now, Apple, noting that many technology companies "already use NCMEC's hash database to scan their own cloud servers for CSAM," announced that iOS15 will ship with NCMEC's hash database pre-installed. This newly added software will "scan users' [iPhones, laptops, etc.] for CSAM automatically."

The revelation that NCMEC's hash database will be added to iOS15 on personal devices has "stimulated fierce debate about whether the new system will provide an avenue to expand their capacity to scan private devices for other illicit content." Because of the Fourth Amendment to the US Constitution, without a warrant "the government could not easily force NCMEC or Apple to tweak this phone-scanning capability to look for other things." Reportedly, "arrests can be made only because of the voluntary nature of CSAM scanning." Such collaboration "of third parties has long been all that makes it possible for law enforcement to track down child abusers in private spaces." Will Americans be willing to collaborate in this way on their own devices?