As a follow-up to our earlier piece on Apple’s well-intentioned but dangerous, privacy-violating app update: Apple is defending its planned scanning of phones for possible child sex abuse photos, after a company memo called privacy advocates “screeching voices.”

On Monday, the company responded to critics by stating that it will refuse any government demands to expand its new photo-scanning technology beyond the current plan of using it only to detect CSAM (child sexual abuse material).

However, the Big Tech giant’s promises likely won’t convince many privacy advocates.

As ADN reported:

Apple is reportedly planning an iPhone update with something called a “neuralMatch” Artificial Intelligence (AI) system to allow it to “continuously scan photos that are stored on a US user’s iPhone and [photos that] have also been uploaded to its iCloud back-up system.”

It would scan iPhones and iCloud for images of child sexual abuse in an attempt to help law enforcement track down abusers.

While this effort is extremely well-intentioned, it opens a dangerous Pandora’s Box of security and privacy concerns.

Ars Technica noted:

Apple has faced days of criticism from security experts, privacy advocates, and privacy-minded users over the plan it announced Thursday, in which iPhones and other Apple devices will scan photos before they are uploaded to iCloud. Many critics pointed out that once the technology is on consumer devices, it won’t be difficult for Apple to expand it beyond the detection of CSAM in response to government demands for broader surveillance.

In an FAQ Apple released today, titled “Expanded Protections for Children,” one question asks, “Could governments force Apple to add non-CSAM images to the hash list?” The company’s reply:

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC (National Center for Missing and Exploited Children) and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.

Of course, a promise to refuse such demands is not the same as an inability to comply with them. Nothing in the system’s current design prevents it from being redesigned and used for other purposes in the future. And the entire effort is itself a major change for a company that has used privacy as a selling point for years and calls privacy a “fundamental human right.”
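For readers curious about the mechanics, here is a deliberately simplified sketch of what matching photos against a hash list looks like. It is illustrative only: the names (KNOWN_HASHES, fingerprint, scan_library) are made up, and an ordinary SHA-256 digest stands in for the perceptual “neuralMatch” hash Apple is reported to use, which is designed to survive resizing and re-encoding. The point of the sketch is that the matching code is agnostic about what the blocklist represents; flagging different material would only require different hash entries.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-image fingerprints. In Apple's system the
# entries would come from NCMEC and other child-safety groups; the value
# below is a placeholder, not a real hash from any such database.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_path: Path) -> str:
    """Return a fingerprint for a photo.

    This toy version hashes the raw bytes, so it only matches exact copies.
    A perceptual hash (as "neuralMatch" is reported to be) would also match
    resized or re-encoded versions of the same image.
    """
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Flag every photo whose fingerprint appears in the blocklist."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if fingerprint(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    matches = scan_library(Path("~/Pictures").expanduser())
    print(f"{len(matches)} photo(s) matched the blocklist")
```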

The real question about Apple succumbing to government demands to scan users’ phones for other purposes is not “if,” but “when.”




