Apple Unveils Plan to Scan iPhones for Child Sexual Abuse Images

Child Sexual Abuse Images Auto-Scanned Through New Apple Tool

Aug 8, 2021

Apple announced plans Thursday to begin testing a new system that automatically matches photos uploaded to iCloud from iPhones against a database of known child sexual abuse images. Through the new system, Apple said it plans to alert authorities as needed.

At a press conference, Apple explained how the new “NeuralHash” tool will detect known images of child sexual abuse. If it finds a match, the image will then be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled, and the National Center for Missing and Exploited Children (NCMEC) will be immediately notified.
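To make the matching idea concrete, here is a minimal, illustrative sketch of comparing image "fingerprints" against a database of known material. It is not Apple's NeuralHash, which uses a neural, perceptual hash and cryptographic matching; the SHA-256 digest, the database entries, and the function names below are assumptions for illustration only.

```python
# Illustrative sketch only: matching image "fingerprints" against a database of
# known material, rather than looking at the images themselves. Apple's actual
# NeuralHash is a neural, perceptual hash combined with cryptographic matching;
# the SHA-256 digest, database entries, and function names here are assumptions.
import hashlib

# Hypothetical database of fingerprints (hex digests) of already-identified images.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for the image. A real system would use a perceptual
    hash that survives resizing or recompression; SHA-256 is a stand-in."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """True only if the image's fingerprint already appears in the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# An unrelated photo produces a different fingerprint and is never flagged.
print(is_known_match(b"innocent family photo bytes"))  # False
```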

Separately, Apple announced plans to scan messages on children’s devices for sexually explicit content, calling the move a child safety measure (detailed below). For the photo-matching system, the device creates an encrypted “safety voucher” – essentially a packet of information about the match result – that is uploaded along with each photo. Once a certain number of safety vouchers are flagged as matches, Apple’s internal review team will be alerted.
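A minimal sketch of that threshold idea follows, under stated assumptions: the class names, plain-text vouchers, and threshold value are invented for illustration, whereas Apple's real vouchers are encrypted so that nothing is readable until the threshold is crossed.

```python
# Minimal sketch of the threshold idea: every uploaded photo carries a voucher
# recording whether its fingerprint matched, and human review is triggered only
# after enough matches accumulate. The class names, plain-text vouchers, and
# threshold value are illustrative assumptions, not Apple's design.
from dataclasses import dataclass, field
from typing import List

REVIEW_THRESHOLD = 30  # assumed value for illustration only

@dataclass
class SafetyVoucher:
    photo_id: str
    matched: bool  # did this photo's fingerprint match the known database?

@dataclass
class Account:
    vouchers: List[SafetyVoucher] = field(default_factory=list)

    def upload_photo(self, photo_id: str, matched: bool) -> None:
        """A voucher accompanies every upload, matched or not."""
        self.vouchers.append(SafetyVoucher(photo_id, matched))

    def needs_human_review(self) -> bool:
        """Alert the review team only once matches reach the threshold."""
        return sum(v.matched for v in self.vouchers) >= REVIEW_THRESHOLD
```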

Users who feel their accounts have been mistakenly flagged can file an appeal to have them reinstated.

Our Sexual Assault Justice Experts are here to help survivors secure justice. Contact our top-rated attorneys online or by phone for a free consultation today.

As the Associated Press (AP) reported, parents snapping an innocent photo of a child in the bath “presumably need not worry.” The detection system is designed to only flag images that are already in the center’s database of known child pornography.

But, as the AP also noted, some researchers say the matching tool – which isn’t capable of “seeing” such images, only the mathematical “fingerprints” that represent them – could be used for more nefarious purposes.

CNN published a quote from Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology:

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world… Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services,” Nojeim said.

In a post on its website outlining the updates, Apple noted that its new photo-scanning method was “designed with user privacy in mind.”

Two of Dordulian Law Group’s SAJE Team (Sexual Assault Justice Experts) members applauded the move by Apple.

Sam Dordulian, former sex crimes prosecutor and Deputy District Attorney for Los Angeles County, called the initiative “necessary.”

“I don’t think people realize just how rampant child sexual abuse truly is, or the prevalence of child pornography, which leads to horrific crimes like sex trafficking and child slavery. Having a check in place to match already identified child sexual abuse images is important as well as necessary,” Dordulian said.

DLG’s in-house Chief Investigator, Moses Castillo, a retired LAPD sex crimes detective who served in the Abused Child Unit, also hailed Apple’s decision.

“I’ve seen the devastation that child sexual abuse can wreak, and any measure that can help prevent someone from becoming a victim should absolutely be implemented. For those who claim this is a privacy intrusion, think about if it were your child who was kidnapped and forced into a life of sex slavery,” Castillo said.

Apple’s NeuralHash announcement was part of a broader push by the company to focus on child safety. On Thursday, Apple said a new communication tool will also warn users under age 18 when they’re about to send or receive a message with an explicit image. The tool, which must first be activated through Family Sharing, uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit.

Additionally, parents with children under the age of 13 can turn on a notification feature in the event that a child is about to send or receive a nude image. Apple confirmed that it will not have access to those specific messages.
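That communication-safety flow can be sketched as a simple on-device decision: a local classifier scores the attachment, and the device alone decides whether to warn the user and, for children under 13 with the feature enabled, notify a parent. The classifier stub, score threshold, and function names below are hypothetical, not Apple's API.

```python
# Hedged sketch of the on-device communication-safety flow described above.
# The classifier stub, score threshold, and function names are hypothetical,
# not Apple's API; the point is that the decision is made locally and the
# message content never needs to leave the device.
EXPLICIT_SCORE_THRESHOLD = 0.9  # assumed confidence cutoff

def classify_explicit(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML model returning an 'explicit' score in [0, 1]."""
    return 0.0  # dummy score so the sketch runs end to end

def handle_attachment(image_bytes: bytes, user_age: int,
                      parental_alerts_enabled: bool) -> dict:
    """Decide, on the device, whether to warn the user and notify a parent."""
    flagged = classify_explicit(image_bytes) >= EXPLICIT_SCORE_THRESHOLD
    return {
        "blur_and_warn": flagged and user_age < 18,
        "notify_parent": flagged and user_age < 13 and parental_alerts_enabled,
    }

# Example: a 12-year-old's account with parental notifications turned on.
print(handle_attachment(b"attachment bytes", user_age=12, parental_alerts_enabled=True))
```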

According to Apple, that tool will be available as a future software update.

Our experienced attorneys can help you pursue a financial award for your personal injury case. Contact us online or by phone for a free consultation today.

Obtaining Justice for Child Sexual Abuse Survivors

DLG is California’s leading child sexual abuse firm, offering survivors a four-tiered support network that includes added resources beyond expert legal representation. Our SAJE Team of handpicked professionals is available 24/7 to survivors throughout the legal process (and beyond). We believe that child sexual abuse survivors deserve the absolute best legal representation available – and more.

Led by Sam Dordulian, a former sex crimes prosecutor with more than 100 jury trial victories, DLG provides survivors with unparalleled dedication and personalized attention that is evident in our experience and proven results:

  • More than $100,000,000 successfully recovered in settlements and verdicts
  • A 98% success record
  • Former Deputy District Attorney for Los Angeles County
  • More than 100 jury trial victories
  • Former sex crimes prosecutor
  • In-house retired LAPD sex crimes detective as Chief Investigator

Contact a member of our SAJE Team today online or by phone at 818-322-4056 for a free consultation. At DLG, we’ll fight aggressively to ensure justice is secured and you recover the maximum financial compensation you deserve for your sexual abuse claim.


Go See Sam