Apple Wants To Scan People's Phones For Child Sex Abuse Images

The tech giant hopes it will be able to detect and then report such material to the police.
Apple's new tool will scan photos uploaded to iCloud
SOPA Images via Getty Images

Apple will start rolling out a new function later this year that will allow it to detect child sexual abuse content and report it to law enforcement.

The company said it hopes this technology, called NeuralHash, will maintain user privacy while detecting child sexual abuse material (CSAM).

It is a particularly significant move from the tech titan, as Apple has previously allowed users to encrypt their own data before it reaches its iCloud servers.

But NeuralHash can identify whether images related to child abuse are uploaded to iCloud without having to decrypt them first.

How does it work?

The function operates by turning iPhone or Mac photographs into a string of letters and numbers – similar images should result in the same code.

This code is then matched against hashes of known abuse images provided by child protection organisations, identifying indecent material without alerting the user or revealing the image.

Only Apple can then decrypt and verify the matched content, disable the account and report it.
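To make the matching idea concrete, here is a minimal sketch of perceptual hashing using a simple average hash and Hamming-distance comparison. It is illustrative only and is not Apple's actual NeuralHash, which uses a neural network and cryptographic matching protocols; the Pillow library, the "upload.jpg" file name, the threshold and the KNOWN_HASHES values are all assumptions made for the example.

```python
# Illustrative perceptual-hash matching, NOT Apple's NeuralHash.
# Assumes the Pillow imaging library (pip install Pillow) and a
# hypothetical set of "known" hashes standing in for the databases
# supplied by child protection organisations.
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image to hash_size x hash_size greyscale pixels and
    encode each pixel as 1 if it is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes; a small distance
    indicates visually similar images."""
    return bin(a ^ b).count("1")


# Hypothetical placeholder hashes of known abuse imagery.
KNOWN_HASHES = {0x1234567890ABCDEF, 0x0F0F0F0F0F0F0F0F}


def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag an upload if its hash is within `threshold` bits of any
    known hash; real systems tune this to keep false positives rare."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)


if __name__ == "__main__":
    print(matches_known_image("upload.jpg"))  # hypothetical local file
```

The point of a perceptual hash is that it changes only slightly when an image is resized, cropped or recompressed, which is why matching tolerates a small Hamming distance rather than requiring exact equality.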

The company is not alone in scanning its users’ files for illegal activity – Dropbox, Google and Microsoft all do the same – but Apple has previously pushed back against scanning files before they are uploaded to iCloud, citing users’ privacy.

The new technology will be part of the iPhone update iOS 15 and its desktop counterpart macOS Monterey, but will only be rolled out in the US for now.

The company has also said there is a one-in-a-trillion chance of a false positive, but has put an appeals process in place.

Apple will be rolling out the new feature in the US later this year
Jason Lam via Getty Images

This new tool has raised privacy concerns

While Apple’s efforts to tackle CSAM have been widely praised, some in the tech field are concerned about the element of surveillance involved with this new tool.

Matthew Green, who teaches cryptography at Johns Hopkins University, tweeted that scanning functions are “not dissimilar to the tools that repressive regimes have deployed – just turned to different purposes”.

These are bad things. I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.

— Matthew Green (@matthew_d_green) August 5, 2021

He also pointed out that the new technology does not stop users from making “problematic images that ‘match’ entirely harmless images”.

Green added: “Regardless of what Apple’s long term plans are, they’ve sent a very clear signal.

“In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.”

He claimed this could set a dangerous precedent across the world, as it could mean “governments will demand it from everyone” in time.

Security agencies and law enforcement services have also been trying to encourage tech giants to remove encryption from their services for years, in a bid to uncover users’ behaviour, but Apple has regularly pushed back.

Others are alarmed that the tech giant did not launch a public discussion about the matter before rolling it out, and that the system could be misused by people deliberately flooding victims with CSAM in order to have them flagged.

The company defended itself by saying a manual review looks at all of the evidence when the system alerts it to a particular account’s activity.

Apple also maintained that the feature is optional; users do not have to use iCloud Photos, where the new tool will operate.

I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.

— Matthew Green (@matthew_d_green) August 4, 2021

Apple’s other plans to protect children

The feature is one of several which Apple is introducing in a bid to protect children who use its services from harm online.

It will also be introducing filters which can block potentially sexually explicit images sent and received through a child’s iMessage account.

Apple also plans to introduce a new function which intervenes when a user attempts to search for terms related to CSAM through Siri and Search.

  • Childline - free and confidential support for young people in the UK - 0800 1111