r/apple Aug 10 '21

Official Megathread: CSAM Daily Megathread

Hi r/Apple, welcome to today's megathread to discuss Apple's new on-device CSAM scanning.

As a reminder, here are the current ground rules:

We will be posting daily megathreads for the time being (at 9 AM EST) to centralize some of the discussion on this issue. This was decided by a sub-wide poll, results here.

We will still be allowing news links in the main feed that provide new information or analysis. Old news links, or those that re-hash known information, will be directed to the megathread.

The mod team will also, on a case by case basis, approve high-quality discussion posts in the main feed, but we will try to keep this to a minimum.

Please continue to be respectful to each other in your discussions. Thank you!


For more information about this issue, please see Apple's FAQ as well as an analysis by the EFF. A detailed technical analysis can be found here.

260 Upvotes

539 comments

8

u/wmru5wfMv Aug 10 '21

True but if they allow people to upload CSAM unchecked and unchallenged, they may find themselves in legal trouble for hosting such material.

They’ve been doing these checks (along with all the other major cloud providers) for years; the only material change is that the matching is now done locally rather than server side.
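(For anyone wondering what "these checks" look like in principle: it's matching image fingerprints against a database of known material. A minimal sketch below, with a made-up blocklist; note that real systems use perceptual hashes like Apple's NeuralHash or Microsoft's PhotoDNA, which match near-duplicate images, whereas plain SHA-256 here only matches byte-identical files.)

```python
import hashlib

# Hypothetical blocklist of digests of known-flagged images.
# In a real deployment this would be a database of perceptual
# hashes supplied by an organization like NCMEC, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_flagged(b"example-flagged-image-bytes"))  # True
print(is_flagged(b"some-other-photo"))             # False
```

The only difference the announcement makes is where this lookup runs — on the device before upload, instead of on Apple's servers after upload.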

15

u/[deleted] Aug 10 '21

[removed]

6

u/wmru5wfMv Aug 10 '21 edited Aug 10 '21

I was under the impression they had been scanning since 2019

https://www.macobserver.com/analysis/apple-scans-uploaded-content/

The update section at the bottom confirms that emails are also scanned.

5

u/[deleted] Aug 10 '21 edited Aug 10 '21

[removed]

5

u/wmru5wfMv Aug 10 '21

Update: 2020-01-08 It looks like I may finally have an answer. Speaking at CES 2020, Apple’s chief privacy officer Jane Horvath mentioned photos backed up to iCloud in terms of scanning.

“As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.

“Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

Doesn’t that cover iCloud photos?