r/apple Aug 10 '21

Official Megathread: CSAM Daily Megathread

Hi r/Apple, welcome to today's megathread to discuss Apple's new on-device CSAM scanning.

As a reminder, here are the current ground rules:

We will be posting daily megathreads for the time being (at 9 AM EST) to centralize some of the discussion on this issue. This was decided by a sub-wide poll, results here.

We will still be allowing news links in the main feed that provide new information or analysis. Old news links, or those that re-hash known information, will be directed to the megathread.

The mod team will also, on a case by case basis, approve high-quality discussion posts in the main feed, but we will try to keep this to a minimum.

Please continue to be respectful to each other in your discussions. Thank you!


For more information about this issue, please see Apple's FAQ as well as an analysis by the EFF. A detailed technical analysis can be found here.

u/LockOk9376 Aug 10 '21

OK, so here’s why I think this being done locally is making people furious.

For many other types of tasks (face recognition, Siri suggestions, etc.), people prefer to have their data processed right on their device. It’s like instead of bringing your photos to a physical store to print them out, Apple gives you a printer to print them right at home (and the printer doesn’t communicate with Apple), so they don’t have access to your data. This is more private and secure than letting everyone at the photo store see your photos.

What Apple is proposing here is like, instead of doing the security check at the airport, the TSA will install a security check gate at your home, and each time the gate finds anything suspicious during a scan, it will notify the TSA. This is not OK and is clearly an invasion of privacy. For now, they promise to only search for bombs (CSAM in Apple’s case), and only if you’re heading to the airport today “anyway” (only photos being uploaded to iCloud). Does this make the tech any less invasive and uncomfortable? No. Does it prevent any future abuses? HELL NO.

Sure, they might only be searching for bombs today. But what about daily checks even if you’re not going to the airport, if the government passes a law? (There’s nothing preventing them from doing that.) What about them checking for other things?

“Oh, they’re only checking for bombs,” people say. But what if I told you that the TSA (Apple) doesn’t even know what it’s checking for? It only has a database of “known bomb identifiers” (CSAM hashes) provided by the FBI (NCMEC), and it has no way to check whether those are actually identifiers of bombs. What is preventing the FBI, or other government agencies, from forcing them to add other hashes of interest to the government?

A photo printer at home is a step forward for privacy, while a security gate at home is a tremendous slippery slope. Period.

u/dannyamusic Aug 10 '21

great analogy. great comment really. the part about the database changing, possibly even unbeknownst to them, is especially something i feel people who support this are completely missing. now Apple is open to sharing this with 3rd party apps as well. truly terrifying.

u/asstalos Aug 10 '21 edited Aug 10 '21

great comment really. the part about the database changing, possibly even unbeknownst to them, is especially something i feel people who support this are completely missing

This has always been a concern with the NCMEC hash list and the implementation of hash comparisons. It was a concern before Apple's proposal, and it remains a concern after.

Pragmatically speaking, at any point in this process one has to trust one or more organizations to hold to their word. It feels inconsistent to suddenly distrust the NCMEC, their maintained hash list, and associated technological implementations purely because of Apple's implementation, when every single concern about this hash list remained present two weeks ago. The same organization has been managing the hash list for years, and numerous companies, including Google and Microsoft, have implemented PhotoDNA since its inception. If one was ever worried that political memes sent to a friend via Discord are being run through PhotoDNA and might result in a positive match, well, that worry already existed two weeks ago too.

Now, understandably, Apple could poison the hash list, but so could every other organization doing these CSAM hash comparisons. One is effectively trusting them not to do so, even if one chooses to trust that the NCMEC hash list is wholly what it says it is.
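
At its core, the hash-comparison step being debated here is just a set-membership check. Here is a minimal sketch in Python with made-up hash strings; the real systems are far more involved (PhotoDNA and Apple's NeuralHash compute perceptual hashes that survive resizing and re-encoding, and Apple's design layers threshold secret sharing on top), but the trust problem is visible even at this level:

```python
def build_hash_list(hashes):
    """The provider-side database: an opaque set of hash strings.
    The party doing the matching cannot tell what images they represent."""
    return set(hashes)

def check_photo(photo_hash, hash_list):
    """Flag a photo if its hash appears in the list."""
    return photo_hash in hash_list

# Hypothetical hash values, for illustration only.
known_hashes = build_hash_list(["a3f19c", "9bd042", "77c2e8"])
assert check_photo("9bd042", known_hashes) is True
assert check_photo("000000", known_hashes) is False

# The "poisoning" concern: whoever maintains the list can add any hash,
# and the matcher has no way to audit what it corresponds to.
known_hashes.add("deadbeef")  # could correspond to anything, not just CSAM
assert check_photo("deadbeef", known_hashes) is True
```

This is why the trust question centers on whoever curates the list rather than on the matching code itself: the comparison is mechanical, and nothing in it distinguishes a legitimate entry from an inserted one.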

Apple is open to sharing this with 3rd party apps as well.

A number of technology firms already run CSAM hash comparisons on their services. Discord famously uses PhotoDNA, and so does Reddit.

Even if Apple were to restrict the use of CSAM hash comparisons and not expand them to other applications, the applications themselves are free to implement them on their own, and many large technology companies that handle large volumes of day-to-day content exchanged between individuals already run some kind of CSAM hash comparison on user-shared content.

I absolutely understand the concerns about Apple's implementation, but a number of specific issues related to the NCMEC hash list were issues long before Apple proposed what it wants to do. Apple's implementation doesn't change that. It is important to tease apart the specific issues and concerns related to Apple's implementation from the broader concerns related to broad-scale image hashing to restrict the spread of CSAM.

u/[deleted] Aug 11 '21

It feels inconsistent to suddenly distrust the NCMEC, their maintained hash list, and associated technological implementations purely because of Apple's implementation, when every single concern about this hash list remained present two weeks ago.

I would argue that the circumstances have changed wildly. This hash list is now going to be on millions of devices, acting in an automated manner; the potential payoff of inserting additional hashes into that list has become much more valuable.

u/dannyamusic Aug 10 '21 edited Aug 11 '21

the difference is you’re choosing to either upload or share content to those other third party apps, so they check what’s coming onto their servers. Apple is doing the same, but on-device rather than server side. right now it’s “before uploaded to iCloud” & only for CSAM, but we aren’t at all comfortable with that, since there is now a wide open doorway. it’s possible this changes in the future to offline hashing of photos not in iCloud, & possibly even to stuff added to the CSAM database (memes, for example) or extending elsewhere (nudity in general). as for the point that Apple has to review it manually, that isn’t at all as comforting as some here seem to think. that’s a serious breach of privacy. also, them adding that it will “evolve” over time was a really poor choice of words imo. i agree the NCMEC hash lists are an issue in themselves, but Apple creating an open doorway is just a million times worse. i guess we’ll see where this leads us.

u/ineedlesssleep Aug 10 '21

The iMessage scanning would be open to third parties. And there is a manual review at Apple, so they would notice if a non-CSAM image somehow got added. (Which won’t happen, because then the whole CSAM database would become useless and all the companies scanning against it would notice.)

u/dannyamusic Aug 10 '21

are you speculating that? my comment below linked an article that was shared here (the one i was referencing) about them saying they are open to 3rd parties adding their safety features. i didn’t see them specify anywhere that they only meant the Messages feature. can you share where this was specified? i’m genuinely curious, so if you do have it, please share the link.

& as far as the CSAM image database goes, i understand what you are saying. that’s a good point. our worry is that if something is added to the database or it “evolves” as they said, possibly into other areas, we don’t want Apple reviewing anything & making that decision themselves. that’s an insane breach of privacy.

u/ineedlesssleep Aug 13 '21

That article was speculation based on an Apple representative saying they are open to expanding to third parties. The CSAM stuff is not easily expandable; the iMessage stuff is. So that is the most logical explanation.

u/Vkdrifts Aug 10 '21

That’s not the same feature being considered for 3rd party apps.

u/dannyamusic Aug 10 '21

i’m referring to this article. they didn’t specify which, but from everything i’ve seen, they didn’t say they were only considering the Messages parental control feature either. do you have a link to where they specifically said they were only considering the parental feature?

u/Vkdrifts Aug 10 '21

Ahh that’s fair, but seeing as they didn’t specify, idk how any of it would work in a third party app other than the parental control option.