Apple's CSAM scanning protections are a fucking joke on so many levels. Would-be p***s would simply turn off iCloud Photos sync to avoid any trouble, so it's a potential privacy invasion for absolutely no reason.
It's not "child safety" measures; it's "how do we protect ourselves from lawsuits".
I'm surprised that this part isn't being talked about more.
@brandon I mean, yeah. You don't. Not with any system. You could try to self-host it, but then you're at the mercy of any ne'er-do-well who has a 0-day queued up.
Google has been doing this for *ages*. Again, because it's Apple, no one cares. Google isn't sexy enough.
I don’t remember him mentioning potential lawsuits, but he did reference that people can just turn off iCloud Photos — which will inevitably lead to calls for scanning *all* photos, regardless of whether they’ll be uploaded or not.
@mike I didn't get the chance to finish that one yet, but good to know that the fact you can turn it off is mentioned.
The point I'm hammering on here, though, is that Apple's MO is being able to turn on E2E encryption in iCloud while still shielding themselves from liability for not doing "enough" from a legal standpoint.
@brandon AFAIK they already scan your media in iCloud (and have for several years now). Now they are pushing hashing images locally, and if you have several matches (nobody knows exactly how many) it will trigger an alarm.
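The "hash locally, alarm after several matches" idea above can be sketched roughly like this. To be clear, this is a simplified illustration, not Apple's actual system: the real design uses a perceptual hash ("NeuralHash") plus cryptographic threshold schemes rather than plain SHA-256, and the actual match threshold isn't public — the `MATCH_THRESHOLD` here is made up.

```python
import hashlib

# Hypothetical threshold; as noted above, nobody knows the real number.
MATCH_THRESHOLD = 3

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; the real system's NeuralHash
    tolerates resizing/recompression, which SHA-256 does not."""
    return hashlib.sha256(data).hexdigest()

def scan_library(images: list[bytes], known_hashes: set[str]) -> bool:
    """Count matches against the known-image database and only
    trigger the alarm once the count crosses the threshold."""
    matches = sum(1 for img in images if image_hash(img) in known_hashes)
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is that a single accidental hash collision shouldn't flag an account; only repeated matches do.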
@efftoyz sure, but now they’re basically saying “p***s who weren’t going to get caught with the old method are still not going to be caught with the new one.” Great stuff, Apple.
@brandon well, not exactly. You can still trigger the alarm (which works only for known images, BTW), but what is much more concerning is that some totalitarian government could slip extra hashes into the database and find free thinkers who share certain memes.