Joanna Stern of The Wall Street Journal got the chance to interview Craig Federighi about the backlash in response to Apple's CSAM scanning plans.
The video embedded below is worth watching, both for Craig's take on what Apple got wrong in the rollout and for Joanna's excellent pause-and-explain breakdown of Craig's responses, covering the difference between what's happening in Messages and the NeuralHash analysis tied to iCloud Photos, and what Apple is really doing here. Really, really good.
One piece of this puzzle is who gets notified when a CSAM image is flagged.
Benjamin Mayo, from this explainer:
I think there’s a reasonable worry that a government could use this as a way to shuttle other kinds of content detection through the system, by simply providing hashes of images of political activism or democracy or whatever some dictatorial leader would like to oppress. Apple’s defence for this is that all flagged images are first sent to Apple for human review, before being sent on.
That cuts to the core of the problem many people have with this approach: the "slippery slope" argument. Apple is saying, we won't let that happen. Apple's counter is that the chances of a false positive are already vanishingly low, and the fact that flagged images are sent on to Apple for human review (imagine being on that particular image review team) raises the bar even further.
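To make the threshold point concrete, here's a toy Swift sketch of threshold-gated flagging. To be clear, this is not Apple's actual protocol (the real system computes a perceptual NeuralHash on-device and uses private set intersection with threshold secret sharing, so no single match reveals anything to anyone, and the type and function names here are invented for illustration). It just captures the gist: nothing surfaces for review until an account crosses a match threshold, a figure Federighi has put on the order of 30.

```swift
import Foundation
import CryptoKit

// Toy stand-in for the matching pipeline. A plain set lookup and a
// visible counter are simplifications; the real system hides both the
// database and the running match count from the device.
struct MatchTracker {
    let knownHashes: Set<Data>   // hashes of known CSAM, supplied by NCMEC et al.
    let threshold: Int           // Federighi cited a figure on the order of 30
    private(set) var matchCount = 0

    init(knownHashes: Set<Data>, threshold: Int) {
        self.knownHashes = knownHashes
        self.threshold = threshold
    }

    // Stand-in hash. The real NeuralHash is a neural-network-derived
    // perceptual hash, robust to resizing and recompression, which a
    // cryptographic digest like SHA-256 is not.
    func hash(of imageData: Data) -> Data {
        Data(SHA256.hash(data: imageData))
    }

    // Returns true only once the account crosses the match threshold,
    // the point at which Apple's human review would kick in.
    mutating func record(_ imageData: Data) -> Bool {
        if knownHashes.contains(hash(of: imageData)) {
            matchCount += 1
        }
        return matchCount >= threshold
    }
}

var tracker = MatchTracker(knownHashes: [], threshold: 30)
print(tracker.record(Data("vacation photo".utf8)))  // false: no matches yet
```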
One issue here: who gets notified if a CSAM-matching image is found? As it stands, it seems the notification happens entirely behind the scenes.
Back to Benjamin:
My suggestion would be that all flagged images are reported to the user. That way, the system cannot be misused in secret. This could be built into the software stack itself, such that nothing is sent onward unless the user is notified. In press briefings, Apple has said they don't want to do this because their privacy policy doesn't allow them to retain user data; a criminal who really is sharing CSAM would simply be able to delete their photo library when alerted. I think tweaks to policy could solve this. For instance, it would be very reasonable for a flagged image to be automatically frozen in iCloud, unable to be deleted by the user, until it has gone through the review process. The additional layer of transparency is beneficial.
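For what it's worth, here's a rough Swift sketch of the freeze-then-notify flow Benjamin is proposing. Every function here is a hypothetical placeholder, not a real iCloud API; the point is just the ordering: freeze before notifying, review before reporting.

```swift
import Foundation

// Possible outcomes of Apple's human review step.
enum ReviewOutcome { case confirmedCSAM, falsePositive }

// Hypothetical placeholders, stubbed for illustration only.
func freezeInICloud(_ id: UUID)   { print("freeze \(id): user cannot delete") }
func unfreezeInICloud(_ id: UUID) { print("unfreeze \(id): deletion restored") }
func notifyUser(about id: UUID)   { print("tell account owner \(id) was flagged") }
func humanReview(_ id: UUID) -> ReviewOutcome { .falsePositive }
func fileReport(for id: UUID)     { print("report \(id) onward") }

func handleFlaggedImage(_ id: UUID) {
    // 1. Freeze first, closing the loophole Apple raised: a criminal
    //    alerted to the flag can no longer destroy the evidence.
    freezeInICloud(id)

    // 2. Then notify, so the system cannot be misused in secret.
    notifyUser(about: id)

    // 3. Only after both steps does the image reach human review.
    switch humanReview(id) {
    case .confirmedCSAM: fileReport(for: id)
    case .falsePositive: unfreezeInICloud(id)
    }
}
```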
A thoughtful take from Benjamin. Worth reading the whole piece, and worth watching Joanna Stern's interview with Craig Federighi, embedded below.