Gruber: How prepared is Apple for the massive task of reviewing flagged CSAM?

As pointed out in our previous post, Apple is about to enter the big leagues in CSAM (child sexual abuse material) reporting.

John Gruber:

I do wonder, though, how prepared Apple is for manually reviewing a potentially staggering number of accounts being correctly flagged. Because Apple doesn’t examine the contents of iCloud Photo Library (or local on-device libraries), I don’t think anyone knows how prevalent CSAM is on iCloud Photos. We know Facebook reported 20 million instances of CSAM to NCMEC last year, and Google reported 546,000.

Fair question. It also makes me wonder how the people who review this sort of material are protected, both emotionally (a dark, dark job, sure to mess with your psyche) and legally (they spend their days looking at illegal material; are there special laws that protect workers like these?).

I also wonder what that job description looks like. It would certainly make for one of the more unusual job interviews.