Nick Heer, on the FBI asking Apple for a backdoor version of iOS:
At no point — then or now — has Cook or anyone at Apple publicly confirmed how such a backdoor may be installed, or if it’s even possible. Presumably, it would use the iOS update mechanism, but how could permission be granted if the passcode to the iPhone isn’t known?
Nick then takes a Mac with a clean Catalina install and an iPhone that has never been connected to that Mac, simulating a stolen, locked iPhone. He installs an iOS update on that iPhone from the Mac, all without entering a passcode.
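Nick did this by hand, but the same channel can be prodded from the command line. Here’s a minimal sketch, assuming the open-source libimobiledevice tools (idevice_id and idevicerestore) are installed, say via Homebrew; whether the update actually proceeds against a locked, never-paired iPhone is exactly what an experiment like Nick’s tests, and this script only automates the attempt:

```python
#!/usr/bin/env python3
"""Sketch of probing the update-without-passcode channel Nick describes.

Assumes the open-source libimobiledevice tools (idevice_id, idevicerestore)
are installed. This illustrates the channel; it is not Nick's method.
"""
import subprocess
import sys

def attached_udids() -> list[str]:
    # idevice_id -l prints the UDID of each USB-attached device, one per line.
    result = subprocess.run(["idevice_id", "-l"],
                            capture_output=True, text=True, check=True)
    return [line.strip() for line in result.stdout.splitlines() if line.strip()]

def attempt_update(udid: str) -> int:
    # idevicerestore --latest fetches the newest Apple-signed IPSW and tries
    # to install it as an in-place update (no erase) on the target device.
    # Note that no passcode is entered anywhere in this flow.
    return subprocess.run(["idevicerestore", "--latest", "-u", udid]).returncode

if __name__ == "__main__":
    devices = attached_udids()
    if not devices:
        sys.exit("No attached iPhone found.")
    print(f"Attempting update on {devices[0]} without a passcode...")
    sys.exit(attempt_update(devices[0]))
```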
That said:
To be clear, my iPhone still prompted for its passcode when the update had finished its installation process. This did not magically unlock my iPhone. It also doesn’t prove that passcode preferences could be changed without first entering the existing valid passcode.
But it did prove the existence of one channel where an iPhone could be forced to update to a compromised version of iOS. One that would be catastrophic in its implications for iPhones today, into the future, and for encrypted data in its entirety. It is possible; it is terrible.
Does Nick’s experiment show a weakness in the process? Could a compromised iOS update be installed that disables the passcode?
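One constraint is worth keeping in mind, because it’s the crux of why the FBI needs Apple at all: an iPhone will only accept an update that Apple has signed, so a “compromised” build would have to be signed by Apple itself. Here’s a conceptual sketch of that gate using generic RSA verification from the Python cryptography package; the key and manifest are hypothetical placeholders, not Apple’s actual verification scheme:

```python
"""Conceptual sketch: a device applies an update only if its manifest
verifies against the vendor's public key. Hypothetical placeholders,
not Apple's actual verification scheme."""
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def device_would_apply(manifest: bytes, signature: bytes, vendor_pem: bytes) -> bool:
    """Return True only if the update manifest carries a valid vendor signature."""
    key = load_pem_public_key(vendor_pem)
    try:
        key.verify(signature, manifest, padding.PKCS1v15(), hashes.SHA256())
        return True  # signed by the vendor: the device proceeds with the update
    except InvalidSignature:
        return False  # tampered or third-party build: the device refuses it
```

So even granting Nick’s channel, getting a backdoored build accepted still requires Apple’s signature, which is exactly the cooperation the FBI asked for.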
Certainly interesting. I’m taking this with a grain of salt, at least until someone follows it all the way through and unlocks an iPhone using this approach, which I hope never happens.