Ben Lovejoy, 9to5Mac:
The American Civil Liberties Union (ACLU) has raised privacy concerns about developer access to the facial expressions of iPhone X users. In particular, they say that Apple allows developers to capture facial expression data and store it on their own servers.
When the iPhone X was launched, Apple was careful to stress that the 3D face recognition model used by Face ID was stored only on the phone itself. The data is never transferred to Apple servers. But the ACLU says that app developers are allowed to transmit and store some face data.
Interesting article. Lots of layers to this issue. There's face tracking (think Animoji) and attention detection (are you actually watching your screen). How much of this data is hidden behind an API? In other words, does Apple simply tell a developer whether you are paying attention to the screen, or do they hand over more specific data, like the exact spot on the screen you're currently focused on?
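For the face tracking side, ARKit does give developers fairly granular expression data. Here's a minimal sketch (the view controller and its setup are my own illustration, not from the article) of the kind of per-frame blend shape coefficients an app can read from the TrueDepth camera:

```swift
import UIKit
import ARKit

// Hypothetical view controller showing the sort of face data ARKit exposes
// on iPhone X: per-frame blend shape coefficients (jaw open, eye blinks, etc.).
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking requires the TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps expression locations to coefficients in 0.0–1.0.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft)")
        }
    }
}
```

That's the Animoji-style data the ACLU is talking about; attention detection is a separate question, addressed in the update below.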
This is a good read. And keep an eye out for more detail in the Rene Ritchie/iMore iPhone X review I’ll be posting a bit later this morning.
Update: There is no API for attention detection, so there is no way for developers to access that data.