Google just raised the bar for Apple’s Face ID

Google blog:

Pixel 4 will be the first device with Soli, powering our new Motion Sense features to allow you to skip songs, snooze alarms, and silence phone calls, just by waving your hand. These capabilities are just the start, and just as Pixels get better over time, Motion Sense will evolve as well. Motion Sense will be available in select Pixel countries.

And:

Unlocking your phone should be easy, fast, and secure. Your device should be able to recognize you—and only you—without any fuss. Face unlock may be a familiar feature for smartphones, but we’re engineering it differently.

Differently? How?

Other phones require you to lift the device all the way up, pose in a certain way, wait for it to unlock, and then swipe to get to the homescreen. Pixel 4 does all of that in a much more streamlined way. As you reach for Pixel 4, Soli proactively turns on the face unlock sensors, recognizing that you may want to unlock your phone.

If the face unlock sensors and algorithms recognize you, the phone will open as you pick it up, all in one motion. Better yet, face unlock works in almost any orientation—even if you’re holding it upside down—and you can use it for secure payments and app authentication too.

Assuming this tech works as advertised, Google just raised the bar for Face ID. As it stands, I often have to shift my iPhone, tweaking the angle toward my face, to get Face ID to kick in. No big deal, but it does add a delay. I almost never have to enter my passcode, but I often have to fiddle a bit before Face ID does its thing.

And though I can get Face ID to kick in with my iPhone a bit off to the side, it never works when the phone is lying flat on my desk or held upside down.

The advantage of Google's announced approach is that it supports wider angles and orientations, and it starts the recognition process when you reach for your phone rather than waiting for a tap on the screen. A subtle point, but a natural next evolution.
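To make that sequencing concrete, here's a minimal sketch in Swift of how a reach-triggered pipeline differs from a tap-triggered one. Every name in it is hypothetical, invented for illustration; this is not Google's or Apple's actual code, just the shape of the idea.

```swift
// Hypothetical sketch of a reach-triggered unlock pipeline. None of
// these names are real Soli or Face ID APIs.

enum UnlockEvent {
    case radarDetectedReach   // radar sees a hand approaching the phone
    case screenTapped         // today's flow: user lifts/taps first
    case faceMatched          // face sensors recognized the owner
}

final class UnlockPipeline {
    private var sensorsWarm = false

    func handle(_ event: UnlockEvent) {
        switch event {
        case .radarDetectedReach:
            // Key difference: warm up the face sensors while the hand is
            // still moving, so recognition overlaps with the pick-up.
            warmUpFaceSensors()
        case .screenTapped:
            // Without radar, warm-up can't start until the user has
            // already raised the phone and touched it.
            warmUpFaceSensors()
        case .faceMatched:
            unlock()
        }
    }

    private func warmUpFaceSensors() {
        guard !sensorsWarm else { return }
        sensorsWarm = true
        print("face sensors powered on")
    }

    private func unlock() {
        print("unlocked")
        sensorsWarm = false
    }
}

let pipeline = UnlockPipeline()
pipeline.handle(.radarDetectedReach) // sensors come on before the lift
pipeline.handle(.faceMatched)        // unlock completes in one motion
```

The point of the sketch: the radar event and the tap event feed the same warm-up path, but the radar one fires earlier, which is where the perceived speedup comes from.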

Is Apple working on this? I suspect they already have such experiments in the lab, but release only what works really well on all Face ID phones, including the iPhone X. Detecting gestures, such as a hand reaching for your phone, no doubt requires specialized software and serious machine learning horsepower. Seems like this should be doable for Apple, given the machine learning hardware already onboard in your iPhone and iPad.
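For what it's worth, the recognizer itself is the sort of small model Core ML already runs on-device, dispatching to the Neural Engine when one is available (the iPhone X's A11 was the first chip to ship one). Here's a hedged sketch of that gating idea; the model file and its feature names ("motion", "isReach") are assumptions I've invented, not any real shipped model.

```swift
import CoreML

// Hypothetical sketch: gate the face sensors on a tiny on-device "reach"
// classifier. Core ML decides whether to run it on the Neural Engine,
// GPU, or CPU. The model and its feature names are invented.

func handIsReaching(motionWindow: MLMultiArray, modelURL: URL) throws -> Bool {
    let model = try MLModel(contentsOf: modelURL)   // a compiled .mlmodelc
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["motion": motionWindow]        // assumed input name
    )
    let output = try model.prediction(from: input)
    // Assumed output: a single probability that a hand is approaching.
    let probability = output.featureValue(for: "isReach")?.doubleValue ?? 0
    return probability > 0.9
}
```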

Read the linked Google blog. Interesting stuff.