When you are in a face-to-face conversation, the person on the other side is constantly reading and interpreting your facial expressions, looking for feedback. If you offer a puzzled look, they might rephrase their words, without you having to ask.
Ever since Darwin, scientists have systematically analyzed facial expressions, finding that many of them are universal. Humans are remarkably consistent in the way their noses wrinkle, say, or their eyebrows move as they experience certain emotions. People can be trained to note tiny changes in facial muscles, learning to distinguish common expressions by studying photographs and video. Now computers can be programmed to make those distinctions, too.
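For the curious, here's a rough sketch of what that kind of system might look like, using Python and OpenCV: grab webcam frames, detect faces, and hand each cropped face to an expression classifier. The `classify_expression` function is just a stand-in I'm assuming for illustration; a real system would swap in a model trained on labeled expression images.

```python
import cv2

# Hypothetical expression classifier: a real system would plug in a model
# trained on labeled face images (the usual handful of basic emotions).
def classify_expression(face_img) -> str:
    # Placeholder: always reports "neutral"; replace with a trained model.
    return "neutral"

def main() -> None:
    # OpenCV ships a Haar cascade for frontal face detection.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            label = classify_expression(gray[y:y + h, x:x + w])
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("expressions", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

The hard part, of course, isn't the plumbing above; it's training a classifier that reads those tiny muscle changes as reliably as a trained human observer.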
Pretty interesting stuff. This seems like a natural evolution of existing facial recognition algorithms. Seems to me, the first company that comes up with a reliable solution to this problem will be ripe for acquisition.
There are privacy concerns, of course. Reading facial expressions is clearly a form of surveillance. And constantly reading my expression feels awfully close to reading my mind. Lots of unwanted thoughts flit across my face. But if it’s just me and my computer, I think I’m OK with that.