Last week, we shared this example of image tracking using ARKit 2. Here’s another one:
Image tracking in #ARKit2. So many potential applications! pic.twitter.com/tSrpupjIXU
— Nathan Gitter (@nathangitter) June 17, 2018
Wonderful. I get that our AR future will, perhaps, be seen through glasses. But examples like these are useful even through the lens of your iPhone. To me, a relatively short AR interaction works just fine on an iPhone. And I do agree that a more immersive experience will require glasses or (way in the future) connected contact lenses.
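For anyone curious what "image tracking in ARKit 2" looks like in code, here's a minimal sketch built around ARImageTrackingConfiguration, the configuration Apple added in ARKit 2 for continuously tracking known 2D images. The reference image group name ("AR Resources") and the translucent overlay are illustrative assumptions on my part, not details from Nathan's project.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch of ARKit 2 image tracking. The asset catalog AR resource
// group "AR Resources" is an assumed name; swap in your own.
class ImageTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // ARImageTrackingConfiguration (new in ARKit 2) tracks known 2D images
        // continuously, rather than just detecting them once.
        let configuration = ARImageTrackingConfiguration()
        if let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil
        ) {
            configuration.trackingImages = referenceImages
            configuration.maximumNumberOfTrackedImages = 1
        }
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // ARKit calls this when it recognizes a reference image; child nodes added
    // here follow the tracked image as it moves in the camera view.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard let imageAnchor = anchor as? ARImageAnchor else { return nil }

        // Overlay a translucent plane matching the image's physical size.
        let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                             height: imageAnchor.referenceImage.physicalSize.height)
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.5)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane stands vertical by default

        let node = SCNNode()
        node.addChildNode(planeNode)
        return node
    }
}
```

The interesting part is that the anchor's node follows the physical image in real time, which is what makes the kind of demo in the tweet possible without any world-mapping setup.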