Last year we saw the meteoric rise of Pokémon GO, the location-based augmented reality game that put AR in the hands of the mainstream public via a free app on their mobile devices. Everywhere you looked there were swarms of people navigating the streets, parks and bridleways, clutching their phones, trying to find and ‘catch’ Pokémon in the real-world environment around them.
It was a massive hit with people of all ages, and one of the most downloaded (650+ million worldwide) and most profitable apps of 2016 thanks to in-app purchases and in-app marketing revenue.
Additionally, the rise in popularity of AR camera effects in apps from Facebook and Snapchat has meant even more people can enjoy and experience AR without having to shell out on expensive headsets or glasses.
Mark Zuckerberg recently told TechCrunch that “the first augmented reality platform that becomes mainstream isn’t going to be glasses, it’s going to be cameras.” Facebook had already opened up their Camera Effects Platform to developers, but this month we saw Apple follow suit, announcing at WWDC17 that their ARKit would be bundled with iOS 11, encouraging further innovation in this area.
So it seems clear that AR and the way it's delivered through mobile camera technology will continue to develop hand in hand.
This new era will require us to evolve the way we approach UX for the AR applications of the future. UX Designers need to think outside the box; in fact, there is no box. The world is our new canvas, and the opportunities to enhance our day-to-day lives and tasks are infinite.
Traditional user actions may become moot: physical interactions, taps on screens and button presses might become a thing of the past as we move towards gestures, micro-gestures and voice commands. Just look at the rise of voice-activated smart speakers such as Google Home, Amazon Echo and the soon-to-be-released Apple HomePod, all of which can fulfill the requests of their master's voice.
Imagine a scenario where a driver parks their car on a busy high street where parking charges apply. An AR app could help: by simply pointing the phone's camera at a marker on the space, the driver could reveal parking times, charges and where to find the nearest meter. Better still, the user could be presented with options to purchase set parking times (1hr, 2hr, 3hr) direct from their phone.
"Pay for 2 hours please" - Done!
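The logic behind that scenario could be sketched in a few lines. Everything below is hypothetical: the marker data, the tariff structure and the function names are illustrative assumptions, not a real parking API.

```python
from dataclasses import dataclass

# Hypothetical tariff data an AR parking app might fetch after
# scanning the marker on a parking space. All names and rates
# here are illustrative, not a real service.
@dataclass
class ParkingTariff:
    space_id: str
    rate_per_hour: float  # charge in pounds per hour
    max_hours: int        # longest stay permitted at this space

def quote_options(tariff: ParkingTariff) -> dict:
    """Return the set parking durations (1hr, 2hr, 3hr) the user
    could purchase, capped at the space's maximum stay, mapped to
    the total charge for each duration."""
    return {
        hours: round(hours * tariff.rate_per_hour, 2)
        for hours in range(1, min(tariff.max_hours, 3) + 1)
    }

# Example: a £1.50/hr space with a 3-hour maximum stay.
tariff = ParkingTariff(space_id="HS-042", rate_per_hour=1.50, max_hours=3)
print(quote_options(tariff))  # {1: 1.5, 2: 3.0, 3: 4.5}
```

A voice command like "Pay for 2 hours please" would then simply select the `2` option from that quote and trigger the payment flow.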
With so much potentially happening around users in this AR future, interfaces will need to be contextual, intuitive and efficient, requiring less effort from users to achieve their goals.
Reaching out and waving your hands to scroll through items like Tom Cruise in Minority Report might not be possible in a crowded space. Imagine attempting that on the London Underground! So micro-gestures may become the new means of controlling or actioning user requests. A finger flutter, a blink, a tilt of the head or a brief facial expression could be all that is required.
The potential of AR
AR is fun, and most mainstream implementations have focused on this, enabling users to play games or adorn their faces with animal features and fun accessories. However, this technology is still in its infancy, and its wider application and potential are enormous for a variety of industries.
Manufacturing, construction, education, sales and marketing, design and finance are just a handful of the sectors already using AR, or looking into how it can enrich their businesses or offer a better customer experience.
With no established best practices or hardened rules to confine our creativity, it's an exciting time to be a UX Designer.