The UX of a blink

Imagine a world where hands-free AR is possible. Now ask yourself: how would you navigate such experiences? The answer perhaps lies in the blink of an eye.

Today’s Augmented Reality (AR) and Virtual Reality (VR) experiences have an established UX. If you step back and look at the big picture, regardless of headsets and controllers, most experiences are similar. Sure, the visual quality can be quite different from an Oculus to a Vive, but the way you navigate through a VR experience or a game follows similar patterns. While the tech is new, it’s surprising how fast we are standardising it. Just google VR style guides or UX guidelines for VR and see the available documentation. You will find them for all the popular headset manufacturers and the companies selling 3D engines and software.

While most VR experiences take place at a standstill, some use a teleport-like system to move users around virtual worlds. Yet this is only possible through physically interacting with a controller that usually has a joystick, a trigger and additional buttons. A person can use such a controller to point, click and navigate. This is not the case with AR experiences.

An example of navigating menus in Half-Life: Alyx.
An example of movement in Half-Life: Alyx.

AR experiences are usually more lightweight. Most AR experiences adopt a HUD-like system similar to what you might see in, say, a fighter aircraft: a visual overlay superimposed on the reality you see in front of you. While there are AR experiences for your mobile, the most cutting-edge ones are usually delivered through glasses that resemble thick-framed dark glasses. Depending on the device you consume it on, navigation today is achieved through taps on the arm of the glasses, touching and swiping on mobile, or, with certain headsets, hand gestures.

But what if we could do away with all that?

Can you control an Augmented Reality experience with just your eyes?

Assuming we had a system that could track eye movements to millimetre precision, we could achieve basic scrolls and pans by tracking gaze movement. Yet selecting things on a superimposed interface is the key challenge. In a traditional computing environment we have been well trained (through years of standardisation) to anticipate the expected behaviour of a left click and a right click on a mouse. Could we achieve the same through a navigation system built on blinks? How about this –

Double blink – Select / Click

Left blink – Simulate a hold

Right blink – Undo

An eyeroll – Restart
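The mapping above could be sketched as a small event classifier. This is purely a thought-experiment sketch, not any real eye-tracking API: the `Blink` event type, the eye labels and the double-blink window are all assumptions invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Blink:
    """A hypothetical blink event from an imagined eye tracker."""
    eye: str   # "left", "right" or "both"
    t: float   # timestamp in seconds

# Assumed threshold: two both-eye blinks this close together count as a double blink.
DOUBLE_BLINK_WINDOW = 0.4

def classify(blinks):
    """Map a sequence of blink events to UI actions per the scheme above."""
    actions = []
    i = 0
    while i < len(blinks):
        b = blinks[i]
        nxt = blinks[i + 1] if i + 1 < len(blinks) else None
        if (b.eye == "both" and nxt and nxt.eye == "both"
                and nxt.t - b.t <= DOUBLE_BLINK_WINDOW):
            actions.append("select")  # double blink – select / click
            i += 2
        elif b.eye == "left":
            actions.append("hold")    # left blink – simulate a hold
            i += 1
        elif b.eye == "right":
            actions.append("undo")    # right blink – undo
            i += 1
        else:
            # A lone both-eye blink is probably just natural blinking – ignore it.
            i += 1
    return actions
```

Even this toy version hints at the real difficulty: natural blinks arrive constantly, so any such system lives or dies by how well it separates deliberate gestures from involuntary ones.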

Then we could try closing one eye as a modifier, though that would not only look weird, but also cause some real fatigue. Navigation in such a system will be nothing like the traditional computer menus you are accustomed to, so don’t visualise trying all those clicks in a typical browser or in complex software like Photoshop. For a start, menu elements will need adequate spacing to prevent false triggers. Contrast will be an ever-present issue. The perceived depth of the interface is also very important. I imagine a lot of trigonometry will go into this.

On the plus side, all that blinking will keep your eyes nice and lubricated. Dry eyes, the bane of the computer age, might just become a thing of the past in such an AR future. There are issues like the asymmetry of blinking to contend with: just as you have a dominant hand, you will have a dominant eyelid too. And people with loss of vision in one eye, and a host of other ocular conditions, will add multiple layers of accessibility complexity to this tech.

Luckily, we have one little trick up our sleeves that could eliminate the need for all this eye-watering UX: voice recognition. A built-in voice assistant can take over some of the heavy lifting of navigating such a user experience and take predictive input to the next level. Combined, I truly believe we will have a plausible hands-free AR experience.

Though, if you thought people who talk to themselves as they walk down the street with Bluetooth in-ear headphones looked odd, this tech is going to look just as peculiar. Not as peculiar as those folk who tried wearing Google Glass, though.