A sophisticated head-tracking system like the one built into the iPhone X may have been designed for AR and security purposes, but it can also be very useful for people with disabilities. A proof-of-concept application from an eBay intern shows how someone with very limited motor function can navigate the site using nothing more than head movements.
Muratcan Çiçek is one such person; he relies on assistive technology every day to read, work and get around. This year he did an internship at eBay and decided to build a tool that would help people with motor disabilities like his shop online. It turns out there are plenty of general-purpose accessibility tools, such as ones that let a user control a cursor with their eyes or a joystick, but nothing built specifically for navigating a site like eBay or Amazon.
His creation, HeadGaze, uses the iPhone X's front-facing sensor array (via ARKit) to track the movements of the user's head. Different movements map to different actions in a demo application that shows the retailer's daily deals: browse categories and products by tilting your head in different directions, or give it a half tilt to buy, save or share.
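For a sense of how that kind of head tracking works, here is a minimal sketch of the general approach: read the face anchor that ARKit produces from the TrueDepth camera and turn head tilts into navigation events. This is an illustration only, not code from the HeadGaze repository; the class name, action cases and threshold are assumptions made for the example.

```swift
import Foundation
import ARKit

// Illustrative sketch only, not HeadGaze's actual implementation.
// The class name, Action cases and threshold are assumptions.
final class HeadTiltNavigator: NSObject, ARSessionDelegate {

    enum Action { case up, down, left, right }

    private let session = ARSession()
    private let tiltThreshold: Float = 0.2   // radians, roughly 11°; arbitrary choice
    var onAction: ((Action) -> Void)?

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit delivers an updated ARFaceAnchor on every frame while a face is tracked.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // The third column of the anchor's transform is the face's forward axis
        // in world space; its elevation and azimuth give a rough pitch and yaw.
        // A real implementation would calibrate against the user's neutral pose.
        let forward = face.transform.columns.2
        let pitch = asinf(forward.y)
        let yaw = atan2f(forward.x, forward.z)

        if pitch > tiltThreshold { onAction?(.up) }
        else if pitch < -tiltThreshold { onAction?(.down) }

        if yaw > tiltThreshold { onAction?(.right) }
        else if yaw < -tiltThreshold { onAction?(.left) }
    }
}
```

In a demo like the one described, those events would simply drive scrolling through deal cards or highlight actions such as buy, save or share.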
You can watch it in action in the short video below:
Not that this is a revolutionary interface: a few applications and services do something similar, though perhaps not in such a simple and extensible form.
But it's easy to underestimate the cognitive load imposed when someone has to navigate a user interface designed around senses or limbs they don't have. Creating something like this isn't trivial, but it is useful and relatively straightforward, and the benefits for a person like Çiçek are substantial.
That's probably why he made HeadGaze open source; you can get all the code and documentation on GitHub. Everything is in Swift and it currently only works on the iPhone X, but it's a start.
Considering this was an intern's summer project, companies with thousands of developers have few excuses not to offer something like it in their apps or storefronts. And it's not as if other uses are hard to imagine. As Çiçek writes:
HeadGaze lets you navigate and interact on your phone with only subtle head movements. Think of all the ways this could come to life. Are you tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Is it too complicated to follow the instruction manual on your phone while fiddling with the engine under the hood of your car? Is it too cold to take off your gloves to use your phone?
He and his colleagues are also looking into true eye tracking to augment the head movements, but that's still a ways off. Maybe you can help.