Google Glass App to Make Wheelchairs Eye-Controlled

What would you do with Google Glass? The company asked the public that question, with the hashtag #ifihadglass, to help find testers for its “explorer” programme.

Though those applications are now closed, one local engineer from Wayfair posted a fascinating idea with potential social impact on Google+ last month: using Glass to let physically disabled individuals control their wheelchairs with only their eyes.

Wrote Wayfair’s Steve McHugh:

#ifihadglass I would use the eye tracking technology described in US patent 8,235,529 to implement a Masters robotics lab project’s alternative mechanism for quadriplegic and other disabled persons to control their powered wheelchairs (start/stop, speed, turning) while displaying real-time feedback about their surroundings (dangers, obstacles, suggested routes).

Since first posting the idea, McHugh has worked with a UI designer to refine it. Though much of it remains speculative, pending confirmation of how Glass actually operates, McHugh has clearly thought through how this would work. I asked him over email to explain more about the origin of the idea and his proposed implementation.

How did you come up with this idea?

The idea for my project came from a robotics class I took while completing my Master’s in Mechanical Engineering at Tufts University. For a class project, one of the groups built eye-tracking software and an eye-tracking apparatus, using a webcam to control the motors of a small robotic car, with the intention of applying the technology to a wheelchair. Later on, Tim Roberts, the group lead, was able to pair his technology with an electric wheelchair as part of his senior capstone.

When the co-founder and chairman of my company, Wayfair, encouraged all of us to submit ideas for Google’s Glass Explorer contest, the first thought that popped into my head was how the technology built into Google Glass could bring that project to life in a completely different way. I quickly checked my assumption that Google would build eye tracking into the device and found US patent 8,235,529, which confirmed it. A professionally developed eye-tracking package from Google eliminates one of the two largest obstacles I saw to Tim’s project becoming a reality: my project will be able to use a highly optimized, professional eye-tracking library. The second major obstacle in the original design was the visual obstruction created by the eye-tracking apparatus. Google Glass not only removes that obstruction, it also makes it possible to give the user feedback about their surroundings. Glass will provide a fantastic interface for the wheelchair controls, with real-time GUI overlays on the glass itself.

Can you explain a bit about the eye tracking technology and how you see it being applied in this context?

While I won’t know the specifics of the eye-tracking technology until I gain access to the Glass API, the patent issued to Google (US 8,235,529) indicates there will be rather precise tracking of the right eye’s movement relative to objects presented in the user’s line of sight. That level of precision should make it possible to use subtle eye motions to select controls that change the speed and direction of the chair when the user stares at a particular button for a set period of time. What constitutes a significant period of time will need to be determined through testing, but it will be longer than the moment someone might absently glance past a button and shorter than anything that degrades the ability to switch seamlessly from one direction to another, or from speeding up to slowing down. The Google Glass human interface guidelines will most likely provide guidance on timing. Relying on subtle eye motions for button-like control also lets us watch for more extreme eye motions, as well as shutting the eye altogether for a significant period, to program in system overrides such as an emergency stop.
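
To make the dwell-selection idea concrete, here is a minimal Python sketch of the logic McHugh describes: a button fires only after the gaze has rested on it longer than a glance, and a long, deliberate eye closure triggers an emergency stop. Everything here (the class, the thresholds, and the assumption that the tracker reports a gaze target and an eye-open flag) is illustrative, since the Glass API is not yet public.

```python
import time

# Hypothetical dwell-time gaze selector. The thresholds and the idea that
# each sample carries "which button the gaze is on" and "is the eye open"
# are assumptions for illustration, not any real Glass API.

DWELL_SECONDS = 0.8      # longer than a passing glance, short enough to feel responsive
EYES_CLOSED_STOP = 2.0   # deliberate eye closure, well past a normal blink

class DwellSelector:
    def __init__(self):
        self.current_target = None    # button the gaze is currently resting on
        self.dwell_start = None       # when the gaze first landed on it
        self.eyes_closed_since = None

    def update(self, target, eyes_open, now=None):
        """Feed one gaze sample; return the activated command or None."""
        now = now if now is not None else time.monotonic()

        # Override: the eye held shut far longer than a blink.
        if not eyes_open:
            if self.eyes_closed_since is None:
                self.eyes_closed_since = now
            elif now - self.eyes_closed_since >= EYES_CLOSED_STOP:
                self._reset()
                return "EMERGENCY_STOP"
            return None
        self.eyes_closed_since = None

        # Restart the dwell timer whenever the gaze moves to a new target.
        if target != self.current_target:
            self.current_target = target
            self.dwell_start = now
            return None

        # Fire once the gaze has rested on the same button long enough.
        if target is not None and now - self.dwell_start >= DWELL_SECONDS:
            self.dwell_start = now   # re-arm so a held gaze repeats slowly
            return target
        return None

    def _reset(self):
        self.current_target = None
        self.dwell_start = None
        self.eyes_closed_since = None
```

In use, each sample from the eye tracker would feed one update() call, and any command returned would be forwarded to the chair’s controller.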

What about the image you have of what the visual UI would look like? Can you talk me through what I’m looking at and what the functionality is?

The UI image attached to my contest application is a very rough concept of how a user would interact with the controls. What you see in the upper right is the card used to control the speed of the wheelchair. I have worked with a UI designer since then to iterate further and sketch out what the UI may look like. It is still at the conceptual stage, as we’re missing the Glass GUI library, which will likely be part of the developer kit; once that is released, the buttons you’re seeing will likely be replaced with standard interface elements. The basic functionality in the image is the row of buttons at the top, which let you increase or decrease the speed of the wheelchair, or stop it altogether. The bottom right is a status bar letting the rider know their current speed and whether it is accelerating, decelerating, or staying constant. The bottom left is a battery indicator that estimates how much charge is left in the chair. The last element I added is obstacle and path detection: the machine would detect potential obstacles in the path, such as potholes or objects in the way, and alert the user.
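
As a companion to the selector sketch above, here is a rough Python model of the state behind that control card: the three speed buttons, the status readout, and the battery estimate. The field names, step size, and speed units are assumptions made purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical state behind the control card McHugh describes. The step
# size, maximum speed, and units are invented for this sketch.

@dataclass
class ChairState:
    speed: float = 0.0         # current speed, arbitrary units
    target_speed: float = 0.0  # speed the rider has dialled in
    battery_pct: int = 100     # reported by the chair's controller

    SPEED_STEP = 0.5
    MAX_SPEED = 3.0

    def press(self, button: str) -> None:
        """Apply one of the top-of-card buttons (or the selector's override)."""
        if button == "INCREASE":
            self.target_speed = min(self.target_speed + self.SPEED_STEP, self.MAX_SPEED)
        elif button == "DECREASE":
            self.target_speed = max(self.target_speed - self.SPEED_STEP, 0.0)
        elif button in ("STOP", "EMERGENCY_STOP"):
            self.target_speed = 0.0

    def status(self) -> str:
        """Text for the bottom-right status bar and battery indicator."""
        if self.speed < self.target_speed:
            trend = "accelerating"
        elif self.speed > self.target_speed:
            trend = "decelerating"
        else:
            trend = "constant"
        return f"{self.speed:.1f} ({trend}) | battery {self.battery_pct}%"
```

The two sketches compose naturally: a command returned by DwellSelector.update() would be passed straight to ChairState.press().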

Anything else you’d like to add?

I am really excited about this project. My loftiest dream is that Google would somehow pick me out of thousands of submissions and that I would ultimately be able to turn this into a real product sometime next year. I think the medical applications of Google Glass are huge; even at the rumored $1,500 price, Glass would be potentially life-changing for quadriplegics and even paraplegics, freeing their hands for other tasks. I really hope this project becomes a reality and helps anyone who could benefit from supplemental control interfaces.

Source: BostInno
