Eye-Tracking Glasses to Control UAS

Engineers at New York University, the University of Pennsylvania, and the U.S. Army Research Laboratory have developed a system for controlling UAS using eye movement.

All that is required is a pair of eye-tracking glasses (Tobii Pro Glasses 2), a small computational unit, and a drone. The glasses contain an inertial measurement unit (IMU); a deep neural network uses the glasses' data to determine how far away the drone is and where the wearer is looking.

In this work, we address the problem of providing human-assisted quadrotor navigation using a set of eye-tracking glasses. The advent of these devices (i.e., eye-tracking glasses, virtual reality tools, etc.) provides the opportunity to create new, non-invasive forms of interaction between humans and robots.

We show how a set of glasses equipped with a gaze tracker, a camera, and an Inertial Measurement Unit (IMU) can be used to (a) estimate the relative position of the human with respect to a quadrotor, (b) decouple the gaze direction from the head orientation, and (c) allow the human to spatially task (i.e., send new 3D navigation waypoints to) the robot in an uninstrumented environment.
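The decoupling in (b) amounts to expressing the head-frame gaze direction in the world frame using the IMU-derived head orientation. A minimal sketch, assuming a yaw-only head rotation for simplicity; the function names and frames here are illustrative, not the authors' code:

```python
import math

def rot_yaw(yaw):
    """Rotation matrix for a head turn of `yaw` radians about the vertical (z) axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def apply(R, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def gaze_in_world(R_head, gaze_head):
    """Express the gaze-tracker direction (head frame) in the world frame."""
    return apply(R_head, gaze_head)

# Head turned 90 degrees left while the wearer looks straight ahead (head +x):
# the world-frame gaze then points along world +y.
v = gaze_in_world(rot_yaw(math.pi / 2), [1.0, 0.0, 0.0])
print(v)
```

In the full system the head orientation would come from fusing the camera and IMU streams rather than a single yaw angle; the rotation-composition step is the same.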

We employ a combination of camera and IMU data to track the human's head orientation, which allows us to decouple the gaze direction from the head motion. To detect the flying robot, we train and use a deep neural network. We evaluate the proposed approach experimentally, and show that our pipeline is able to successfully achieve human-guided autonomy for spatial tasking. The proposed approach can be employed in a wide range of scenarios, including inspection and first response, and it can be used by people with disabilities that affect their mobility.
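Once the gaze direction is available in the world frame and the network has estimated the range to the drone, spatial tasking reduces to picking a point along the gaze ray. A hedged sketch of that last step, with hypothetical names (`gaze_waypoint`, the fixed head position) standing in for the paper's actual pipeline:

```python
import math

def normalize(v):
    """Return the unit vector pointing in the same direction as `v`."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def gaze_waypoint(head_pos, gaze_dir_world, distance):
    """Candidate 3D waypoint: the point `distance` metres along the
    wearer's world-frame gaze ray, starting from the head position."""
    d = normalize(gaze_dir_world)
    return [p + distance * di for p, di in zip(head_pos, d)]

# Wearer's head at (0, 0, 1.6) m, looking along +x, with a 3 m range estimate:
wp = gaze_waypoint([0.0, 0.0, 1.6], [2.0, 0.0, 0.0], 3.0)
print(wp)  # [3.0, 0.0, 1.6]
```

The real system sends such waypoints to the quadrotor's navigation stack; how the range estimate is produced (the drone-detection network) is independent of this geometric step.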

Source: YouTube
