MIT uses Kinect for Autonomous Flight and Mapping

Students at MIT have developed a real-time visual odometry system that can use a Kinect to provide fast and accurate estimates of a vehicle’s 3D trajectory.

Odometry is the process of estimating position from sensor data. The MIT system builds on recent advances in visual odometry research, combining ideas from several state-of-the-art algorithms: it aligns successive camera frames by matching features across images, then uses the Kinect-derived depth estimates to recover the camera’s motion in 3D.
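
To make the frame-to-frame idea concrete, here is a minimal sketch in Python using OpenCV and NumPy. This is not MIT’s implementation: the function name is invented, the depth image is assumed to be metric (meters), and the alignment uses a plain Kabsch/SVD solve. A real system would also reject outlier matches (for example with RANSAC) and handle missing depth more robustly.

```python
import numpy as np
import cv2

def frame_to_frame_motion(img_prev, img_curr, depth_prev, depth_curr, K):
    """Estimate rigid camera motion between two RGB-D frames (sketch).

    Matches 2D features across the images, back-projects the matches
    to 3D using the Kinect depth, and solves for the rigid transform
    (R, t) aligning the two point sets. K is the 3x3 camera intrinsics.
    """
    # 1. Match features across successive frames.
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]

    def backproject(kp, depth):
        # Pixel + depth -> 3D point in the camera frame (depth in meters).
        u, v = int(round(kp.pt[0])), int(round(kp.pt[1]))
        z = depth[v, u]
        return None if z <= 0 else np.array(
            [(u - cx) * z / fx, (v - cy) * z / fy, z])

    # 2. Back-project matched features to 3D using the depth images.
    pts_prev, pts_curr = [], []
    for m in matches:
        p = backproject(kp1[m.queryIdx], depth_prev)
        q = backproject(kp2[m.trainIdx], depth_curr)
        if p is not None and q is not None:
            pts_prev.append(p)
            pts_curr.append(q)
    P, Q = np.asarray(pts_prev), np.asarray(pts_curr)

    # 3. Kabsch/SVD: rigid (R, t) minimizing sum ||R @ p + t - q||^2.
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    H = (P - mu_p).T @ (Q - mu_q)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_q - R @ mu_p
    return R, t   # incremental motion between the two frames
```

Composing these incremental transforms frame after frame yields the vehicle’s 3D trajectory, which is the estimate the onboard position controller consumes.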

This means the drone can fly in GPS-denied environments. It requires no motion capture system or other external sensors; all sensing and computation needed for local position control is done onboard the vehicle.

“In environments where GPS is noisy and maps are unavailable, such as indoors or in dense urban environments, a UAV runs the risk of becoming lost, operating in high threat regions, or colliding with obstacles,” MIT’s Robust Robotics Group writes on its website.

The MIT team collaborated with Peter Henry and Mike Krainin from the Robotics and State Estimation Lab at the University of Washington to use their RGBD-SLAM algorithms. Both the US Office of Naval Research and the Army Research Office have sponsored the project.

Sources: TechEye, Wired.co.uk
