At the University of Pennsylvania’s Vijay Kumar Lab, Dr. Kumar and his students have been developing advanced, autonomous drones that are assembled from off-the-shelf parts. The drone itself is bare bones: a frame, battery, four motors and props, and a motor controller. As for the phone they’re using, it’s not a special prototype or a developer-only device like Google’s Project Tango. It’s an ordinary Samsung Galaxy phone.
The only real difference is that Kumar’s team has installed a special app, created in cooperation with Qualcomm, that uses the phone’s built-in cameras and other sensors to determine the drone’s flight path. Images from the camera are analyzed at a very rapid pace (around 200 times every second), and that information is converted into control signals that are pushed to the drone’s four motors.
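To give a rough sense of what a vision-in-the-loop pipeline like that involves, here is a minimal sketch of one iteration of a 200 Hz control loop. Everything in it is illustrative: the function names (`estimate_pose`, `mix`), the PD gains, and the altitude-only controller are assumptions for the sake of the example, not the Kumar Lab's or Qualcomm's actual code.

```python
# Hypothetical sketch of a vision-in-the-loop quadrotor control cycle.
# All names and gains are illustrative, not the lab's actual software.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float  # position estimate in meters

def estimate_pose(frame) -> Pose:
    """Stand-in for visual odometry: in the real system, camera frames
    are processed ~200x per second to estimate the drone's pose."""
    return Pose(*frame)  # placeholder: pretend the frame encodes a pose

class PDController:
    """Simple proportional-derivative controller on altitude only."""
    def __init__(self, kp=4.0, kd=1.5, dt=1 / 200):  # 200 Hz loop
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_err = 0.0

    def thrust(self, target_z, z):
        err = target_z - z
        d_err = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err

def mix(thrust, roll=0.0, pitch=0.0, yaw=0.0):
    """Map collective thrust plus attitude torques to four motor
    commands (standard 'X' quadrotor mixing; diagonal pairs share
    a spin direction, hence the alternating yaw signs)."""
    return [
        thrust + roll + pitch - yaw,  # front-left
        thrust - roll + pitch + yaw,  # front-right
        thrust - roll - pitch - yaw,  # rear-right
        thrust + roll - pitch + yaw,  # rear-left
    ]

# One iteration of the loop: estimate pose from a frame, compute a
# correction toward the target altitude, mix it out to the motors.
pose = estimate_pose((0.0, 0.0, 0.8))  # drone currently at 0.8 m
ctrl = PDController()
motors = mix(ctrl.thrust(target_z=1.0, z=pose.z))
print(len(motors))  # one command per motor
```

In the real system this whole cycle (plus the far harder vision processing) runs on the phone, and only the final motor commands cross over to the drone's motor controller.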
This is a good example of just how far smartphones have come: they’re certainly powerful computers, but it’s the sensing that comes standard in almost all of them (gyros, accelerometers, IMUs, and high-resolution cameras) that makes them ideal low-cost brains for robots. What’s unique about this demo is that it’s the first time a platform this sophisticated (vision-based real-time autonomous navigation of a flying robot is pretty darn sophisticated) has been controlled by a very basic consumer device.
So, what’s next? Vijay Kumar tells us where they’re headed:
“What we’d like to do is make these kinds of robots smaller, smarter, and faster. When you make things smaller, the number of things you can do increases, and that’s where we hope to use lots of these guys. So think about a single flying phone that you have today; tomorrow, you’ll see a swarm of flying phones. That’s what we’re working towards.”