Tuesday, November 3, 2015

Watch: MIT Drone Autonomously Avoids Obstacles at 30 MPH

As reported by Robotics Trends: Obstacle avoidance needs to be the next big thing for drones. As 3D Robotics founder Chris Anderson said, the “mass jackassery” (reckless flying) needs to stop. It’s part of the reason we now have a mandatory drone registration system looming over us.


DJI has been at the forefront of avoidance technology for drones, recently introducing its Guidance system, which uses multiple stereo and ultrasonic sensors that allow the drone to automatically avoid obstacles within 65 feet.
Andrew Barry, a PhD student at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL), is looking to push this technology to the next level. Barry and professor Russ Tedrake have created an obstacle-detection system that allows a drone to autonomously avoid obstacles in its flight path while flying at 30 miles per hour.
CSAIL posted a fascinating video, which you can watch below, of Barry’s system helping a drone “dip, dart and dive” through a tree-filled field.

The drone in the video, which was made with off-the-shelf components for $1,700, weighs just over a pound and has a 34-inch wingspan. It has a camera on each wing and two processors that are “no fancier than the ones you’d find on a cellphone.”
CSAIL says Barry’s software runs 20 times faster than existing obstacle detection software. Operating at 120 frames per second, the open-source software allows the drone to detect objects and map its environment in real time, extracting depth information at 8.3 milliseconds per frame.
“Sensors like lidar are too heavy to put on small aircraft, and creating maps of the environment in advance isn’t practical,” Barry says. “If we want drones that can fly quickly and navigate in the real world, we need better, faster algorithms.”
So, how does it work? We’ll let CSAIL explain:
Traditional algorithms focused on this problem would use the images captured by each camera, and search through the depth-field at multiple distances - 1 meter, 2 meters, 3 meters, and so on - to determine if an object is in the drone’s path.
Such approaches, however, are computationally intensive, meaning that the drone cannot fly any faster than 5 or 6 miles per hour without specialized processing hardware.
Barry’s realization was that, at the fast speeds his drone could travel, the world simply does not change much between frames. Because of that, he could get away with computing just a small subset of measurements - specifically, only those at a distance of 10 meters.
“You don’t have to know about anything that’s closer or further than that,” Barry says. “As you fly, you push that 10-meter horizon forward, and, as long as your first 10 meters are clear, you can build a full map of the world around you.”
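To make that idea concrete, here is a minimal sketch (in Python, not taken from Barry’s code) of a fixed-disparity stereo check: instead of searching many disparities, each image block is compared against its counterpart at the single pixel shift that corresponds to a 10-meter depth. The camera parameters, block size, and threshold below are assumed values for illustration.

```python
import numpy as np

# Illustrative numbers only; none of these values come from the paper.
BASELINE_M = 0.5       # assumed separation between the two wing cameras (meters)
FOCAL_PX = 400.0       # assumed focal length in pixels
DEPTH_M = 10.0         # the single depth plane the search looks at
BLOCK = 8              # block size for sum-of-absolute-differences matching
SAD_THRESHOLD = 300    # assumed "good match" threshold; lower SAD = better match

# Pixel shift (disparity) that a point exactly DEPTH_M away would produce.
DISPARITY = int(round(FOCAL_PX * BASELINE_M / DEPTH_M))  # 20 px with these numbers

def detect_at_single_depth(left, right):
    """Flag image blocks where the two rectified views agree at the 10 m
    disparity, i.e. candidate obstacles sitting roughly 10 m ahead."""
    h, w = left.shape
    hits = np.zeros((h // BLOCK, w // BLOCK), dtype=bool)
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            y, x = by * BLOCK, bx * BLOCK
            if x - DISPARITY < 0:
                continue  # matching block would fall off the right image
            lb = left[y:y + BLOCK, x:x + BLOCK].astype(np.int32)
            rb = right[y:y + BLOCK, x - DISPARITY:x - DISPARITY + BLOCK].astype(np.int32)
            if np.abs(lb - rb).sum() < SAD_THRESHOLD:
                hits[by, bx] = True  # strong match at this one disparity
    return hits
```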
While such a method might seem limiting, the software can quickly recover the missing depth information by integrating results from the drone’s odometry and previous distances.
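And here, equally schematically, is how those single-depth hits could be carried forward: each detection is stamped into a small world-frame map using the drone’s odometry, so that by the time an obstacle is closer than 10 meters it is already on the map. The pose inputs and map structure below are assumptions for illustration, not the published implementation.

```python
import numpy as np

class HorizonMap:
    """Accumulate single-depth detections into a local obstacle map.
    R (3x3 rotation) and t (3-vector translation) are the body-to-world
    pose from the drone's odometry; both are assumed interfaces."""

    def __init__(self):
        self.obstacles_world = []  # 3-D points in a fixed world frame

    def integrate(self, points_body, R, t):
        """points_body: Nx3 points detected ~10 m ahead, in the body frame.
        Store them in world coordinates at the moment of detection."""
        for p in np.asarray(points_body, dtype=float):
            self.obstacles_world.append(R @ p + t)

    def obstacles_ahead(self, R, t, max_range=10.0):
        """Return every stored point, expressed in the current body frame,
        that now lies within max_range ahead: nearer depths are 'filled in'
        by detections made when those points were still 10 m away."""
        out = []
        for q in self.obstacles_world:
            body = R.T @ (q - t)           # world -> current body frame
            if 0.0 < body[0] < max_range:  # x taken as the forward axis
                out.append(body)
        return out
```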
Barry wrote about the system in his paper “Pushbroom Stereo for High-Speed Navigation in Cluttered Environments” (PDF) and says he needs to improve the software so it can work at more than one depth and in denser environments.

“Our current approach results in occasional incorrect estimates known as ‘drift,’” he says. “As hardware advances allow for more complex computation, we will be able to search at multiple depths and therefore check and correct our estimates. This lets us make our algorithms more aggressive, even in environments with larger numbers of obstacles.”
