The Resilient Micro Flyer: a New Collision-tolerant Autonomous Robot
Authors
De Petris, Paolo
Issue Date
2020
Type
Thesis
Abstract
This work presents the design, development, and autonomous navigation of the Resilient Micro Flyer, a new type of collision-tolerant small aerial robot tailored to traversing and searching within highly confined environments, including manhole-sized tubes. The robot is particularly lightweight and agile, while its collision-tolerant design renders it resilient during forcible interaction with the environment. The overall rigid design of the system is enhanced through elastic passive flaps that ensure smoother and more compliant collisions, which proved especially useful in very confined settings. Focusing on autonomous operation in narrow environments, the presented research realizes four key functionalities, namely a) visual-inertial odometry for pose estimation, b) a policy for maintaining sufficient clearance from nearby objects relying purely on four ultra-lightweight time-of-flight sensors, c) the detection of collisions with the environment as an anomaly in the inertial measurement data, and d) direct mechanical feedback-based self-centering and smooth traversal of very tight spaces enabled by its elastic flaps. The final system prototype offers 14 min of endurance and weighs less than 500 g. A comprehensive experimental study is presented in which the robot is tasked to navigate through two rooms connected via a manhole-sized tube (width x height equal to 0.5 x 0.4 m) with a length of 2.5 m.

In addition, this thesis outlines a specific contribution in LiDAR-based staircase detection, which relates to the goal of deploying such collision-tolerant robots in multi-level underground environments. This contribution took place in the framework of the DARPA Subterranean Challenge, where an additional type of collision-tolerant platform is utilized by our team. The key idea behind the method is to project into a 2D bird's-eye view a series of N point clouds stitched together based on the odometry of the robot.
On this newly generated image, the method then finds and isolates the best set of parallel lines based on their length, orientation, and relative spacing. The algorithm outputs the average estimated pose of the staircase center together with a bounding box around it.
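The bird's-eye projection step described above can be sketched as follows. This is a minimal illustration, not the thesis code: the function name, the cell resolution, and the image size are assumptions, and the odometry poses are simplified to planar (x, y, yaw) transforms.

```python
import math

def birdseye_image(clouds, poses, res=0.05, size=200):
    """Stitch N point clouds into a common frame using robot odometry and
    rasterize them into a top-down hit-count grid (size x size cells,
    `res` metres per cell, centred on the first pose)."""
    img = [[0] * size for _ in range(size)]
    ox, oy = poses[0][0], poses[0][1]          # grid origin: first odometry pose
    for cloud, (tx, ty, yaw) in zip(clouds, poses):
        c, s = math.cos(yaw), math.sin(yaw)
        for x, y, _z in cloud:                 # drop height: bird's-eye view
            wx = tx + c * x - s * y            # transform point into odometry frame
            wy = ty + s * x + c * y
            u = int((wx - ox) / res) + size // 2
            v = int((wy - oy) / res) + size // 2
            if 0 <= u < size and 0 <= v < size:
                img[v][u] += 1
    return img
```

On the resulting image, the parallel stair edges appear as regularly spaced line segments, which a standard line detector (e.g. a Hough transform) can then extract and group by orientation and spacing.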
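The collision-detection functionality listed earlier, treating collisions as anomalies in the inertial measurements, can be illustrated with a simple sketch. This is a hypothetical minimal example, not the thesis implementation: the moving-average window and the deviation threshold are illustrative choices.

```python
from collections import deque

def detect_collisions(accel_norms, window=10, threshold=8.0):
    """Flag samples whose accelerometer magnitude deviates from the moving
    average of the previous `window` samples by more than `threshold` (m/s^2)."""
    history = deque(maxlen=window)
    hits = []
    for i, a in enumerate(accel_norms):
        if len(history) == window and abs(a - sum(history) / window) > threshold:
            hits.append(i)                    # anomaly w.r.t. recent history
        history.append(a)
    return hits

# Steady hover around 9.8 m/s^2 with one impact spike at index 20
data = [9.8] * 20 + [25.0] + [9.8] * 10
print(detect_collisions(data))  # -> [20]
```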