Deep Learning-Based Exploration Path Planning


Authors

Reinhart, Russell E

Issue Date

2020

Type

Thesis

Keywords

Aerial Robotics , Autonomous Systems , Deep Learning , Path Planning

Abstract

This thesis presents two deep learning-based path planning methods for autonomous exploration of subterranean environments using aerial robots. The first approach uses imitation learning: training samples are generated by a state-of-the-art sampling-based exploration path planner, and the learned model proposes trajectories comparable to those of the expert planner across a variety of underground tunnel environments. This imitation learning-based method uses a small window of recent LiDAR measurements to infer trajectories at a fraction of the computational cost of the expert planner, while also removing the need for an online map reconstruction of the environment. The second approach uses a deep reinforcement learning algorithm applicable to continuous state and action spaces and to partially observable Markov decision processes; the agent's reward is contingent on its efficient exploration of the environment. Both methods are evaluated in simulated and real-world environments.
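To illustrate the imitation learning idea described above, the following is a minimal behavioral-cloning sketch, not the thesis implementation: a window of recent LiDAR range scans is regressed onto expert-planner trajectory commands. A linear least-squares model stands in for the deep network, and all dimensions, variable names, and the synthetic dataset are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

WINDOW = 3      # number of recent LiDAR scans in the input window (assumed)
BEAMS = 16      # range readings per scan (real sensors have far more)
ACTION_DIM = 3  # e.g. a commanded (vx, vy, yaw_rate) triple (assumed)

# Synthetic "expert" dataset: stacked scan windows -> expert trajectory commands.
# In the thesis, labels would come from the sampling-based expert planner.
X = rng.uniform(0.5, 10.0, size=(500, WINDOW * BEAMS))  # range readings in meters
W_true = rng.normal(size=(WINDOW * BEAMS, ACTION_DIM))
Y = X @ W_true                                          # expert action labels

# "Training": fit the imitation policy to the expert demonstrations.
W_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Inference: a fresh scan window yields a trajectory command directly,
# with no online map reconstruction required.
scan_window = rng.uniform(0.5, 10.0, size=(1, WINDOW * BEAMS))
action = scan_window @ W_fit
print(action.shape)  # (1, 3)
```

The key property this sketch shares with the method described above is that inference is a single cheap forward evaluation on raw recent sensor data, rather than a sampling-based search over a reconstructed map.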

License

Creative Commons Attribution 4.0 United States
