Saliency-driven Robotic Perception and Odometry Estimation

Authors

Tsiourva, Maria

Issue Date

2020

Type

Thesis

Abstract

Deploying autonomous mobile robots in GPS-denied environments, possibly under conditions of poor illumination and low texture, requires reliable estimation of the robots' position, as well as fast detection of objects in their surroundings, in order to enable robust navigation. Because such functionality is challenging to achieve given the limits of on-board computing power and sensor informativeness, more resilient frameworks are needed that exploit a collection of different modalities and use intelligent methods to sample and process information. Taking inspiration from studies of the human visual attention system, alongside research in visual odometry estimation, in this work we propose a refreshed approach to the problems of odometry estimation and attentive perception for robotic systems. More specifically, we propose new algorithms for a) exploiting salient visual cues not only to detect objects that stand out in the environment but also to guide feature selection for robot localization, and b) extending the principles of attentive vision to multiple sensing modalities tailored to conditions of visual degradation, including infrared cameras and Light Detection and Ranging (LiDAR) systems. We verify the proposed contributions through a collection of field experiments, including tests inside underground mines.
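The idea of using a saliency map to guide feature selection can be illustrated with a minimal sketch. The snippet below is not the thesis's method; it is an assumption-laden illustration that uses the classic spectral-residual saliency technique (Hou & Zhang) to score image regions and then picks the most salient pixel locations as feature candidates. The function names (`spectral_residual_saliency`, `select_salient_features`) and the choice of a 3x3 smoothing window are hypothetical.

```python
import numpy as np

def spectral_residual_saliency(img):
    """Spectral-residual saliency map for a grayscale image (2-D array).

    The spectral residual is the log-amplitude spectrum minus its local
    average; recombining it with the original phase and inverting the FFT
    highlights regions that 'stand out' from the background statistics.
    """
    f = np.fft.fft2(img)
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)
    # Local 3x3 average of the log-amplitude spectrum via circular shifts.
    avg = sum(np.roll(np.roll(log_amp, dx, axis=0), dy, axis=1)
              for dx in (-1, 0, 1) for dy in (-1, 0, 1)) / 9.0
    residual = log_amp - avg
    # Reconstruct with the residual amplitude and original phase.
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    # Normalize to [0, 1] for easy thresholding downstream.
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)

def select_salient_features(img, k=10):
    """Return the (row, col) coordinates of the k most salient pixels."""
    sal = spectral_residual_saliency(img)
    idx = np.argsort(sal.ravel())[::-1][:k]
    return np.column_stack(np.unravel_index(idx, sal.shape))
```

In a localization pipeline, such a saliency map could act as a prior: instead of extracting features uniformly over the image, the front-end would restrict feature detection to the salient regions, concentrating the limited on-board compute budget on the most informative parts of the scene.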
