Combined laser and vision-aided inertial navigation for an indoor unmanned aerial vehicle

Daniel Magree, Eric Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Scopus citations

Abstract

As unmanned aerial vehicles are used in more environments, flexible navigation strategies are required to ensure safe and reliable operation. Operation in the presence of a degraded or denied GPS signal is critical in many environments, particularly indoors, in urban canyons, and in hostile areas. Two techniques, laser-based simultaneous localization and mapping (SLAM) and monocular visual SLAM, in conjunction with inertial navigation, have attracted considerable attention in the research community. This paper presents an integrated navigation system combining both visual SLAM and laser SLAM with an EKF-based inertial navigation system. The monocular visual SLAM system maintains fully correlated vehicle and feature states. The laser SLAM system is based on Monte Carlo scan-to-map matching and leverages the visual data to reduce ambiguities in the pose matching. The system is validated in a full six-degree-of-freedom simulation and in flight tests. A key feature of the work is that the system is validated with a controller in the navigation loop.
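The abstract describes the laser matching step only at a high level. The following is a minimal, illustrative Python sketch of Monte Carlo scan-to-map matching seeded by a prior pose (for example, the visual/inertial estimate), in the spirit of the approach described. The occupancy-grid map, the Gaussian candidate sampling, the hit-fraction score, and all function and parameter names (monte_carlo_scan_match, score_pose, prior_sigma, and so on) are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def scan_endpoints(pose, ranges, angles):
    """Project 2-D laser ranges into the map frame for a pose (x, y, theta)."""
    x, y, th = pose
    beam_th = th + angles
    return np.stack([x + ranges * np.cos(beam_th),
                     y + ranges * np.sin(beam_th)], axis=1)

def score_pose(pose, ranges, angles, occ_grid, resolution):
    """Fraction of beam endpoints that land in occupied map cells."""
    pts = scan_endpoints(pose, ranges, angles)
    cells = np.floor(pts / resolution).astype(int)
    h, w = occ_grid.shape
    valid = ((cells[:, 0] >= 0) & (cells[:, 0] < w) &
             (cells[:, 1] >= 0) & (cells[:, 1] < h))
    if not valid.any():
        return 0.0
    return float(occ_grid[cells[valid, 1], cells[valid, 0]].mean())

def monte_carlo_scan_match(prior_pose, prior_sigma, ranges, angles,
                           occ_grid, resolution, n_samples=500, seed=0):
    """Sample candidate poses around the prior (e.g. the visual/inertial
    estimate) and keep the one whose scan best overlaps the occupancy map.
    Constraining samples to the prior covariance is what suppresses
    ambiguous matches in self-similar environments such as long corridors."""
    rng = np.random.default_rng(seed)
    candidates = np.asarray(prior_pose) + rng.normal(
        scale=prior_sigma, size=(n_samples, 3))
    scores = np.array([score_pose(p, ranges, angles, occ_grid, resolution)
                       for p in candidates])
    best = int(np.argmax(scores))
    return candidates[best], float(scores[best])
```

In this sketch the best-scoring candidate would then be fed back as a pose measurement to the EKF; how that fusion is actually performed in the paper is not reproduced here.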

Original language: English (US)
Title of host publication: 2014 American Control Conference, ACC 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1900-1905
Number of pages: 6
ISBN (Print): 9781479932726
DOIs
State: Published - Jan 1 2014
Event: 2014 American Control Conference, ACC 2014 - Portland, OR, United States
Duration: Jun 4 2014 - Jun 6 2014

Publication series

Name: Proceedings of the American Control Conference
ISSN (Print): 0743-1619

Other

Other: 2014 American Control Conference, ACC 2014
Country/Territory: United States
City: Portland, OR
Period: 6/4/14 - 6/6/14

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
