Vision sensor fusion for autonomous landing

Takuma Nakamura, Stephen Haviland, Dmitry Bershadsky, Eric N. Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper describes a vision-based algorithm for autonomous landing on a moving target. The algorithm fuses multiple outputs of two different computer vision techniques. One is Viola-Jones object detection using Haar-like features, and the other is AprilTag detection, which segments an image based on local gradients. The Haar-like feature detector can detect arbitrary known features, and we use this method when an aircraft is at altitude and approaches a landing spot. The AprilTag, which allows for precise position and attitude determination of the target, is placed at the expected landing location and used for the final approach. The combination of these techniques allows us to track the target through all landing phases, from altitude to touchdown. We fuse the outputs by utilizing the statistics of the measurements and multiple extended Kalman filters. This way, we can not only probabilistically choose the right target from multiple candidates but also estimate the velocity of the target for formation flight and landing. This algorithm is demonstrated in an image-in-the-loop simulation and in flight tests with a Yamaha RMAX helicopter and a WAM-V boat.
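The abstract's fusion scheme can be illustrated with a minimal sketch (not the authors' code): a constant-velocity extended Kalman filter tracks the target's position and velocity, each detector's fix is scored by its Gaussian measurement likelihood, and the most probable candidate is used for the update. The time step, noise covariances, and candidate values below are illustrative assumptions.

```python
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # detectors measure position only
Q = np.eye(4) * 1e-3                        # process noise (assumed)

def predict(x, P):
    """Propagate the state and covariance one step."""
    return F @ x, F @ P @ F.T + Q

def likelihood(z, x, P, R):
    """Gaussian likelihood of measurement z given the predicted state."""
    S = H @ P @ H.T + R
    r = z - H @ x
    return float(np.exp(-0.5 * r @ np.linalg.solve(S, r))
                 / np.sqrt((2 * np.pi) ** 2 * np.linalg.det(S)))

def update(x, P, z, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

# One fusion step: predict, score candidates from both detectors by
# likelihood, then update with the most probable one.
x = np.array([0.0, 0.0, 1.0, 0.0])
P = np.eye(4)
R_haar = np.eye(2) * 0.5    # coarse Haar-cascade fix (assumed noise)
R_tag = np.eye(2) * 0.01    # precise AprilTag fix (assumed noise)

x, P = predict(x, P)
candidates = [(np.array([0.09, 0.02]), R_tag),   # AprilTag detection
              (np.array([0.5, -0.4]), R_haar)]   # spurious Haar candidate
z, R = max(candidates, key=lambda c: likelihood(c[0], x, P, c[1]))
x, P = update(x, P, z, R)
print(np.round(x[:2], 3))
```

Because the update keeps the full state, the filter also yields the velocity estimate the abstract mentions for formation flight; the likelihood gate is what lets the filter bank reject spurious Haar candidates in favor of the precise AprilTag fix.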

Original language: English (US)
Title of host publication: AIAA Information Systems-AIAA Infotech at Aerospace, 2017
Publisher: American Institute of Aeronautics and Astronautics Inc, AIAA
ISBN (Print): 9781624104497
State: Published - Jan 1 2017
Event: AIAA Information Systems-Infotech At Aerospace Conference, 2017 - Grapevine, United States
Duration: Jan 9 2017 – Jan 13 2017

Publication series

Name: AIAA Information Systems-AIAA Infotech at Aerospace, 2017

Other

Other: AIAA Information Systems-Infotech At Aerospace Conference, 2017
Country: United States
City: Grapevine
Period: 1/9/17 – 1/13/17

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering
  • Industrial and Manufacturing Engineering


Cite this

Nakamura, T., Haviland, S., Bershadsky, D., & Johnson, E. N. (2017). Vision sensor fusion for autonomous landing. In AIAA Information Systems-AIAA Infotech at Aerospace, 2017 (AIAA Information Systems-AIAA Infotech at Aerospace, 2017). American Institute of Aeronautics and Astronautics Inc, AIAA.