This paper describes recent advances in the autonomous visual landing of aircraft in three operational scenarios. The first proposes a robust visual target and an algorithm that tracks it. The second addresses the case where a prepared visual target is unavailable and the vehicle must land on an arbitrary target. Both the first and second methods are evaluated with a mobile target. The third addresses landing on an unprepared static target in GPS-denied environments. A common thread throughout all approaches is the estimation not only of the system states but also of the error covariances of the target and vehicle. The error covariance can then be used to assess the quality of the estimate during the approach and to engage a contingency maneuver if necessary. The approaches are validated in high-fidelity simulation and in flight testing. Landing-pad tracking is shown to be accurate and robust to changes in viewpoint and distance, and GPS-denied landing exhibits low error and robustness to landing-zone appearance.
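The covariance-monitoring idea above can be illustrated with a minimal sketch: a constant-velocity Kalman filter tracking the landing-target position, where the trace of the error covariance is compared against a threshold to decide whether to continue the approach or engage a contingency (wave-off). All names, dynamics, noise values, and the threshold here are illustrative assumptions, not the estimator used in the paper.

```python
import numpy as np

dt = 0.1                            # filter update period [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity state transition
H = np.array([[1.0, 0.0]])             # vision measures position only
Q = np.diag([1e-3, 1e-2])              # process noise (assumed)
R = np.array([[0.05]])                 # measurement noise (assumed)

x = np.zeros((2, 1))                   # state: [target position, velocity]
P = np.eye(2)                          # initial error covariance

COV_THRESHOLD = 4.0                    # illustrative wave-off threshold


def step(z):
    """One predict/update cycle. z is the position measurement, or None
    when the vision system lost the target this frame. Returns True when
    the uncertainty is large enough to warrant a contingency maneuver."""
    global x, P
    # Predict: the covariance grows according to the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:
        # Update: a measurement shrinks the covariance.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
    # Health check: trace(P) summarizes the estimation uncertainty.
    return np.trace(P) > COV_THRESHOLD


# With regular measurements the filter stays healthy; during a long
# measurement dropout the covariance grows until the flag is raised.
healthy = [step(0.0) for _ in range(20)]
dropout = [step(None) for _ in range(200)]
print(any(healthy), bool(dropout[-1]))
```

During the dropout, prediction-only steps inflate the position variance through the velocity coupling, so the trace crosses the threshold well before the 200 steps elapse; this is the mechanism by which a covariance estimate can be used as an online status indicator for the approach.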