This paper describes methods for three sensor-driven guidance systems that have been developed, implemented, and tested. Each demonstrates an advanced capability for using onboard sensors, in this case vision systems, to drive guidance policies that steer an aircraft. The first uses an onboard camera image to enable a helicopter to automatically follow a moving ground vehicle. The second enables an aircraft to automatically follow another aircraft. The third enables a helicopter to precisely place a ground robot in a window. In all three cases there is no direct communication between the target and the guidance system; all information used to guide the aircraft is obtained from onboard sensors. Common lessons drawn from these very different applications of sensor-driven guidance for unmanned vehicles are discussed.