This paper proposes a quasi-polar, local (turn-rate vs. time) occupancy grid approach for obstacle avoidance. It uses GPS and inertial navigation combined with a vision system to map sensor data directly onto dynamically feasible paths, so that path planning reduces to selecting the path with the lowest likelihood of collision. A numerical method for motion updates that copes with the differing sizes and shapes of the occupancy grid's cells is proposed, and a probability-based inverse sensor model that maps range- and bearing-based sensor data to this path-based occupancy grid is developed. Three exteroceptive sensor models (wide-field monocular vision, pushbroom stereo, and pushbroom stereo combined with wide-field monocular vision) are presented in this context. Simulations of flight through a two-dimensional environment containing both forest and urban terrain demonstrate the utility of this approach.
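The core planning idea described above can be sketched briefly. The snippet below is a hypothetical illustration, not the paper's implementation: each row of a (turn-rate x time) grid holds the occupancy probabilities of the cells swept by one dynamically feasible path, and planning reduces to picking the turn rate whose path has the lowest collision likelihood. The grid values, turn-rate range, and the independence assumption across cells are all assumptions for the sake of the example.

```python
import numpy as np

# Hypothetical quasi-polar (turn-rate x time) occupancy grid:
# rows index candidate turn rates, columns index time steps along
# the corresponding dynamically feasible path.
rng = np.random.default_rng(0)
turn_rates = np.linspace(-1.0, 1.0, 21)           # rad/s (assumed range)
occupancy = rng.uniform(0.0, 0.3, size=(21, 15))  # P(each cell occupied)

def path_collision_probability(cell_probs):
    """P(collision) along one path, assuming cells are independent."""
    return 1.0 - np.prod(1.0 - cell_probs)

# Path planning as simple selection: the path (turn rate) whose
# swept cells give the lowest overall collision probability wins.
collision = np.array([path_collision_probability(row) for row in occupancy])
best_turn_rate = turn_rates[np.argmin(collision)]
```

Because each path's cost is a single scalar, selection is a one-line argmin, which is the simplicity the abstract claims for this grid parameterization.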