Mathematics and methods for integrating camera measurements with inertial sensors for terrain relative navigation of a space vehicle are discussed. A pinhole camera model of the vision sensors, in conjunction with measurement models of typical inertial sensors, is used to derive a position and attitude fix for the navigation state of the space vehicle. An ancillary frame initialization process is derived that exploits the three-dimensional translational motion geometry of the space vehicle to obtain uncertain estimates of the feature locations. A linear covariance analysis is carried out to derive the conditional state uncertainties of the feature locations, which are utilized by the filter in a second pass. The state estimation approaches are tested using data obtained from a high-fidelity rendering engine developed by the team. Experimental data obtained from a medium-fidelity terrain relative navigation emulation testbed, the Navigation, Estimation, and Sensing Testbed (NEST), are used to demonstrate the utility of the filter formulations developed herein.
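To make the vision measurement model concrete, the following is a minimal sketch of the standard ideal pinhole projection referenced above, assuming a focal length f and a feature position expressed in the camera frame with the boresight along +z; the symbols and function name are illustrative, not the paper's notation, and real sensor models would add lens distortion and noise terms.

```python
import numpy as np

def pinhole_project(p_cam, f=1.0):
    """Project a feature position p_cam = [x, y, z] (camera frame, z > 0
    along the boresight) onto the image plane under the ideal pinhole
    model: u = f * x / z, v = f * y / z."""
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("feature must lie in front of the camera (z > 0)")
    return np.array([f * x / z, f * y / z])

# Example: a feature 2 m right, 1 m up, and 10 m ahead of the camera,
# viewed through a 50 mm (0.05 m) focal-length lens
uv = pinhole_project(np.array([2.0, 1.0, 10.0]), f=0.05)
```

In a filter, this projection (linearized about the current state estimate) supplies the measurement Jacobian relating image-plane observations to the vehicle and feature states.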