As unmanned aircraft systems (UAS) become more prevalent, there is an increasing desire to automate UAS navigation and control. To perform a wider variety of missions, future UASs must be capable of autonomous relative navigation. Current technologies rely heavily on GPS measurements, which is undesirable because GPS signals may be unavailable in many DoD applications. Active sensing technologies are also undesirable due to their increased size, weight, and power (SWaP) and the need to limit emissions and communications for covert operations. A system is therefore needed to provide visual relative navigation (VRN). Toyon and Caltech propose to develop a Visual Relative Navigation via Intelligent Ephemeral Relationships (VRNIER) system consisting of passive optical sensors and the processing required to convert imagery into accurate six-degree-of-freedom (6-DOF) relative pose estimates. The approach combines computer vision and deep learning for estimating relative platform relationships with online integrity modeling to enable automated VRN. All of the key algorithms will be developed and demonstrated with simulated and surrogate data in Phase I, creating a low-risk path to demonstrating a prototype system in Phase II. The work leverages Toyon's extensive history in automated image processing for navigation and other applications.
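As a point of reference for what a six-degree-of-freedom relative estimate contains, the minimal sketch below computes the pose of one platform expressed in another platform's body frame, assuming each pose is given as a rotation matrix and a world-frame position. All names and the pure-Python matrix helpers here are illustrative assumptions for exposition, not part of the proposed VRNIER system:

```python
import math

def rot_z(yaw):
    """Rotation matrix for a yaw (heading) rotation about the z axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_mul(A, B):
    """3x3 matrix product A * B."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    """Transpose of a 3x3 matrix (inverse of a rotation matrix)."""
    return [[A[j][i] for j in range(3)] for i in range(3)]

def mat_vec(A, v):
    """3x3 matrix times a 3-vector."""
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def relative_pose(R_a, t_a, R_b, t_b):
    """6-DOF pose of platform B expressed in platform A's body frame.

    Three degrees of freedom come from the relative rotation R_rel and
    three from the relative translation t_rel.
    """
    R_rel = mat_mul(transpose(R_a), R_b)
    d = [t_b[i] - t_a[i] for i in range(3)]
    t_rel = mat_vec(transpose(R_a), d)
    return R_rel, t_rel

# Illustrative example: platform B sits 10 m ahead of A, yawed 90 degrees.
R_rel, t_rel = relative_pose(rot_z(0.0), [0.0, 0.0, 0.0],
                             rot_z(math.pi / 2), [10.0, 0.0, 0.0])
yaw_deg = math.degrees(math.atan2(R_rel[1][0], R_rel[0][0]))
print(round(yaw_deg, 1), [round(x, 6) for x in t_rel])  # 90.0 [10.0, 0.0, 0.0]
```

In the proposed system these relative poses would be estimated from passive imagery rather than known truth states; the sketch only fixes the geometric quantity the image processing must recover.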