In the last decade there has been exponential growth in the technology and application of unmanned air vehicles (UAVs), or "drones". Once simple radio-controlled aircraft, modern UAVs can operate autonomously, in all conditions, and carry significant payloads. Civilian applications include package delivery, inter- and intra-campus transport, and electric vertical takeoff and landing (eVTOL) urban air mobility (UAM) for materials and people. In the defense sector, the US military has long used UAVs for intelligence, surveillance, and reconnaissance (ISR) and for tactical operations. UAV and drone technology is now available to our adversaries, having been used to target oil facilities, military bases, and operations. For commercial and non-hostile applications, tracking air vehicles, atmospheric conditions, and weather is critical to the widespread adoption of manned and unmanned air vehicles (M/UAVs). However, neither transponders and beacons integrated into M/UAVs nor a radar-based air traffic control (ATC) system is practical, due to spectrum regulation, development time, weight, and cost. For non-cooperative and hostile applications, the lack of terrestrial detection and tracking is a critical security gap; "drone sensors" are required that can detect, classify, and locate a hostile drone to protect military operations. Currently no passive M/UAV or drone sensor is available for either civilian or military applications.

To address these critical gaps, 0BD and WRC are developing technology that exploits ubiquitous broadcast (FM, TV), cellular network (4G, 5G), and other wireless signals as illumination sources of opportunity. These illumination sources are passively reflected off M/UAVs or phase-shifted by weather and the atmosphere. By collecting and analyzing these reflected and phase-shifted signals, we demonstrated in Phase 1 and previous work the ability to detect, classify, and locate M/UAVs and to measure changes in atmospheric and meteorological conditions.
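The passive detection principle described above, correlating a reference copy of the broadcast waveform against a surveillance channel to find delayed, Doppler-shifted target echoes, is commonly implemented with a cross-ambiguity function. The sketch below is illustrative only, assuming idealized reference and surveillance channels and toy signal sizes; it is not the proposed system's processing chain.

```python
import numpy as np

def cross_ambiguity(ref, surv, max_delay, doppler_bins):
    """Cross-ambiguity function: correlate the surveillance channel
    against delayed, Doppler-shifted copies of the reference signal."""
    N = len(ref)
    n = np.arange(N)
    caf = np.zeros((max_delay, len(doppler_bins)))
    for d, fd in enumerate(doppler_bins):
        # Remove the candidate Doppler shift, then correlate at each delay.
        shifted = surv * np.exp(-2j * np.pi * fd * n / N)
        for tau in range(max_delay):
            caf[tau, d] = abs(np.dot(shifted[tau:], np.conj(ref[:N - tau])))
    return caf

# Toy scenario: a noise-like broadcast waveform and a single echo with a
# known delay (5 samples) and Doppler shift (3 cycles per window).
rng = np.random.default_rng(0)
N = 256
n = np.arange(N)
ref = rng.standard_normal(N) + 1j * rng.standard_normal(N)
tau0, fd0 = 5, 3.0
surv = np.zeros(N, dtype=complex)
surv[tau0:] = ref[:N - tau0] * np.exp(2j * np.pi * fd0 * n[tau0:] / N)

dopplers = np.arange(-8.0, 9.0)
caf = cross_ambiguity(ref, surv, max_delay=16, doppler_bins=dopplers)
# The CAF peak recovers the echo's delay and Doppler.
tau_hat, d_hat = np.unravel_index(np.argmax(caf), caf.shape)
```

In practice the reference channel must be recovered or demodulated from the broadcast itself, and direct-path interference suppressed before correlation; the double loop here would also be replaced by FFT-based batch processing for real signal lengths.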
In this Phase 2 STTR we propose to develop, prototype, and fabricate RF receivers and integrated signal processing to capture the opportunistic signals reflected by M/UAVs, and to use Deep Convolutional Neural Networks (DCNNs) that employ Transfer Learning (TL) to autonomously detect, classify, and track M/UAVs, outputting data that can be shared with other command and control systems.
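The transfer-learning idea can be sketched at a high level: early pretrained layers are kept frozen as a fixed feature extractor, and only a new classification head is retrained on the target data. The minimal numpy sketch below illustrates that pattern only; the random projection, layer sizes, and synthetic two-class data are placeholders, not the proposed DCNN or its training set.

```python
import numpy as np

rng = np.random.default_rng(1)

def frozen_features(x, W):
    """'Pretrained' layers: W stays frozen while fine-tuning the head."""
    return np.tanh(x @ W)

def train_head(feats, labels, epochs=300, lr=0.5):
    """Retrain only the new classification head (logistic regression)."""
    w = np.zeros(feats.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(feats @ w)))       # sigmoid output
        w -= lr * feats.T @ (p - labels) / len(labels)  # gradient step
    return w

# Toy two-class data standing in for target vs. clutter signatures.
x = np.vstack([rng.normal(-1.0, 1.0, (50, 8)),
               rng.normal(+1.0, 1.0, (50, 8))])
y = np.concatenate([np.zeros(50), np.ones(50)])

W_pretrained = rng.standard_normal((8, 16))  # frozen, never updated
feats = frozen_features(x, W_pretrained)
w_head = train_head(feats, y)
acc = np.mean((feats @ w_head > 0) == y)
```

The design point is that only the small head is trained, which is what lets a DCNN pretrained on large image corpora be adapted to comparatively scarce labeled RF signature data.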