Introduction

Nakamoto Terminal (NTerminal) is a modular, flexible data aggregation and analysis platform. Its commercial-off-the-shelf (COTS) web-based platform offers data monitoring and surveillance tools spanning the entire financial ecosystem, including data and analysis for blockchain technology, persons of interest, dark web transaction data, and cryptoasset infrastructure.

Content Delivery Chain

Raw data enters NTerminal via source grabbers. Our content delivery chain routes that data to other components: low-latency data consumers (which receive the data directly), a series of processors, or our platform implemented on Splunk.

Source Grabbers

NTerminal grabbers collect data from a multitude of sources across all three of our data segments: market data, technical data, and natural language data. How we grab the data depends on the type of data being collected. NTerminal aggregates data from open APIs and through data-sharing licenses with specific groups.

Processors

NTerminal's processors extract intelligence from our data and create new analysis streams that are pushed back into our content delivery chain and ultimately delivered to our customers. NTerminal's Natural Language Processor (NLP) and Machine Learning Processor (MLP) are two examples of processors we would use to fulfill the requirements of the RFP. Both are described here.

Natural Language Processor

The NLP processes natural language data from sources such as Reddit, Slack, Twitter, news articles, blogs, and documents from business and financial regulators. It normalizes the data format, extracts named entities, analyzes media files, and retrieves text sentiment. Based on the presence of keywords, such as currency symbols or the names of individuals, it isolates relevant pieces of information and generates analytical reports.

Machine Learning Processor

The MLP helps clients uncover insights and trends in the data we provide.
Historical events are used to build training sets from which algorithms learn pattern-based processing. Once trained, machine learning modules facilitate data transformations between content storage, agents, and network tools, synthesizing input into a functional form. These transformations simplify events in relation to previous data or outside agent activity, effectively contextualizing the data for simulating and learning pattern-based network behavior.
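As a minimal sketch of the keyword-driven isolation described under the Natural Language Processor above, the fragment below filters a message stream for tracked currency symbols and attaches a naive keyword-count sentiment score. The keyword lists, function name, and scoring scheme are illustrative assumptions, not NTerminal's actual implementation.

```python
import re

# Hypothetical keyword sets; the real NTerminal keyword lists are not public.
CURRENCY_SYMBOLS = {"BTC", "ETH", "USD"}
POSITIVE = {"surge", "gain", "rally"}
NEGATIVE = {"crash", "hack", "loss"}

def extract_relevant(messages):
    """Keep only messages mentioning a tracked currency symbol,
    attaching a naive sentiment score (positive hits minus negative hits)."""
    reports = []
    for text in messages:
        tokens = set(re.findall(r"\w+", text))
        symbols = tokens & CURRENCY_SYMBOLS
        if not symbols:
            continue  # no keyword of interest: drop the message
        score = len(tokens & POSITIVE) - len(tokens & NEGATIVE)
        reports.append({"text": text,
                        "symbols": sorted(symbols),
                        "sentiment": score})
    return reports

feed = [
    "BTC continues its rally after the upgrade",
    "Weather is nice today",
    "Exchange hack causes ETH loss",
]
for report in extract_relevant(feed):
    print(report)
```

A production processor would replace the keyword-count score with a trained sentiment model and add named-entity extraction, but the filter-then-annotate shape of the pipeline is the same.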