Previous versions of OPC used a Random Forest to perform the data fusion. In this work, deep Convolutional Neural Networks (CNNs) are shown to be a more effective model for fusing heterogeneous geospatial data to create radar-like analyses of precipitation intensity and storm height. The CNN trained in this effort has a directed acyclic graph (DAG) structure that takes inputs from multiple sources with varying spatial resolutions. These data sources include geostationary satellite data (one 1 km visible band and four 4 km infrared bands), lightning flash density from Earth Networks' Total Lightning Network, and numerical model data from the National Oceanic and Atmospheric Administration's 13 km Rapid Refresh model. Training data for the model are sampled from areas over land and near shore so that NEXRAD can be used as ground truth in a regression model. The trained CNN is applied outside radar coverage and fused with analyses from NEXRAD to create a seamless radar mosaic that extends to offshore sectors and beyond. The model is validated against both land-based radar and spaceborne radar from NASA's Global Precipitation Measurement (GPM) mission's Core Observatory satellite. We will discuss and demonstrate the advantages that deep CNNs provide over the Random Forest model for performing the data fusion. Sensor pre-training and feature learning will also be discussed.
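To illustrate the kind of DAG structure described above, the sketch below shows a toy multi-resolution fusion network in pure NumPy: one branch convolves a fine-resolution (1 km) visible patch, a second branch upsamples a coarse (4 km) infrared patch onto the fine grid before convolving it, and a fusion node combines the two feature maps into a per-pixel radar-like regression output. All shapes, weights, and the fusion/regression coefficients here are hypothetical placeholders, not the trained OPC network.

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2-D convolution of a single-channel image with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def upsample(x, factor):
    """Nearest-neighbour upsampling to match a finer input grid."""
    return np.kron(x, np.ones((factor, factor)))

rng = np.random.default_rng(0)

# Hypothetical inputs: a 16x16 patch of 1 km VIS data and the
# co-located 4x4 patch of 4 km IR data (same geographic footprint).
vis = rng.random((16, 16))
ir = rng.random((4, 4))

# Branch 1: convolve the fine-resolution VIS patch (3x3 kernel -> 14x14),
# followed by a ReLU nonlinearity.
vis_feat = np.maximum(conv2d(vis, rng.random((3, 3))), 0)

# Branch 2: upsample the coarse IR patch by 4x to the 1 km grid,
# then convolve with its own kernel (also 14x14 after valid conv).
ir_feat = np.maximum(conv2d(upsample(ir, 4), rng.random((3, 3))), 0)

# Fusion node of the DAG: combine the branch feature maps
# (a 1x1 convolution over stacked channels reduces to a weighted sum here).
fused = 0.5 * vis_feat + 0.5 * ir_feat

# Regression head: per-pixel linear map to a radar-like intensity estimate.
w, b = 0.8, 2.0  # placeholder learned parameters
radar_estimate = w * fused + b
print(radar_estimate.shape)
```

In the real network each branch would carry many channels and several convolutional layers, and the weights would be fit by regressing against NEXRAD over land; the point of the sketch is only how differently gridded inputs can merge at a common-resolution fusion node within one DAG.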