This study introduces a machine learning approach to address the challenge of limited real-time flow data in river basins, particularly for calibrating large-scale hydrologic models. Such models often rely on uncertain parameter transfers from gauged to ungauged regions, which hinders accurate prediction. To mitigate this, we propose using long short-term memory (LSTM) networks to estimate historical streamflow in ungauged watersheds, effectively creating “surrogate gauges.” Our method treats watersheds as interconnected systems, leveraging downstream flow information to constrain and improve upstream flow estimates. By training and testing the LSTM on synthetic networks of conceptual “leaky bucket” watershed models, we demonstrate that it can outperform traditional methods. Notably, because watersheds can be represented as paired upstream-downstream networks, this concept generalizes to any ungauged portion of a real basin, improving flow estimates in data-scarce settings. This work is a proof of concept: it highlights the potential to substantially improve hydrologic modeling in data-scarce regions and provides a scalable method for watershed network analysis. Future work will apply the model to real-world basins to validate its performance and scalability for continental-scale applications. Ultimately, this research aims to contribute to more accurate and reliable hydrologic predictions.
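
To make the setup concrete, the following is a minimal sketch, under assumed parameter values, of how a conceptual “leaky bucket” (linear-reservoir) model could generate synthetic flows for a paired upstream-downstream network; the function names, rate constants, and precipitation forcing are illustrative assumptions, not the authors' actual configuration.

```python
# Illustrative sketch only: a linear-reservoir ("leaky bucket") watershed model
# used to generate synthetic upstream/downstream flow pairs. All names and
# parameter values are hypothetical, not the paper's configuration.
import numpy as np

def leaky_bucket(precip, k=0.1, storage0=10.0):
    """Simulate streamflow from a single conceptual leaky-bucket watershed.

    precip   : array of precipitation inputs per time step
    k        : outflow rate constant (fraction of storage released per step)
    storage0 : initial water storage
    """
    storage = storage0
    flow = np.empty(len(precip), dtype=float)
    for t, p in enumerate(precip):
        storage += p        # add rainfall to storage
        q = k * storage     # outflow proportional to current storage
        storage -= q
        flow[t] = q
    return flow

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    precip = rng.gamma(shape=2.0, scale=1.5, size=365)  # synthetic daily rain

    # Two "ungauged" upstream buckets draining to one "gauged" downstream point.
    q_up1 = leaky_bucket(precip, k=0.08)
    q_up2 = leaky_bucket(precip, k=0.15)
    q_down = q_up1 + q_up2   # downstream flow aggregates the upstream flows

    # In the approach described above, an LSTM would be trained to recover the
    # upstream flows (q_up1, q_up2) given forcing data and the downstream flow.
    print(q_down[:5])
```

In this paired framing, the downstream record acts as the constraint that the network learns to disaggregate into upstream contributions, which is the role the “surrogate gauge” plays for the ungauged portion of a basin.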