Development of ultra-compact, low-to-medium-precision analog-to-digital converters (ADCs) with unprecedented energy efficiency is essential to meet the ever-increasing demand for data converters in advanced computing systems, including neuromorphic accelerators based on emerging non-volatile memories. To this end, we propose, for the first time, a feedforward neural network ADC based on a network of highly scalable, CMOS-compatible, and energy-efficient ferroelectric-FinFET (Fe-FinFET) synaptic elements. Our lower triangular neural network (LTNN) ADC design, implemented using 7-nm technology from ARM along with an experimentally calibrated compact model for Fe-FinFETs, consumes 5.44 μW of power and occupies 1.03 μm² of area while operating at 1.23 megasamples per second with 4-bit precision. The proposed neural network ADC may pave the way for the realization of highly efficient neuromorphic processing engines and neuro-optimizers based on cross-point arrays of emerging non-volatile memories.
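To illustrate the lower-triangular principle referenced above, the following minimal Python sketch (not the paper's implementation; the function name, threshold placement, and normalization are illustrative assumptions) shows how each bit decision in a feedforward LTNN ADC can depend only on the analog input and the already-resolved more-significant bits, so the inter-neuron weight matrix is strictly lower triangular.

```python
import numpy as np

def ltnn_adc(v_in, n_bits=4, v_ref=1.0):
    """Behavioral sketch of a lower-triangular neural-network ADC.

    Each 'neuron' resolves one bit with a hard threshold, using only the
    analog input and the more-significant bits already decided, which is
    what makes the connection matrix strictly lower triangular.
    """
    x = v_in / v_ref * (2 ** n_bits)        # normalize input to [0, 2^n_bits)
    bits = np.zeros(n_bits, dtype=int)
    for i in range(n_bits - 1, -1, -1):     # resolve MSB first
        # weighted sum: +x from the input, -2^j from each higher bit set to 1,
        # with a threshold of 2^i for this bit's neuron
        activation = x - sum(bits[j] * 2 ** j for j in range(i + 1, n_bits)) - 2 ** i
        bits[i] = 1 if activation >= 0 else 0
    return bits[::-1]                       # MSB ... LSB

# Example: 0.65 V input with a 1 V full scale and 4-bit precision
print(ltnn_adc(0.65))   # -> [1 0 1 0]  (code 10)
```

In the hardware described in the abstract, the role of these weights would be played by the programmable conductances of the Fe-FinFET synaptic elements; the sketch only captures the ideal bit-decision logic, not device or circuit non-idealities.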