Traditional numerical solvers for linear-algebra problems such as dictionary learning (DictL) face challenges on large data owing to super-linear computational complexity and massive memory requirements, making such problems difficult to solve on commonly available computers. We propose employing a linear neural network (NN) as an alternative solver for such problems. Specifically, we demonstrate a linear fully-connected (FC) NN-based solver for DictL, in which the dictionary atoms are learned by the network instead of by the classically used K-singular value decomposition (K-SVD), while the sparse coefficients are learned using the classical orthogonal matching pursuit (OMP) approach. We compare the computational complexity of the classical approach against our proposed FCNN-based approach, implemented on both CPU and GPU, using synthetically generated datasets of varying sizes. Further, we demonstrate practical utility in image denoising with DictL, showing that the solution obtained from our FCNN-based solver is equivalent to that of the traditional K-SVD approach. We achieve a notable speedup over the traditional technique in both CPU and GPU implementations.
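As a rough illustration of the scheme the abstract describes — alternating classical OMP sparse coding with a gradient-based dictionary update, which is what training a single linear FC layer to reconstruct the data amounts to — here is a minimal NumPy sketch. All function names, sizes, and hyperparameters (`n_atoms`, `sparsity`, `lr`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def omp(D, y, k):
    """Greedy OMP: select k atoms of D that best approximate y."""
    residual = y.copy()
    idx = []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
        idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef              # re-fit, update residual
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

def dictl_nn_sketch(Y, n_atoms=8, sparsity=3, iters=20, lr=0.5):
    """Alternate OMP (sparse coding, fixed D) with a gradient step on D,
    mimicking a linear FC layer trained to reconstruct Y ≈ D @ X."""
    rng = np.random.default_rng(0)
    d, n = Y.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0)                   # unit-norm atoms
    for _ in range(iters):
        X = np.stack([omp(D, Y[:, i], sparsity) for i in range(n)], axis=1)
        grad = (D @ X - Y) @ X.T / n                 # grad of 0.5*||Y - D X||^2
        D -= lr * grad                               # SGD-style dictionary update
        D /= np.linalg.norm(D, axis=0) + 1e-12       # renormalize atoms
    return D, X
```

In a K-SVD solver the dictionary update would instead solve a rank-1 SVD per atom; the gradient step above is the piece a linear FCNN replaces, which is also what makes the method amenable to GPU execution.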