An unsupervised machine learning algorithm utilizing nonlinear dimensionality reduction has been developed to interface with spectroscopic measurements from an atomic rubidium optical magnetometer. The magnetometer uses Electromagnetically Induced Transparency (EIT) optical spectroscopy with atomic vapor cells to observe magneto-sensitive spin transitions of the 87Rb atom. The Rb EIT measurements yield seven transmission peaks associated with the available two-photon resonances between the Rb D1 line Zeeman sublevels. The algorithm correlates the waveform of the EIT spectra with the direction of the local magnetic field using kernel principal component analysis (KPCA). The analysis, implemented with a radial basis function (RBF) kernel trick, was found to neatly cluster the EIT spectra according to the direction of the local magnetic field in the reduced-dimensional feature space. The KPCA projections traced a smooth curve in a three-dimensional feature space, and the resulting curve was modeled with a support vector regression (SVR) machine. We found that when the direction of the magnetic field is initially undefined, the measurement accuracy is better than 3 degrees in both the azimuthal and longitudinal angles with respect to the true direction of the magnetic field. If the magnetometer is configured so that the azimuthal angle of the field is defined, the KPCA-SVR algorithm is capable of predicting the longitudinal angle of the local magnetic field to within 1 degree. These results suggest that dimensionality reduction methods may be a powerful tool for approaching regression problems in optical spectroscopy.
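
The two-stage pipeline described above (KPCA feature extraction followed by SVR) can be illustrated with a minimal sketch using scikit-learn. The array names, placeholder data, and all hyperparameter values (gamma, C, epsilon, number of components beyond the three-dimensional feature space mentioned in the abstract) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a KPCA + SVR pipeline for EIT spectra, assuming
# scikit-learn. Data and hyperparameters below are placeholders.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# eit_spectra: (n_measurements, n_frequency_points) transmission spectra
# field_angles: (n_measurements,) longitudinal angle of the field (deg)
rng = np.random.default_rng(0)
eit_spectra = rng.normal(size=(200, 512))    # placeholder spectra
field_angles = rng.uniform(0, 90, size=200)  # placeholder angles

# Stage 1: nonlinear dimensionality reduction with an RBF kernel,
# keeping three components to form the reduced feature space.
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
features = kpca.fit_transform(eit_spectra)

# Stage 2: regress the field angle onto the 3-D KPCA features with SVR.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
svr.fit(features, field_angles)

# Predict the field angle for a new spectrum.
new_spectrum = rng.normal(size=(1, 512))
predicted_angle = svr.predict(kpca.transform(new_spectrum))
print(f"Predicted longitudinal angle: {predicted_angle[0]:.1f} deg")
```

In practice, the placeholder arrays would be replaced by measured EIT transmission spectra and the corresponding field angles, and the kernel parameters would be tuned (e.g., by cross-validation) rather than fixed as shown here.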