Automated anomaly detection in time-series data has important applications across many practical domains. In such settings, sequence patterns must often be evaluated in real time under limited computational resources and scarce training data, which presents significant challenges. Reservoir computing models can help reconcile the trade-off between computational cost and precision in these situations. We propose the Mahalanobis distance of reservoir states (MD-RS) method as an alternative to standard reservoir-based methods that rely on prediction error. MD-RS can effectively highlight anomalies because it measures the deviation from the subspace that minimally encompasses reservoir responses to normal inputs. The semi-supervised MD-RS method requires only training data labeled as normal, making it suitable for practical scenarios with limited training data. Both training and inference can be performed online at low computational cost, enabling real-time anomaly detection. We apply the MD-RS method to a benchmark dataset with a short training phase from the UCR time series anomaly archive and demonstrate its superiority in accuracy, reliability, and robustness over prediction-error-based methods employing the same reservoir. Additionally, we propose the response dimension increase (RDI) index for initial hyperparameter tuning, which measures the increase in the effective dimension of the reservoir trajectories. These results position our method as a leading choice for applications requiring rapid and accurate anomaly detection.
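The core idea described above can be sketched in a few lines of code. The following is a minimal, illustrative implementation, not the paper's exact algorithm: it assumes a standard echo state network as the reservoir, fits the mean and (regularized) covariance of reservoir states on normal data only, and scores test points by their Mahalanobis distance from that fit. All parameter values (reservoir size, spectral radius, the ridge term on the covariance) are illustrative choices, and the paper's update rule and regularization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical minimal echo state network reservoir with tanh units.
N = 50                                           # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, size=(N, 1))       # input weights
W = rng.uniform(-0.5, 0.5, size=(N, N))          # recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with a scalar input series u; return state trajectory."""
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(W_in[:, 0] * ut + W @ x)
        states.append(x.copy())
    return np.array(states)

# Training: fit mean and covariance of reservoir states on NORMAL data only.
u_train = np.sin(0.2 * np.arange(1000))          # normal input: a sine wave
X = run_reservoir(u_train)[100:]                 # discard the initial transient
mu = X.mean(axis=0)
C = np.cov(X, rowvar=False) + 1e-3 * np.eye(N)   # ridge keeps C invertible
C_inv = np.linalg.inv(C)

def md_score(u):
    """Mahalanobis distance of each reservoir state from the normal-data fit."""
    d = run_reservoir(u) - mu
    return np.sqrt(np.einsum('ti,ij,tj->t', d, C_inv, d))

# Inference: the same sine wave with an anomalous burst in the middle.
u_test = np.sin(0.2 * np.arange(500))
u_test[250:260] += 2.0
scores = md_score(u_test)
print("normal-region mean score:", scores[150:249].mean())
print("anomaly-region max score:", scores[250:280].max())
```

Because the reservoir responses to normal input trace out a low-dimensional subspace, states driven by anomalous input fall outside it and receive large Mahalanobis scores; the burst region should therefore score well above the normal baseline. Training here is a one-shot covariance fit, but the mean and covariance can equally be updated online with streaming estimates, matching the online setting described in the abstract.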