The detection of faults in photovoltaic (PV) systems plays a major role in extending the lifetime of the employed modules and maintaining the system's efficiency. Machine learning (ML) algorithms are among the most popular approaches for fault detection in PV systems. In this paper, a novel approach based on the Logistic Model Tree (LMT) algorithm is proposed for the analysis of I-V curves under line-to-line, shading, degradation, and open-circuit faults on the DC side of PV systems. First, a 2x2 PV system is built in the MATLAB/Simulink environment to generate I-V curves under healthy and various fault conditions, yielding an imbalanced fault dataset. This dataset is split into training and testing sets, and the k-means undersampling method is then applied to resolve the class imbalance problem. The LMT algorithm and other ML algorithms commonly used in PV fault detection are trained on the training set using 5-fold cross-validation with grid search, and their performances are evaluated on the testing set. Finally, the Matthews Correlation Coefficient (MCC), Cohen's Kappa, and Macro F-1 score of each algorithm are computed from the confusion matrices, and the performances of the algorithms are compared. According to the findings, the LMT algorithm emerges as the top performer, with the highest MCC (0.8657443), Kappa (0.8609745), and Macro F-1 (0.8539005) scores, making it the most effective of the compared machine learning algorithms for detecting faults in PV systems.
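All three evaluation metrics named above can be derived directly from a multiclass confusion matrix. The following is a minimal illustrative sketch (not the paper's implementation; the example matrix values are hypothetical, not the study's results):

```python
import math

def metrics_from_confusion(C):
    """Compute multiclass MCC, Cohen's Kappa, and Macro F-1 from a
    confusion matrix C, where C[i][j] counts samples of true class i
    that were predicted as class j."""
    n = len(C)
    s = sum(sum(row) for row in C)           # total samples
    c = sum(C[k][k] for k in range(n))       # correctly classified samples
    t = [sum(C[k]) for k in range(n)]        # true count per class (row sums)
    p = [sum(C[i][k] for i in range(n)) for k in range(n)]  # predicted count (col sums)

    # Multiclass Matthews Correlation Coefficient (Gorodkin's generalization)
    num = c * s - sum(p[k] * t[k] for k in range(n))
    den = math.sqrt((s * s - sum(x * x for x in p)) *
                    (s * s - sum(x * x for x in t)))
    mcc = num / den if den else 0.0

    # Cohen's Kappa: observed agreement corrected for chance agreement
    p_o = c / s
    p_e = sum((t[k] / s) * (p[k] / s) for k in range(n))
    kappa = (p_o - p_e) / (1 - p_e) if p_e != 1 else 0.0

    # Macro F-1: unweighted mean of the per-class F-1 scores
    f1s = []
    for k in range(n):
        tp = C[k][k]
        fp = p[k] - tp
        fn = t[k] - tp
        f1s.append(2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0)
    macro_f1 = sum(f1s) / n

    return mcc, kappa, macro_f1

# Hypothetical 3-class confusion matrix, for illustration only
mcc, kappa, macro_f1 = metrics_from_confusion([[4, 1, 0],
                                               [1, 3, 1],
                                               [0, 1, 4]])
```

Because Macro F-1 weights every class equally regardless of its sample count, it complements MCC and Kappa when evaluating on an imbalanced fault dataset such as the one described here.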