Symmetric nonnegative matrix factorization (SymNMF) is an unsupervised method that approximates a symmetric matrix $A$, often a similarity or kernel matrix, by a product of low-rank factors. Most existing models use a factorization of the form $A \approx HH^T$, which may yield a poor approximation when $A$ is indefinite. To address this limitation, this paper proposes a new SymNMF model based on the more general and more stable formulation $A \approx HSH^T$. To ensure a low-rank representation on the positive semidefinite cone, the LogDet divergence is incorporated as a regularization term. An alternating optimization algorithm is developed to minimize the resulting objective. Experimental results on synthetic and real-world datasets demonstrate that the proposed model outperforms or performs comparably to many state-of-the-art methods while maintaining superior computational efficiency.
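As a rough sketch of the formulation summarized above (the exact objective, the regularization weight $\lambda$, and the reference matrix in the LogDet term are not specified in this abstract and are assumed here for illustration), the model can be read as
\[
\min_{H \ge 0,\; S \succeq 0} \;\; \|A - H S H^T\|_F^2 \;+\; \lambda\, D_{\mathrm{LD}}(S, I_r),
\qquad
D_{\mathrm{LD}}(S, I_r) = \operatorname{tr}(S) - \log\det S - r,
\]
where $H \in \mathbb{R}^{n \times r}_{+}$ holds the nonnegative factors, $S$ is an $r \times r$ symmetric positive semidefinite matrix, and $D_{\mathrm{LD}}$ denotes the LogDet divergence between $S$ and the identity. Under this reading, the alternating scheme would update $H$ and $S$ in turn while the LogDet term keeps $S$ well conditioned on the positive semidefinite cone.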