In the context of learning and inference over non-Gaussian additive noise processes encountered in modern circuits and systems, several non-Bussgang learning criteria have emerged, such as the maximum correntropy, minimum error entropy, and maximum Versoria criteria. However, these existing learning criteria depend on hyperparameters, such as the shape and spread parameters. Moreover, some of these learning methods depend on a suitable choice of information potential (IP) for general non-Gaussian noise statistics. This work proposes an online hyperparameter-free criterion learning algorithm that alleviates the dependence on hyperparameter choices and learns the IP by self-adapting to the underlying noise distribution. Analytical results are derived for the proposed hyperparameter-free criterion learning, and case studies are presented for its validation.
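As an illustration of the hyperparameter dependence mentioned above, consider the standard Gaussian-kernel form of the maximum correntropy criterion and the sample quadratic information potential maximized under the minimum error entropy criterion; the symbols below ($e_k$, $d_k$, $\mathbf{x}_k$, $\sigma$) are generic adaptive-filtering notation used for illustration and are not drawn from this work:
\[
J_{\mathrm{MCC}}(\mathbf{w}) \;=\; \mathbb{E}\!\left[\exp\!\left(-\frac{e_k^{2}}{2\sigma^{2}}\right)\right],
\qquad e_k \;=\; d_k - \mathbf{w}^{\top}\mathbf{x}_k,
\]
\[
\hat{V}(e) \;=\; \frac{1}{N^{2}} \sum_{i=1}^{N}\sum_{j=1}^{N} \exp\!\left(-\frac{(e_i - e_j)^{2}}{2\sigma^{2}}\right),
\]
where the kernel width $\sigma$ is the spread hyperparameter that must be tuned to the (generally unknown) noise statistics; the proposed criterion learning removes the need for such tuning.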