Rangeet Mitra

Additive non-Gaussian noise processes and nonlinear channel characteristics present a performance bottleneck for next-generation communication systems, since most classical links are designed under Gaussian-noise and linear-system assumptions. The joint effect of these impairments not only makes channel characterization difficult, but also makes optimal receiver design non-trivial and complex. In such scenarios, the joint application of reproducing kernel Hilbert space (RKHS) based machine learning methods and information-theoretic learning (ITL) criteria allows for near-optimal mitigation of channel nonlinearity and non-Gaussian noise in upcoming beyond-5G systems. Across different deployment scenarios, RKHS based learning offers hyperparameter independence, improves generalization in the low-data regime, and complements existing signal processing solutions such as deep learning. Moreover, existing ITL criteria based algorithms are found to deliver improved machine learning performance under non-Gaussian additive noise. Furthermore, the analytical frameworks developed for RKHS based approaches readily accommodate extensions of classical performance analysis methods; existing works report rigorous analytical results and establish the near-optimality of RKHS based approaches. This paper highlights the benefits and the potential of hyperparameter-free RKHS and ITL based methods for simple and generic mitigation of unknown underlying hardware impairments and noise processes.
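To make the combination of RKHS learning and an ITL cost concrete, the sketch below implements one well-known instance of this family: a kernel least-mean-square (KLMS) filter trained under the maximum correntropy criterion (MCC), applied to a toy nonlinear channel with impulsive (non-Gaussian) noise. This is an illustrative example only, not the specific algorithm proposed in the paper; the channel model, kernel bandwidths, and step size are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(x, y, sigma=1.0):
    # Reproducing kernel of the RKHS: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def klms_mcc(X, d, eta=0.5, kernel_sigma=1.0, mcc_sigma=1.0):
    """KLMS filter trained under the maximum correntropy criterion,
    an ITL cost that is robust to impulsive noise. Returns the centers
    and coefficients of the learned RKHS function expansion."""
    centers, alphas = [], []
    for x, target in zip(X, d):
        y = sum(a * gaussian_kernel(c, x, kernel_sigma)
                for c, a in zip(centers, alphas))
        e = target - y
        # Correntropy-induced weight: large (impulsive) errors are
        # exponentially down-weighted, unlike under a plain MSE cost.
        w = np.exp(-e ** 2 / (2 * mcc_sigma ** 2))
        centers.append(x)
        alphas.append(eta * w * e)
    return centers, alphas

def predict(centers, alphas, x, kernel_sigma=1.0):
    return sum(a * gaussian_kernel(c, x, kernel_sigma)
               for c, a in zip(centers, alphas))

# Toy nonlinear channel (assumed): r = tanh(0.8 s) + impulsive noise,
# where 5% of samples carry a large-variance Gaussian impulse.
s = rng.uniform(-1, 1, 300)
noise = np.where(rng.random(300) < 0.05,
                 rng.normal(0.0, 3.0, 300),
                 rng.normal(0.0, 0.05, 300))
r = np.tanh(0.8 * s) + noise

# Learn the inverse mapping r -> s (a simple nonlinear equalizer).
X = r.reshape(-1, 1)
centers, alphas = klms_mcc(X, s)
est = np.array([predict(centers, alphas, x) for x in X])
print("training MSE:", np.mean((est - s) ** 2))
```

Note how the only cost-specific change relative to ordinary KLMS is the scalar weight `w`: under MCC, samples hit by an impulse contribute almost nothing to the update, which is exactly the robustness to non-Gaussian noise the abstract refers to.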