Tu Ngo Hoang et al.

This work investigates cellular millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) systems within the open radio access network (O-RAN) architecture, integrating spectrum, air interface, and networking entities compatible with beyond-fifth-generation wireless networks. To overcome the O-RAN fronthaul (O-FH) load limitations and the challenges posed by the short wavelengths of mmWave bands, we design a hybrid beamforming architecture in which the digital and analog beamformers are generated at the O-RAN distributed unit and the O-RAN radio unit, respectively. Using information-theoretic analysis, we develop non-grid-of-beams analog beamformers that maximize the sum-spectral efficiency (SE) under constant-modulus constraints. For digital precoding, we apply a successive convex approximation method with second-order cone programming procedures to maximize the sum-SE while satisfying the transmit power and limited O-FH load constraints and guaranteeing user quality-of-service requirements. Suboptimal digital combiners are also designed by exploiting the inherent characteristics of the user side. However, this optimization approach suffers from long execution times, which poses challenges for near-real-time beamforming configuration. To address this issue, we propose an efficient deep learning (DL)-based digital precoding scheme with short execution time, low computational complexity, and high performance. Numerical results demonstrate that the proposed DL-based precoding scheme outperforms benchmark schemes and scales to massive MIMO configurations.
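The non-grid-of-beams analog design itself is not reproduced here, but the constant-modulus constraint it must respect can be illustrated with a common matched-filter heuristic: every phase-shifter entry has a fixed magnitude and only its phase is tunable. The sketch below (NumPy, illustrative dimensions) is an assumption-laden stand-in, not the paper's algorithm.

```python
# Minimal sketch: a constant-modulus analog beamformer built from channel
# phases (a common matched-filter heuristic, NOT the paper's non-grid-of-beams
# optimization). Dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N_t, N_rf = 64, 4                      # transmit antennas, RF chains (assumed)
H = (rng.standard_normal((N_rf, N_t)) +
     1j * rng.standard_normal((N_rf, N_t))) / np.sqrt(2)  # effective channel

# Each entry has fixed modulus 1/sqrt(N_t); only its phase varies, matching
# the constant-modulus constraint of a phase-shifter network.
F_rf = np.exp(1j * np.angle(H.conj().T)) / np.sqrt(N_t)   # N_t x N_rf

assert np.allclose(np.abs(F_rf), 1 / np.sqrt(N_t))        # constraint holds
```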
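The DL-based precoder is described only at a high level; as a minimal sketch of the idea, a small multilayer perceptron can map flattened channel features to digital precoder entries and then project the output onto the transmit power budget. The architecture, layer sizes, and power constant below are hypothetical, not the paper's trained model.

```python
# Illustrative sketch of the DL idea: an MLP maps channel features to digital
# precoder entries, then the output is scaled to satisfy the power budget.
import torch
import torch.nn as nn

K, N_rf, P_max = 4, 8, 1.0             # users, RF chains, power budget (assumed)

class PrecoderNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * K * N_rf, 256), nn.ReLU(),
            nn.Linear(256, 2 * K * N_rf),   # real/imag parts of the precoder
        )

    def forward(self, h_flat):
        out = self.net(h_flat)
        w = torch.complex(out[..., :K * N_rf], out[..., K * N_rf:])
        w = w.reshape(-1, N_rf, K)
        # Project onto the transmit power constraint ||W||_F^2 <= P_max.
        norm = torch.linalg.norm(w, dim=(-2, -1), keepdim=True)
        return w * torch.sqrt(torch.tensor(P_max)) / norm

h = torch.randn(16, 2 * K * N_rf)       # batch of flattened channel features
W = PrecoderNet()(h)                    # 16 x N_rf x K precoding matrices
```

A learned mapping like this replaces the iterative successive-convex-approximation loop at inference time, which is what yields the short execution times the abstract reports.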

Tu Ngo Hoang et al.

In this work, we investigate short-packet communications for multiple-input multiple-output underlay cognitive multihop relay Internet-of-Things (IoT) networks with multiple primary users, where IoT devices transmit and receive short packets to provide low-latency and ultra-reliable communications (uRLLC). For performance evaluation, closed-form expressions of the end-to-end (E2E) block error rate (BLER) are derived for a practical scenario with imperfect channel state information of the interference channels, from which the E2E throughput, energy efficiency (EE), latency, and reliability are characterized and an asymptotic analysis is carried out. Based on the analytical results, we adapt several state-of-the-art machine learning (ML)-aided estimators to predict the system performance in terms of the E2E throughput, EE, latency, and reliability for real-time configuration of IoT systems. We also derive closed-form expressions for the optimal power-allocation and relay-location strategies that minimize the asymptotic E2E BLER under proportional tolerable-interference-power and uRLLC constraints; these strategies require negligible computational complexity and offer significant power savings. Furthermore, the ML-based evaluation achieves accuracy equivalent to conventional analytical and simulation methods while significantly reducing execution time. Among the ML frameworks considered, the extreme gradient boosting model proves to be the most efficient estimator for future practical IoT applications.
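The paper's closed-form E2E BLER is not restated here; a standard building block for such short-packet analyses is the finite-blocklength normal approximation for the per-hop BLER, combined across the hops of a decode-and-forward chain. The SNRs, blocklength, and payload in the sketch below are illustrative values, not the paper's settings.

```python
# Hedged sketch: normal-approximation BLER for short packets and the standard
# E2E combination over a multihop decode-and-forward chain.
import numpy as np
from scipy.special import erfc

def Q(x):
    """Gaussian Q-function."""
    return 0.5 * erfc(x / np.sqrt(2))

def bler(snr, n, k):
    """Per-hop BLER for n channel uses carrying k information bits."""
    C = np.log2(1 + snr)                             # capacity (bits/use)
    V = (1 - (1 + snr) ** -2) * np.log2(np.e) ** 2   # channel dispersion
    return Q((C - k / n) * np.sqrt(n / V))

hops_snr_db = [10.0, 12.0, 9.0]                      # assumed per-hop SNRs
eps = [bler(10 ** (s / 10), n=128, k=256) for s in hops_snr_db]
e2e_bler = 1 - np.prod([1 - e for e in eps])         # a block fails if any hop fails
print(f"per-hop BLER: {eps}, E2E BLER: {e2e_bler:.3e}")
```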
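As a sketch of the ML-aided evaluation, an extreme gradient boosting regressor can be trained on analytically generated samples and then predict a performance metric far faster than Monte-Carlo simulation. The feature set and synthetic target below are hypothetical stand-ins, not the paper's dataset.

```python
# Sketch: extreme gradient boosting as a fast performance estimator. Features
# and target are synthetic placeholders for analytically generated samples.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(1)
# Hypothetical features: [tx power (dBm), hop count, blocklength, interference (dB)]
X = rng.uniform([0, 2, 64, -20], [30, 6, 512, 0], size=(5000, 4))
y = np.log1p(X[:, 0]) / X[:, 1] + rng.normal(0, 0.05, 5000)  # synthetic target

model = XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.1)
model.fit(X[:4000], y[:4000])                  # train on precomputed samples
rmse = np.sqrt(np.mean((model.predict(X[4000:]) - y[4000:]) ** 2))
print(f"hold-out RMSE: {rmse:.4f}")            # fast inference vs. simulation
```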