Tu Ngo Hoang

and 1 more author

In this study, we investigate multihop multiple-input multiple-output (MIMO) full-duplex relay (FDR) networks with short-packet ultra-reliable and low-latency communications (uRLLC), where transmit-antenna selection/maximum-ratio combining and transmit-antenna selection/selection combining are leveraged as diversity techniques. Under quasi-static Rayleigh-fading channels and finite-blocklength theory, the end-to-end block-error rate (BLER) of the considered FDR network is analyzed and compared with that of half-duplex relaying systems, from which the effective throughput (ETP), energy efficiency (EE), reliability, and latency are also investigated. To determine the diversity order and gain further insights for system design, an asymptotic BLER analysis in the high signal-to-noise ratio regime is provided. However, the derived analytical expressions contain non-elementary functions, making them cumbersome for practical implementation, particularly in real-time configurations. To overcome this hurdle, we introduce a deep multiple-output neural network with short execution time, low computational complexity, and highly accurate estimation. This network serves as an efficient tool for rapidly determining the required system parameters, such as transmit power and blocklength, when services request specific ETP, EE, reliability, and latency targets. To corroborate the theoretical analysis, extensive simulation results are provided under varying system parameters.
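The finite-blocklength BLER analysis above builds on the well-known normal approximation (Polyanskiy et al.); a minimal sketch of the per-link BLER it yields for a given instantaneous SNR, where the rate and blocklength arguments are generic illustrations rather than the paper's specific settings:

```python
import math

def blocklength_bler(snr, rate, n):
    """Normal-approximation block-error rate at blocklength n:
        eps ~ Q( (C(snr) - rate) * sqrt(n / V(snr)) ),
    with capacity C and channel dispersion V in bits per channel use.
    """
    cap = math.log2(1.0 + snr)                                       # capacity
    disp = (1.0 - 1.0 / (1.0 + snr) ** 2) * math.log2(math.e) ** 2   # dispersion
    if disp <= 0.0:                       # snr == 0: dispersion vanishes
        return 1.0 if rate > cap else 0.0
    arg = (cap - rate) * math.sqrt(n / disp)
    return 0.5 * math.erfc(arg / math.sqrt(2.0))   # Gaussian Q-function
```

In the multihop setting, per-hop BLERs of this form combine into the end-to-end BLER; increasing the blocklength drives the error rate down whenever the coding rate is below capacity, at the cost of latency.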

Tu Ngo Hoang

and 2 more authors

In this work, we investigate short-packet communications for multiple-input multiple-output underlay cognitive multihop relay Internet-of-Things (IoT) networks with multiple primary users, where IoT devices transmit and receive short packets to provide ultra-reliable and low-latency communications (uRLLC). For performance evaluation, closed-form expressions of the end-to-end (E2E) block-error rate (BLER) of the considered systems are derived in a practical scenario with imperfect channel state information of the interference channels, from which the E2E throughput, energy efficiency (EE), latency, reliability, and asymptotic behavior are also studied. Based on the analytical results, we adapt several state-of-the-art machine learning (ML)-aided estimators to predict the system performance in terms of E2E throughput, EE, latency, and reliability for real-time configuration of IoT systems. We also obtain closed-form expressions for the optimal power-allocation and relay-location strategies that minimize the asymptotic E2E BLER under proportional tolerable-interference-power and uRLLC constraints; these strategies require negligible computational complexity and offer significant power savings. Furthermore, the ML-based evaluation achieves accuracy equivalent to conventional analytical and simulation methods while significantly reducing the execution time. Among the ML frameworks, the extreme gradient boosting model is demonstrated to be the most efficient estimator for future practical IoT applications.
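The ML-aided estimators trade an expensive offline stage (evaluating the analytical expressions) for near-instant online queries. As a simplified stand-in for the paper's DNN/XGBoost models, the sketch below precomputes the normal-approximation BLER on an SNR grid and answers queries by interpolation; the rate, blocklength, and grid range are illustrative assumptions, not the paper's settings.

```python
import bisect
import math

def bler_normal_approx(snr, rate, n):
    """Finite-blocklength normal-approximation BLER for one link."""
    cap = math.log2(1.0 + snr)                                       # capacity
    disp = (1.0 - 1.0 / (1.0 + snr) ** 2) * math.log2(math.e) ** 2   # dispersion
    return 0.5 * math.erfc((cap - rate) * math.sqrt(n / disp) / math.sqrt(2.0))

# Offline stage: evaluate the (in general expensive) analytical BLER on a
# dense SNR grid.  Rate and blocklength here are illustrative values only.
RATE, BLOCKLEN = 2.0, 256
SNR_DB = [i / 10.0 for i in range(0, 301)]          # 0 .. 30 dB, 0.1-dB steps
BLER = [bler_normal_approx(10 ** (s / 10.0), RATE, BLOCKLEN) for s in SNR_DB]

def fast_bler(snr_db):
    """Online stage: near-instant BLER query via linear interpolation."""
    if snr_db <= SNR_DB[0]:
        return BLER[0]
    if snr_db >= SNR_DB[-1]:
        return BLER[-1]
    i = bisect.bisect_right(SNR_DB, snr_db)
    w = (snr_db - SNR_DB[i - 1]) / (SNR_DB[i] - SNR_DB[i - 1])
    return (1.0 - w) * BLER[i - 1] + w * BLER[i]
```

A learned regressor plays the same role as this lookup table but generalizes across several inputs at once (power, blocklength, relay location), which is why gradient-boosted trees and deep networks are the natural choice in the papers above.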