AI-Based Approach Speeds and Stabilizes 5G/6G Wireless Systems
Cellular wireless networks continue to extend their bandwidths and frequency ranges as they add users, but they also face increasingly complex signal conditions along the way. Researchers at Incheon National University (INU), Incheon, Republic of Korea, have shown how artificial intelligence (AI) can improve 5G and 6G wireless system performance by correcting potential network errors.
Using a transformer-based AI model, the researchers were able to predict signal patterns from rapidly moving users and their devices. The model tracks short- and long-term signal patterns and prioritizes key signal parameters to aid the operation of cellular base stations.
The researchers’ AI-driven approach reduces errors and enhances network reliability even for high-speed data links, by compressing and organizing channel state information (CSI) at the base stations of wireless networks, including 5G and 6G networks. The technique was recently presented in Vol. 23, No. 12 of IEEE Transactions on Wireless Communications, “Transformer-Assisted Parametric CSI Feedback for mmWave Massive MIMO Systems.”
Accurate Channel State Information Is Essential
As 5G and 6G wireless networks operate at higher frequencies, they employ technologies such as mmWave and massive multiple-input, multiple-output (mMIMO) components and systems. To expand capacity and coverage, such networks must maintain accurate CSI at the base station. Given the sheer number of signals being processed and antennas needed to transfer them, errors readily occur without sufficient precision.
By exploring a parametric CSI feedback technique, the researchers were able to compress complex mmWave MIMO channel matrices into compact representations described by far fewer parameters than antennas. The approach enables accurate, dependable signal connections at a 5G base station with just a few geometric channel parameters, such as signal angles, delays, and path gains (see image above).
Because the number of required channel parameters is far smaller than the number of antennas, CSI feedback overhead drops. The compression can be managed with a deep-learning technique capable of channel-parameter extraction and MIMO-channel reconstruction, even for large numbers of base-station antennas.
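To illustrate why a parametric representation is so compact, the sketch below builds a frequency-domain mmWave channel from a handful of per-path parameters (angle, delay, complex gain) and compares the feedback cost with sending the full channel matrix. The array geometry, dimensions, and function names are illustrative assumptions, not the researchers' actual implementation.

```python
import numpy as np

def steering_vector(angle_rad: float, n_antennas: int) -> np.ndarray:
    """Uniform-linear-array response for one arrival angle
    (half-wavelength element spacing assumed)."""
    k = np.arange(n_antennas)
    return np.exp(1j * np.pi * k * np.sin(angle_rad)) / np.sqrt(n_antennas)

def geometric_channel(angles, delays, gains, n_antennas, n_subcarriers):
    """Reconstruct a frequency-domain channel H (n_subcarriers x n_antennas)
    as a sum of a few propagation paths, each defined by angle/delay/gain."""
    H = np.zeros((n_subcarriers, n_antennas), dtype=complex)
    f = np.arange(n_subcarriers) / n_subcarriers  # normalized subcarrier freqs
    for angle, delay, gain in zip(angles, delays, gains):
        phase = np.exp(-2j * np.pi * f * delay)   # per-subcarrier delay response
        H += gain * np.outer(phase, steering_vector(angle, n_antennas))
    return H

rng = np.random.default_rng(0)
n_ant, n_sc, n_paths = 64, 128, 3  # mmWave channels have few dominant paths
angles = rng.uniform(-np.pi / 3, np.pi / 3, n_paths)
delays = rng.uniform(0, 1, n_paths)               # normalized delays
gains = (rng.normal(size=n_paths) + 1j * rng.normal(size=n_paths)) / np.sqrt(2)

H = geometric_channel(angles, delays, gains, n_ant, n_sc)

# Feedback cost: a few real values per path vs. the entire complex matrix.
params_fed_back = n_paths * 4        # angle, delay, Re(gain), Im(gain)
full_csi_values = H.size * 2         # real + imaginary part per matrix entry
print(f"parametric feedback: {params_fed_back} values")
print(f"full CSI feedback:   {full_csi_values} values")
```

With 3 paths and a 64-antenna array over 128 subcarriers, the parametric description needs 12 real values versus 16,384 for the full matrix, which is the overhead saving the deep-learning extractor makes usable in practice.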
Compared to conventional CSI feedback methods, the INU researchers’ parametric approach improves normalized-mean-square-error (NMSE) and bit-error-rate (BER) performance. Measurements of the novel CSI feedback approach reveal a 3.5-dB gain in BER performance over conventional CSI feedback methods. The technique provides improved data reliability even for rapidly moving users, supporting emerging applications such as vehicle-to-everything (V2X) communications and maritime networks.
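For context on the NMSE figure of merit, here is a minimal sketch of how NMSE between a true and a reconstructed channel is typically computed; the matrices and noise level below are made up for illustration and do not reproduce the paper's measurements.

```python
import numpy as np

def nmse_db(h_true: np.ndarray, h_est: np.ndarray) -> float:
    """Normalized mean-square error in dB: reconstruction error energy
    divided by the true channel's energy. More negative is better."""
    err = np.linalg.norm(h_true - h_est) ** 2
    return 10 * np.log10(err / np.linalg.norm(h_true) ** 2)

rng = np.random.default_rng(1)
# Toy "true" channel and a reconstruction with ~1% relative error energy.
h = rng.normal(size=(64, 128)) + 1j * rng.normal(size=(64, 128))
h_hat = h + 0.1 * (rng.normal(size=h.shape) + 1j * rng.normal(size=h.shape))
print(f"NMSE: {nmse_db(h, h_hat):.1f} dB")
```

With the 10% amplitude perturbation above, the NMSE lands near -20 dB; a feedback scheme with a lower (more negative) NMSE reconstructs the channel more faithfully.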
Leading the INU team, Professor Byungju Lee, currently an Associate Professor with INU’s Department of Information and Telecommunication Engineering (and lead author of the IEEE report), described the challenges posed by rising 5G frequencies: “To address the rapidly growing data demand in next-generation wireless networks, it is essential to leverage the abundant frequency resource in the mmWave bands. In mmWave systems, fast user movement makes this channel aging a real problem.”
He explained how the approach offers a solution: “Our method ensures precise beamforming, which allows signals to connect seamlessly with devices, even when users are in motion.” The novel CSI feedback technique enables uninterrupted wireless connections even for rapidly moving users.