Deep learning for mmWave and THz beamforming applications
Millimeter-wave (mmWave) massive MIMO communications and radar systems employ hybrid analog-digital beamforming architectures to reduce cost, power consumption, size, and hardware complexity. Lately, there has also been a gradual push from mmWave toward terahertz (THz) frequencies for short-range communications and radar applications, in order to exploit the very wide bandwidths available at THz. At THz, ultra-massive MIMO is an enabling technology that exploits even wider bandwidths while employing thousands of antennas. Designing hybrid beamforming techniques requires solving difficult nonconvex optimization problems that combine a common performance metric as the cost function with several constraints imposed by the communication regime and the architecture of the hybrid system. There is no standard methodology for solving such problems, and deriving an efficient solution is usually very challenging. Since optimization-based approaches suffer from high computational complexity and their performance relies strongly on perfect channel state information, we introduce deep learning (DL) techniques that provide robust performance in hybrid beamformer design. In this lecture, the audience will learn how DL applies to various aspects of hybrid beamforming, including channel estimation, antenna selection, wideband beamforming, and spatial modulation. In addition, we will examine these concepts in the context of joint radar-communications architectures.
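To make the nonconvex structure of the hybrid design problem concrete, the sketch below (a minimal NumPy illustration; the array sizes and the simple phase-extraction heuristic are illustrative assumptions, not the methods covered in the lecture) approximates the optimal fully digital precoder of a random narrowband channel with a unit-modulus analog precoder and a small digital baseband precoder. The unit-modulus entries of the analog precoder, realized by phase shifters, are exactly the constraint that makes the problem nonconvex.

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, Nrf, Ns = 32, 4, 2   # transmit antennas, RF chains, data streams (illustrative)

# Random narrowband channel; the SVD's right singular vectors give the
# optimal unconstrained (fully digital) precoder F_opt.
H = (rng.standard_normal((8, Nt)) + 1j * rng.standard_normal((8, Nt))) / np.sqrt(2)
_, _, Vh = np.linalg.svd(H)
V = Vh.conj().T
F_opt = V[:, :Ns]                                  # Nt x Ns, orthonormal columns

# Analog precoder: phase-shifter network, so every entry must have the same
# modulus (the nonconvex constraint). A simple heuristic copies the phases
# of the leading singular vectors.
F_rf = np.exp(1j * np.angle(V[:, :Nrf])) / np.sqrt(Nt)   # Nt x Nrf

# Digital baseband precoder: unconstrained least-squares fit, then power
# normalization so the hybrid precoder matches ||F_opt||_F^2 = Ns.
F_bb = np.linalg.pinv(F_rf) @ F_opt                # Nrf x Ns
F_bb *= np.sqrt(Ns) / np.linalg.norm(F_rf @ F_bb, "fro")

# Relative approximation error of the hybrid factorization.
err = np.linalg.norm(F_opt - F_rf @ F_bb, "fro") ** 2 / Ns
print(f"relative approximation error: {err:.3f}")
```

Optimization-based hybrid designs iterate refinements of exactly this factorization, which is where the high computational complexity arises; DL-based approaches instead learn a direct mapping from channel estimates to the precoders.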