This is the simulation code for the paper "Beamforming Design for Large-Scale Antenna Arrays Using Deep Learning", published in IEEE Wireless Communications Letters.
IEEE link: https://ieeexplore.ieee.org/document/8847377/
Arxiv link: https://arxiv.org/abs/1904.03657
I recommend the preprint version on arXiv.
A Chinese-language blog post is also available on CSDN.
The code now supports TensorFlow 2.3.0; just run train_v2.py.
The main revision is that the batch_dot API behaves differently from TensorFlow 1 (see the sketch below).
(TensorFlow 1.12.0 is better for debugging, while TensorFlow 1.13.0 with CUDA 10 runs faster.)
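For illustration only, here is a minimal sketch of the kind of migration issue involved: a `K.batch_dot` call whose behavior can differ across Keras/TensorFlow versions, alongside an explicit `tf.einsum` that computes the same batched inner product in a version-independent way. The shapes and variable names are assumptions for the example, not taken from train_v2.py.

```python
import tensorflow as tf
from tensorflow.keras import backend as K

# Hypothetical batched inner product between two (batch, N) tensors.
x = tf.random.normal((32, 64))
y = tf.random.normal((32, 64))

# Keras batch_dot call; its behavior for some shapes/axes differs
# between TF 1.x and TF 2.x, which is what train_v2.py adapts to.
z_batch_dot = K.batch_dot(x, y, axes=1)   # typically shape (32, 1)

# Version-independent alternative: an explicit einsum over the feature axis.
z_einsum = tf.einsum('bn,bn->b', x, y)    # shape (32,)

print(z_batch_dot.shape, z_einsum.shape)
```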
If you are unsure how to keep several TensorFlow and CUDA versions on one machine, an easy guide (in Chinese) may help.
After forking the repo and downloading the corresponding datasets and trained models, the following performance results can be easily reproduced. (The Python code produces only the blue curves; the baseline curves should be plotted with the MATLAB code.)
At the request of some readers, a BaiduYun URL is also provided for users in China (password: z9un).
Some coding tricks are used to fit the Keras framework; for example, the loss function is written in an unusual way, which has been asked about many times and is explained in the issues (a sketch of the idea is given below).
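As a rough illustration of this kind of trick (not the exact loss used in this repo), a Keras loss can ignore `y_true` and simply minimize the negative of a quantity the model already computes, so that training maximizes it. The function name and the assumption that `y_pred` holds a per-sample gain are hypothetical.

```python
import tensorflow as tf

# Sketch of a Keras loss that ignores y_true and maximizes a
# model-computed quantity (e.g., a beamforming gain) by minimizing its negative.
def negative_gain_loss(y_true, y_pred):
    # y_pred is assumed to already contain the per-sample gain computed
    # inside the model; y_true is a dummy tensor required by the Keras API.
    return -tf.reduce_mean(y_pred)

# Usage sketch: compile with this loss and pass dummy labels when fitting, e.g.
# model.compile(optimizer='adam', loss=negative_gain_loss)
# model.fit(x_train, np.zeros((x_train.shape[0], 1)), epochs=10)
```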
At many readers' request, I have updated the MATLAB code for sample generation; please refer to gen_samples.m for details. The code is based on the work in [2].
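If you generate samples with gen_samples.m and want to load them in Python, a minimal sketch using scipy.io.loadmat is shown below. The file name 'samples.mat' and the variable key 'H' are placeholders; replace them with whatever gen_samples.m actually saves.

```python
import numpy as np
from scipy.io import loadmat

# Hypothetical example of loading MATLAB-generated samples into Python.
data = loadmat('samples.mat')
H = data['H']                      # e.g., complex channel samples
print(H.shape, H.dtype)

# Keras models expect real-valued inputs, so complex channels are
# often split into real and imaginary parts along a new axis.
H_real = np.stack([H.real, H.imag], axis=-1).astype(np.float32)
```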
Please email lint17@fudan.edu.cn for help.