
DNN_detection_via_keras

This is the simplest implementation of "Power of Deep Learning for Channel Estimation and Signal Detection in OFDM Systems" using Keras. I tried my best to simplify the codes so that everyone can follow them easily. The original TensorFlow version of the codes can be found here. Compared with other frameworks (e.g., TensorFlow, PyTorch, MXNet, and so on), this Keras version is the simplest realization.

Some references

According to many readers' comments, I have written a simple blog post about this paper, which may help Chinese researchers understand its main idea. You can find it at the blog address.

First

Some common problems are answered in the issues; hopefully they can help you. Besides, if this work helps you, please kindly star or fork the repo to support me.

Requirement

tensorflow-gpu >= 1.12.0, as the codes were written before the release of TensorFlow 2.0.
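
A quick way to check which version you have installed (my own suggestion, not part of the repo):

```python
# Check the installed TensorFlow version before running main.py.
import tensorflow as tf

print(tf.__version__)  # the codes target the 1.x line, e.g. 1.12.0
```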

Data sets

I have uploaded the required data sets to BaiduYun Drive.

password: 1234

As some readers mentioned, I have also provided a download URL for Google Drive.

The .npy files are generated by saving the numpy arrays loaded from the originally provided .txt files.

Then, directly move channel_train.npy and channel_test.npy to the current directory, so that the paths are './channel_train.npy' and './channel_test.npy'.

The original datasets are provided at https://github.com/haoyye/OFDM_DNN as .txt files, which take a long time to load. Therefore, I saved enough samples as .npy files, so that the training sets can be loaded easily, which also reduces the file size.
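
For reference, the two .npy files can be loaded with numpy in a couple of lines; a minimal sketch, assuming the files sit next to main.py as described above:

```python
import numpy as np

# Load the channel samples saved from the originally provided .txt files.
# Paths as described above; the array shapes depend on the provided data sets.
channel_train = np.load('./channel_train.npy')
channel_test = np.load('./channel_test.npy')

print(channel_train.shape, channel_test.shape)
```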

How to use

After downloading and moving the data sets, just run main.py directly.

Some evaluation

Since this repo is just a reproduction, I follow the original idea of the authors: generate random bits, simulate the channel by loading data from the .npy files, and then build a neural network to recover the bits from the received signal.
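
For readers who want a quick picture of what such a detection network looks like in Keras, here is a minimal sketch; the layer sizes, loss, and training settings below are illustrative assumptions rather than the exact configuration used in main.py:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Illustrative dimensions only -- the actual values depend on the OFDM setting in main.py.
input_dim = 256   # length of one received-signal vector (real and imaginary parts stacked)
output_dim = 16   # number of bits recovered per sample

# A plain fully connected network: received signal in, soft bit estimates out.
model = Sequential([
    Dense(500, activation='relu', input_dim=input_dim),
    Dense(250, activation='relu'),
    Dense(120, activation='relu'),
    Dense(output_dim, activation='sigmoid'),  # each output lies in [0, 1]
])
model.compile(optimizer='adam', loss='mse')  # loss/optimizer choices are illustrative

# received_signal: (num_samples, input_dim), original_bits: (num_samples, output_dim)
# model.fit(received_signal, original_bits, epochs=100, batch_size=256, validation_split=0.1)
```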

I know some readers want to directly apply the detection neural network in place of their traditional receiver, for comparison and so on. This is easy to do with these codes. In brief, the code that generates the data is not needed. You can simply save the original bits and the received signal of your own system as a .mat file (if you use Matlab) or a .npy file. Then, load the data in Python and use the .fit function, where the original bits are the labels and the received signal is exactly the input of the network. You do not even need to simulate the channel (you already did that in your previous work, and only the received signal is required). A sketch of this workflow follows.
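
As a concrete illustration of the paragraph above, here is a minimal sketch assuming your Matlab data were saved to a (hypothetical) my_data.mat with variables named original_bits and received_signal:

```python
import numpy as np
from scipy.io import loadmat

# Hypothetical file and variable names -- replace them with the ones used by your own system.
data = loadmat('my_data.mat')
received_signal = np.asarray(data['received_signal'], dtype=np.float32)  # input of the network
original_bits = np.asarray(data['original_bits'], dtype=np.float32)      # labels

# For .npy files instead: received_signal = np.load('received_signal.npy'), and so on.

# 'model' is the detection network, e.g. the one built in main.py or in the sketch above.
# No channel simulation is needed: the received signal already went through your channel.
model.fit(received_signal, original_bits, epochs=100, batch_size=256, validation_split=0.1)
```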

Sorry for my English. If you have any problems, please contact me via email. Hopefully this is helpful for you, and if possible, please star or fork this repo to show your support.