tambetm / matlab2048

2048 playing agent using deep Q-learning in Matlab.

What is NN structure? #1

Closed JBMing closed 8 years ago

JBMing commented 8 years ago

I didn't find information about the neural network, such as the number of layers and the number of neurons in each layer. Could you explain this explicitly in the README?

tambetm commented 8 years ago

Currently the NN has only fully connected layers. The sizes of the hidden layers are passed to the NNAgent constructor as the opts.layers parameter. For example, opts.layers = [1000]; creates one hidden layer with 1000 nodes, and opts.layers = [256 256]; creates two hidden layers with 256 nodes each. The README seems outdated, but example.m should be OK. Will fix it ASAP.
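For illustration, a minimal sketch of how such an options struct might look; the exact NNAgent constructor arguments are an assumption here, so see example.m in the repository for the authoritative usage:

```matlab
% Sketch: configure the hidden layer sizes for the agent.
% Only opts.layers is described above; any other fields are assumptions.
opts = struct();
opts.layers = [256 256];   % two fully connected hidden layers, 256 nodes each
% agent = NNAgent(opts);   % hypothetical call, check example.m for the real parameters
```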

JBMing commented 8 years ago

Do you mean there are only fully connected layers and no convolutional layers? Besides, I am confused by the README and don't know how to execute the code. Do I need to install some other environment?

tambetm commented 8 years ago

You can have several fully connected layers by including several numbers in the list, e.g. opts.layers = [256 256]. But there are no convolutional layers.

I would suggest checking out the example code in https://github.com/tambetm/matlab2048/blob/master/example.m, that should work.

You need to check out DeepLearnToolbox from GitHub as suggested in the README. No other packages are needed.
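As a rough setup sketch (the clone location is an assumption, adjust it to wherever you put DeepLearnToolbox):

```matlab
% Sketch: make DeepLearnToolbox visible to MATLAB before running the example.
% '../DeepLearnToolbox' is an assumed path; point it at your actual clone.
addpath(genpath('../DeepLearnToolbox'));
run('example.m');
```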

JBMing commented 8 years ago

I am trying to modify the weight-update algorithm, but I have a question about the code. Where is the code that implements backpropagation and gradient descent? Could you tell me?

tambetm commented 8 years ago

Backpropagation and gradient descent are implemented in DeepLearnToolbox. I'm using my own fork, which has some minor modifications that aren't relevant to this project, AFAIK. The forward pass, backpropagation, and gradient updates are implemented in NN/nnff.m, NN/nnbp.m and NN/nnapplygrads.m respectively.
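To show where a custom weight-update rule would plug in, here is a rough sketch of how a single DeepLearnToolbox training step chains those three files together; the layer sizes and data below are placeholder assumptions, not values from this project:

```matlab
% Sketch of one DeepLearnToolbox training step (sizes and data are placeholder assumptions).
nn = nnsetup([16 256 256 4]);        % e.g. 16 board inputs, two hidden layers, 4 action outputs
batch_x = rand(32, 16);              % dummy minibatch of inputs
batch_y = rand(32, 4);               % dummy targets
nn = nnff(nn, batch_x, batch_y);     % NN/nnff.m: forward pass, computes activations and loss
nn = nnbp(nn);                       % NN/nnbp.m: backpropagation, computes the gradients
nn = nnapplygrads(nn);               % NN/nnapplygrads.m: gradient descent weight update
```

Modifying the weight update would mean changing NN/nnapplygrads.m (or replacing that call) while keeping the forward and backward passes intact.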