Closed: vinhqdang closed this issue 8 years ago
Hey, it's not necessary to convert to binary first; this project grew out of a proof of concept. It started with the standard RNN toy problem: training a (3-node) RNN on binary addition, i.e. learning how to carry a 1.
In sum, binary addition was the first application, and the conversion functions are still included for that example. Have a look at another example here:
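For reference, a minimal sketch of that binary-addition proof of concept, assuming the `int2bin` helper and the `trainr` interface from the released version of the package (parameter values here are illustrative, not the package defaults):

```r
library(rnn)

set.seed(1)

# Two integer operands and their sum; 8 bits covers sums up to 255
X1 <- sample(0:127, 5000, replace = TRUE)
X2 <- sample(0:127, 5000, replace = TRUE)
Y  <- X1 + X2

# Convert everything to binary: each number becomes a sequence of bits,
# so the network has to learn to propagate the carry across time steps
X <- array(c(int2bin(X1, length = 8), int2bin(X2, length = 8)),
           dim = c(length(X1), 8, 2))
Y <- int2bin(Y, length = 8)

model <- trainr(Y = Y, X = X,
                learningrate = 0.1,
                hidden_dim = 3,   # the "(3-node)" RNN mentioned above
                numepochs = 5)
```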
Hi, I think there is a problem with the example (no 'rnn' package). Am I right?
And could I set the number of layers in the RNN?
I am not sure what you mean by no 'rnn' package. Do you mean this:
Error in library(rnn) : there is no package called ‘rnn’
If so, you need to install the package, this is explained in the README at the bottom of the page.
Once you have installed the package and loaded it using library(rnn), have a look at the help page using:
help('trainr')
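Putting those steps together, something like the following should work (the GitHub repository path is a placeholder; see the README for the exact one):

```r
# From CRAN:
install.packages("rnn")

# Or the development version from GitHub (requires devtools);
# substitute the repository named in the README:
# devtools::install_github("<user>/rnn")

library(rnn)
help("trainr")
```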
Yes, I saw the parameter hidden_dim, which I think is the size of the RNN, but I am not sure whether it is the number of layers?
We are mixing two discussion threads. For running your example on Shiny, what should I do? I clicked on the center figure and it led me to http://shiny.qua.st:3838/finance/, and there is an error message, as you showed.
I installed rnn in my local R and it runs there, but how could I install it on your server? Sorry, I am not very familiar with Shiny.
Ah sorry, that must have been the update to R 3.3.0. I reinstalled rnn; check out the Shiny app again.
If you've installed the development (GitHub) version on your system, then you can use multiple layers via:
hidden_dim = c(8,5,3)
etc.
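In other words, assuming the development version's trainr accepts a vector for hidden_dim, a stacked-layer call would look something like this (the data here is random filler, just to show the shapes trainr expects):

```r
library(rnn)

set.seed(1)

# Hypothetical training data: X is (samples, time steps, features),
# Y is (samples, time steps) of binary targets
X <- array(runif(100 * 8 * 2), dim = c(100, 8, 2))
Y <- matrix(sample(0:1, 100 * 8, replace = TRUE), nrow = 100)

model <- trainr(Y = Y, X = X,
                learningrate = 0.05,
                hidden_dim = c(8, 5, 3),  # three stacked hidden layers
                numepochs = 5)
```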
Great
It means the RNN will have 3 layers, with sizes 8, 5 and 3 respectively, right?
That's right, it means the first hidden layer is of size 8, the second of size 5 and the third of size 3. Would you mind posting your question on Stackoverflow? That way others can also benefit from this.
Hi,
I submitted a question on StackExchange here:
http://stats.stackexchange.com/questions/218231/understanding-the-example-of-rnn-package-in-r
Hi,
I think this question is still open.
Hello
Could you give some explanation of what you did with the RNN? It seems to me that it is quite different from the RNN implemented in Lua (https://github.com/jcjohnson/torch-rnn): for instance, why do you need to convert everything to binary first?