Closed Abolfazl-Alipour closed 3 years ago
Hi!
Thank you for using EchoTorch. RRCell is only for ESN training based on matrix inversion, not SGD, so it is expected that there is no gradient attached to its output.
You can use ESNCell (which has no RRCell inside), add a linear layer (or any other layer), and pack this into a class inheriting from torch.nn.Module. Then train with SGD, but be aware that ESNs are very difficult to train with SGD.
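A minimal sketch of that suggestion, in plain PyTorch. The fixed random projection here is only a stand-in for echotorch's ESNCell (the class name `ESNWithReadout` and all dimensions are illustrative, not part of EchoTorch's API); the point is that a trainable `nn.Linear` readout inside an `nn.Module` lets autograd flow end to end:

```python
import torch
import torch.nn as nn

class ESNWithReadout(nn.Module):
    """Frozen reservoir-like layer + trainable linear readout (illustrative)."""
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        # Stand-in for ESNCell: a fixed random projection that is NOT trained.
        self.w_in = nn.Parameter(torch.randn(input_dim, hidden_dim) * 0.1,
                                 requires_grad=False)
        # Trainable readout replacing RRCell's closed-form ridge solution.
        self.readout = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):
        states = torch.tanh(x @ self.w_in)  # reservoir-style activations
        return self.readout(states)

model = ESNWithReadout(input_dim=3, hidden_dim=32, output_dim=1)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(16, 3), torch.randn(16, 1)
loss = nn.MSELoss()(model(x), y)
loss.backward()  # gradient now reaches the readout weights
opt.step()
```

Only the readout is updated here, since `w_in` has `requires_grad=False`; with a real ESNCell in place of `w_in` the same module structure applies.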
Regards
Hi guys, I am working with this nice package (thank you, Nils, for making it available). I want to add one FC layer to the network. However, when data is fed to the network, the output does not have a gradient attached to it, so it loses the information needed for backpropagation. I am able to incorporate ESNCell.py into my network and run backprop through it. However, if I use the ridge regression cell (RRCell.py), it generates an output that does not have a gradient, and therefore no backprop is possible on its output. I tried both ESN.py and LiESN.py; after calling the finalize method and then testing the network, the output has no gradient. For example:
```python
esn = etnn.LiESN(input_dim=input_dim, hidden_dim=n_hidden, output_dim=1,
                 spectral_radius=spectral_radius, learning_algo='inv',
                 leaky_rate=leaky_rate)
esn(inputs, targets)
esn.finalize()
y_predicted = esn(inputs)
```

and then `y_predicted` is just a tensor without any gradient attached to it.
any suggestions?
P.S.
The model I want to build is: fc1 ==> ESN ==> output
If I set requires_grad=True on either line 73 or line 149 of RRCell.py (that is, on w_out or on the output), I do get a gradient in the output, but I am not sure if it is a good idea to do so :|
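A hedged illustration of what that flag would do, using plain torch tensors as stand-ins (the ridge computation below only mimics what RRCell's finalize step does; the variable names are mine, not EchoTorch's). Flipping requires_grad on the ridge-computed w_out makes the output differentiable again, but it also turns w_out into an ordinary trainable leaf, so any subsequent SGD step would move it away from the closed-form ridge solution:

```python
import torch

# Reservoir states and targets (stand-ins; no grad needed upstream).
states = torch.randn(8, 5)
targets = torch.randn(8, 1)

# Closed-form ridge regression, computed outside autograd, like finalize():
ridge = 1e-2
w_out = torch.linalg.solve(states.T @ states + ridge * torch.eye(5),
                           states.T @ targets)

w_out.requires_grad_(True)      # the "line 73 / line 149"-style fix
y = states @ w_out              # output now carries a grad_fn
loss = ((y - targets) ** 2).mean()
loss.backward()                 # w_out now receives a gradient
```

So the flag "works" in the sense that gradients exist, but the graph that produced w_out is still absent, which is why the maintainer's suggestion above (a trainable linear readout inside an nn.Module) is the cleaner route.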