siddBanPsu opened this issue 5 years ago
In the RNN code here: https://pytorch.org/tutorials/intermediate/char_rnn_classification_tutorial.html,
```python
self.i2h = nn.Linear(input_size + hidden_size, hidden_size)
self.i2o = nn.Linear(input_size + hidden_size, output_size)
```
there seems to be no activation function applied (tanh or ReLU), as is normally seen in RNNs. Why was it done this way in this example?
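For comparison, here is a minimal sketch of how the tutorial's `RNN` module could be modified to apply a `tanh` to the new hidden state, as a textbook Elman RNN does. The class structure follows the tutorial; the `torch.tanh` call is the hypothetical change being asked about, not something the tutorial itself contains:

```python
import torch
import torch.nn as nn

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Same two linear layers as in the tutorial
        self.i2h = nn.Linear(input_size + hidden_size, hidden_size)
        self.i2o = nn.Linear(input_size + hidden_size, output_size)
        self.softmax = nn.LogSoftmax(dim=1)

    def forward(self, input, hidden):
        combined = torch.cat((input, hidden), 1)
        # Hypothetical change: squash the new hidden state with tanh,
        # as a standard Elman RNN would. The tutorial omits this.
        hidden = torch.tanh(self.i2h(combined))
        output = self.softmax(self.i2o(combined))
        return output, hidden

    def init_hidden(self):
        return torch.zeros(1, self.hidden_size)
```

Without the nonlinearity, stacking the update over time steps composes only affine maps (plus the `cat` with the input), so the hidden-state transition itself cannot model nonlinear dynamics; the only nonlinearity in the tutorial's model is the final `LogSoftmax`.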