In the notebook file Character_Level_RNN_Solution:
Step: Defining the network with PyTorch
The architecture of the network includes a two-layer LSTM; however, in the class CharRNN, the forward function has only one LSTM layer.
def forward(self, x, hidden):
    ''' Forward pass through the network.
        These inputs are x, and the hidden/cell state `hidden`. '''
    ## TODO: Get the outputs and the new hidden state from the lstm
    r_output, hidden = self.lstm(x, hidden)
    ## TODO: pass through a dropout layer
    out = self.dropout(r_output)
    # Stack up LSTM outputs using view
    # you may need to use contiguous to reshape the output
    out = out.contiguous().view(-1, self.n_hidden)
    ## TODO: put x through the fully-connected layer
    out = self.fc(out)
    # return the final output and the hidden state
    return out, hidden
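Side note on the reshape step, to show my understanding: out.contiguous().view(-1, self.n_hidden) flattens the batch and sequence dimensions so the fully-connected layer sees one row per time step. A quick shape check with made-up sizes:

import torch

n_hidden = 512
out = torch.randn(8, 100, n_hidden)          # (batch, seq_len, n_hidden) from the LSTM
flat = out.contiguous().view(-1, n_hidden)   # flatten to (batch * seq_len, n_hidden)
print(flat.shape)                            # torch.Size([800, 512])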
If I am correct, there should be two LSTM layers in the forward function; currently, there is only one here.
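Or am I misreading how nn.LSTM works? My understanding is that passing num_layers (n_layers=2 here) to the nn.LSTM constructor stacks both layers inside the single self.lstm module, so one self.lstm(x, hidden) call in forward would already run both layers. A minimal sketch of what I mean, with made-up sizes (the notebook's actual constructor arguments may differ):

import torch
from torch import nn

class TwoLayerSketch(nn.Module):
    def __init__(self, input_size=83, n_hidden=512, n_layers=2, drop_prob=0.5):
        super().__init__()
        # num_layers=2 builds a stacked two-layer LSTM inside one module
        self.lstm = nn.LSTM(input_size, n_hidden, n_layers,
                            dropout=drop_prob, batch_first=True)

    def forward(self, x, hidden):
        # one call runs the input through both stacked layers
        return self.lstm(x, hidden)

net = TwoLayerSketch()
x = torch.zeros(8, 100, 83)    # (batch, seq_len, input_size)
h = torch.zeros(2, 8, 512)     # (n_layers, batch, n_hidden): one slice per layer
c = torch.zeros(2, 8, 512)
out, (h_n, c_n) = net(x, (h, c))
print(h_n.shape)               # torch.Size([2, 8, 512])

If that is right, the single call would be intentional, but I would appreciate confirmation.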