I think there is something missing. If I understand correctly, the implementation of forward is wrong.
In the forward of the LSTM model, we see h_0 = torch.zeros(...) and c_0 = torch.zeros(...), which means the state is reset to the same zeros on every call, so no information about the sequence state is carried over between batches. Is there anything I am missing?