I am training an RNN with a Sequencer, but at test time I would like to feed my data sequentially, so I remove the Sequencer layer and forward each sample individually in a loop. The problem is that the results differ depending on whether the same tensor is sent through the network wrapped in the Sequencer or not. Both networks are in evaluation mode and the data is identical. Can someone explain this behaviour?
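To illustrate, here is a minimal sketch of the two setups I mean, using the Element-Research `rnn` package API (the module and sizes are illustrative, not my exact network):

```lua
require 'rnn'

local rnn = nn.LSTM(10, 10)  -- illustrative recurrent module

-- Setup A: wrapped in a Sequencer, forwarding the whole sequence at once
local seq = nn.Sequencer(rnn)
seq:evaluate()
local outA = seq:forward(inputs)  -- inputs = {x1, x2, ..., xT}

-- Setup B: no Sequencer, forwarding each timestep individually
rnn:evaluate()
rnn:forget()  -- reset the hidden state before starting the sequence
local outB = {}
for t = 1, #inputs do
  -- clone because the module may reuse its output buffer between calls
  outB[t] = rnn:forward(inputs[t]):clone()
end
```

I would expect `outA` and `outB` to match element-wise, but they don't.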
Thanks !