idiap / fast-transformers

Pytorch library for fast transformer implementations

Got different result for the same batch #123

Closed gaoshan2006 closed 1 year ago

gaoshan2006 commented 1 year ago

Even after calling both "with torch.no_grad()" and "model.eval()", my model (fast-transformers) produces different outputs for the same batch input. Something like "y = model(x); z = model(x)" gives y and z that are not the same. If I want reproducible results at inference time, which fast-transformers API should I call before "y = model(x)"? Thanks

gaoshan2006 commented 1 year ago

I found a solution: the problem is resolved by setting deterministic_eval to True for the feature_map. Closing this issue. Thanks
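For anyone hitting the same behavior: the nondeterminism comes from random-feature attention maps (e.g. FAVOR-style feature maps), which redraw their random projection on each forward pass unless told to keep it fixed at eval time. The following is a toy NumPy sketch of that mechanism, not the library's actual implementation; the class name `RandomFeatureMap` and its arguments are made up for illustration, only the `deterministic_eval` flag mirrors the real option.

```python
import numpy as np

class RandomFeatureMap:
    """Toy stand-in for a random-feature attention map.

    With deterministic_eval=False, a fresh random projection is drawn on
    every forward pass, so two calls on the same input differ. With
    deterministic_eval=True, the projection drawn at construction time is
    reused, making outputs reproducible.
    """
    def __init__(self, dim, n_dims, deterministic_eval=False, seed=0):
        self.deterministic_eval = deterministic_eval
        self.rng = np.random.default_rng(seed)
        self.dim, self.n_dims = dim, n_dims
        self.omega = self.rng.standard_normal((dim, n_dims))  # fixed draw

    def __call__(self, x):
        if self.deterministic_eval:
            omega = self.omega  # reuse the fixed projection
        else:
            # redraw a new projection on every call (source of nondeterminism)
            omega = self.rng.standard_normal((self.dim, self.n_dims))
        return np.exp(x @ omega)  # positive random features

x = np.ones((2, 4))

fm = RandomFeatureMap(4, 8, deterministic_eval=False)
y, z = fm(x), fm(x)
print(np.allclose(y, z))  # False: features redrawn between calls

fm_det = RandomFeatureMap(4, 8, deterministic_eval=True)
y, z = fm_det(x), fm_det(x)
print(np.allclose(y, z))  # True: same projection reused
```

In fast-transformers itself, the flag is passed when building the feature map (something like `Favor.factory(n_dims=..., deterministic_eval=True)` handed to the builder's `feature_map` argument); check the library's feature_maps docs for the exact signature.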