joeyballentine closed this 4 months ago
Can @torch.inference_mode() and model.eval() negatively affect performance if the model is already under inference mode?
I haven't tested it, but I don't believe so.
For the record, I'm pretty sure we call that multiple times in chaiNNer. And inference_mode is meant to be applied individually each time the model is run. Check the docs.
Generally speaking, it's always good to put a model in inference mode when performing inference. I figure it's probably good to do this automatically when using the call API to prevent possible problems.
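For what it's worth, here's a quick sketch (assuming PyTorch is installed) showing that both calls are idempotent: calling model.eval() repeatedly is a no-op once the model is already in eval mode, and entering torch.inference_mode() inside an outer inference_mode context is allowed, so doing this defensively in a call API shouldn't hurt.

```python
# Sketch: nesting inference_mode and repeating eval() is safe.
# The model and shapes here are just placeholders for illustration.
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
model.eval()  # first call: switches to eval mode
model.eval()  # repeated call: no-op, model stays in eval mode

@torch.inference_mode()
def run(model, x):
    # Re-entering inference_mode inside an outer inference_mode
    # context is permitted; it just keeps the mode enabled.
    with torch.inference_mode():
        return model(x)

with torch.inference_mode():
    out = run(model, torch.randn(1, 4))

print(out.shape)                # torch.Size([1, 2])
print(model.training)          # False (still in eval mode)
print(torch.is_inference(out))  # True (produced under inference mode)
```

So even when the caller has already set things up, repeating both calls inside the library should be harmless.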
Could theoretically be related to #160, but I think they are doing the right things there, so I don't think that's it.