Cambridge-ICCS / FTorch

A library for directly calling PyTorch ML models from Fortran.
https://cambridge-iccs.github.io/FTorch/
MIT License

Operate under `Inference Mode` as well as No Grad #112

Open · jatkinson1000 opened 2 months ago

jatkinson1000 commented 2 months ago

Originally commented by @ElliottKasoar as part of #103

"Looking at the docs we'd maybe want to create an optional argument bool :: inference mode and if set to true enable c10::InferenceMode for the function as a block.

jatkinson1000 commented 2 months ago

Copying @ElliottKasoar's comment from #81

  1. InferenceMode
    • See: the InferenceMode docs, the autograd mechanics notes, and the PyTorch dev podcast
    • From our benchmarking (see the FTorch with InferenceMode and NoGradMode sections), the benefits were less clear, but in general InferenceMode is expected to be at least as fast as NoGradMode
    • Tests were carried out by replacing `torch::AutoGradMode enable_grad(requires_grad);` with `c10::InferenceMode guard(requires_grad);` in all ctorch.cpp functions, but ideally both options would be presented to users (see the sketch after this list)
    • This mode was only added (as a beta) in PyTorch 1.9, so we would need to consider support for older versions
    • The mode is also much stricter than NoGradMode, so it cannot be used in all cases (for example, tensors created inside an InferenceMode block cannot later take part in operations tracked by autograd)
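
If both options were exposed, one possible shape (the wrapper name, arguments, and version macros below are assumptions for illustration, not FTorch's current ctorch.cpp interface) is to choose the guard at runtime and keep the existing `torch::AutoGradMode` path as the fallback on older PyTorch versions:

```cpp
#include <torch/script.h>
#include <torch/version.h>  // assumed to provide TORCH_VERSION_MAJOR/MINOR

// Hypothetical wrapper: select the autograd regime at runtime.
// use_inference_mode takes precedence over requires_grad when true.
torch::Tensor run_forward(torch::jit::Module &model,
                          const torch::Tensor &input,
                          bool requires_grad,
                          bool use_inference_mode) {
#if (TORCH_VERSION_MAJOR > 1) || \
    (TORCH_VERSION_MAJOR == 1 && TORCH_VERSION_MINOR >= 9)
  if (use_inference_mode) {
    // Strictest option: no gradient tracking and no autograd bookkeeping.
    c10::InferenceMode guard;
    return model.forward({input}).toTensor();
  }
#endif
  // Existing behaviour: simply enable or disable gradient tracking.
  torch::AutoGradMode enable_grad(requires_grad);
  return model.forward({input}).toTensor();
}
```

Defaulting `use_inference_mode` to false would keep the current NoGrad/AutoGrad behaviour for existing callers while letting users opt in where the stricter mode is safe.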