f-dangel / backpack

BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient.
https://backpack.pt/
MIT License

add support for torch 2.0? #303

Closed yahui624 closed 1 year ago

yahui624 commented 1 year ago

Hi, right now installation is limited by the requirement torch<1.13.0,>=1.9.0 (from backpack-for-pytorch). Will you be adding support for torch 2.0 soon?

f-dangel commented 1 year ago

Hi,

thanks for your question. To the best of my knowledge, torch 2.0 is fully backward-compatible with the latest 1.x releases, but introduces a new torch.compile function. So in principle, BackPACK should support 2.0. The reason we do not officially support 2.0 yet is that we have not tested how torch.compile interacts with the forward and full backward hooks that BackPACK uses under the hood.
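For anyone who wants to check quickly, here is a minimal smoke test of BackPACK's hook machinery under torch 2.0, using the documented extend/backpack API; the toy model, loss, and random data are placeholders, not anything specific to this issue:

```python
# Minimal smoke test: per-sample gradients via BackPACK's BatchGrad extension.
# The toy model, loss, and random data below are placeholders.
import torch
from backpack import backpack, extend
from backpack.extensions import BatchGrad

model = extend(
    torch.nn.Sequential(
        torch.nn.Linear(10, 5),
        torch.nn.ReLU(),
        torch.nn.Linear(5, 1),
    )
)
lossfunc = extend(torch.nn.MSELoss())

X, y = torch.randn(8, 10), torch.randn(8, 1)
loss = lossfunc(model(X), y)

with backpack(BatchGrad()):
    loss.backward()  # BackPACK's backward hooks fire here

for name, param in model.named_parameters():
    # grad_batch holds one gradient per sample in the mini-batch
    print(name, param.grad_batch.shape)
```

If this runs and the printed shapes have the batch dimension in front, the hooks are working in your environment.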

If you run into issues while using BackPACK with torch 2.0, please let us know.

Best, Felix

yahui624 commented 1 year ago

Hi, I'm experimenting with my model on an NVIDIA H100 (sm_90), which requires building PyTorch from source. Inside that conda environment, when I run pip install backpack-for-pytorch, it automatically downgrades my torch version to 1.13. How can I stop that from happening?

f-dangel commented 1 year ago

Hi,

you can pip install backpack-for-pytorch first, then pip uninstall torch, and finally install PyTorch by building it from source locally.
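In commands, that sequence would look roughly like this (a sketch; the source-build step depends on your local checkout and toolchain):

```bash
# Install BackPACK first; pip will pull in an older torch as a dependency.
pip install backpack-for-pytorch

# Remove the pip-installed torch so it does not shadow your local build.
pip uninstall torch

# Then build and install PyTorch from source in the same environment,
# e.g. from inside a checkout of the pytorch repository:
# python setup.py develop
```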

yahui624 commented 1 year ago

awesome thanks!

hlzl commented 1 year ago

@f-dangel I can confirm from my testing that it works with torch 2.0.1. However, using pip to install backpack now automatically downgrades your torch version to 1.13.0, which is kind of annoying. It would thus be great if you could change the requirement to torch>=1.9.0 or torch<2.1.0,>=1.9.0 as an intermediate solution.
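As a local stopgap until the pin changes upstream (not something suggested in this thread, just a standard pip option), you can keep your existing torch by skipping dependency resolution:

```bash
# Keep the already-installed torch 2.0.x; pip will not resolve (and downgrade)
# BackPACK's torch dependency. Other dependencies must already be present
# or be installed separately.
pip install --no-deps backpack-for-pytorch
```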

f-dangel commented 1 year ago

Hi @hlzl,

if you install the latest version of BackPACK (1.6.0), it should work with torch>=2.0.
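For anyone landing here with the old pin installed, upgrading in place should be enough (assuming a standard pip environment):

```bash
# Pull in BackPACK 1.6.0, which allows torch >= 2.0
pip install --upgrade backpack-for-pytorch
```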

Best, Felix

hlzl commented 1 year ago

Ah sorry, seems like I downloaded it just before the bump to 1.6.0. Thanks for letting me know!