DiffEqML / torchdyn

A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods
https://torchdyn.org
Apache License 2.0

Tutorials on Colab #77

Closed · pharringtonp19 closed this issue 3 years ago

pharringtonp19 commented 3 years ago

Installing torchdyn currently downgrades PyTorch to 1.7.1, which creates a version incompatibility with torchtext on Colab.

For those who might face a similar problem, the fix is to add the following line to the install cell:

!pip install torchtext==0.8.1
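For reference, a full Colab install cell with this workaround might look as follows (a sketch; the exact torchtext pin depends on which torch version actually ends up installed):

!pip install torchdyn            # currently downgrades torch to 1.7.1
!pip install torchtext==0.8.1    # torchtext release that matches torch 1.7.1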

Zymrael commented 3 years ago

Thanks for pointing this out: apparently torchvision also requires 1.8.1. The issue appears linked to our torchsde dependency, which pins torch to <1.8.0. Pinging @lxuechen @patrick-kidger.
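For illustration only, a dependency bound of roughly this shape in a package's setup.py is what forces pip to downgrade torch at install time (a sketch with placeholder names, not torchsde's actual metadata):

# hypothetical setup.py fragment showing a "<1.8.0" clamp on torch
from setuptools import setup

setup(
    name="some-package",           # placeholder, not torchsde
    install_requires=[
        "torch>=1.6.0,<1.8.0",     # an upper bound below 1.8.0 makes pip pull torch 1.7.x
    ],
)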

patrick-kidger commented 3 years ago

This is something I've been wanting to fix too.

@lxuechen, thoughts? I can see your comment on the matter here, but I can't reproduce the issue. Are you still able to repro it?

I can run pytest test_adjoint.py on my machine without issues, specifically with python=3.8.8, pytorch=1.8.0, cudatoolkit=11.1.74, and the master version of torchsde. Full environment below.

# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                        main  
attrs                     20.3.0                   pypi_0    pypi
blas                      1.0                         mkl  
boltons                   20.2.1                   pypi_0    pypi
ca-certificates           2021.4.13            h06a4308_1  
certifi                   2020.12.5        py38h06a4308_0  
cudatoolkit               11.1.74              h6bb024c_0    nvidia
iniconfig                 1.1.1                    pypi_0    pypi
intel-openmp              2021.2.0           h06a4308_610  
ld_impl_linux-64          2.33.1               h53a641e_7  
libffi                    3.3                  he6710b0_2  
libgcc-ng                 9.1.0                hdf63c60_0  
libstdcxx-ng              9.1.0                hdf63c60_0  
libuv                     1.40.0               h7b6447c_0  
mkl                       2021.2.0           h06a4308_296  
mkl-service               2.3.0            py38h27cfd23_1  
mkl_fft                   1.3.0            py38h42c9631_2  
mkl_random                1.2.1            py38ha9443f7_2  
ncurses                   6.2                  he6710b0_1  
ninja                     1.10.2               hff7bd54_1  
numpy                     1.19.5                   pypi_0    pypi
openssl                   1.1.1k               h27cfd23_0  
packaging                 20.9                     pypi_0    pypi
pip                       21.0.1           py38h06a4308_0  
pluggy                    0.13.1                   pypi_0    pypi
py                        1.10.0                   pypi_0    pypi
pyparsing                 2.4.7                    pypi_0    pypi
pytest                    6.2.3                    pypi_0    pypi
python                    3.8.8                hdb3f193_5  
pytorch                   1.8.0           py3.8_cuda11.1_cudnn8.0.5_0    pytorch
readline                  8.1                  h27cfd23_0  
scipy                     1.5.4                    pypi_0    pypi
setuptools                52.0.0           py38h06a4308_0  
six                       1.15.0           py38h06a4308_0  
sqlite                    3.35.4               hdfb4753_0  
tk                        8.6.10               hbc83047_0  
toml                      0.10.2                   pypi_0    pypi
torchsde                  0.2.5                     dev_0    <develop>
trampoline                0.1.2                    pypi_0    pypi
typing_extensions         3.7.4.3            pyha847dfd_0  
wheel                     0.36.2             pyhd3eb1b0_0  
xz                        5.2.5                h7b6447c_0  
zlib                      1.2.11               h7b6447c_3
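
For a quick sanity check that a runtime matches the environment above, something like the following can be run (a sketch; it assumes torchsde exposes __version__):

import torch
import torchsde

print(torch.__version__)      # expected: 1.8.0
print(torch.version.cuda)     # expected: 11.1
print(torchsde.__version__)   # expected: 0.2.5 (dev install from master)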

lxuechen commented 3 years ago

Thanks for the heads-up. I'll look into this tonight.

Zymrael commented 3 years ago

Everything works as intended now, thanks!