Closed flxst closed 1 month ago
What does this PR do?
This PR fixes #203.
The problem was caused by a breaking change in torch 2.4 in the way `device_count` caching works; compare https://pytorch.org/docs/2.3/_modules/torch/cuda.html#device_count and https://pytorch.org/docs/2.4/_modules/torch/cuda.html#device_count

General Changes
none except for the above
Breaking Changes
none
Checklist before submitting final PR
`python tests/tests.py`
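For context, the caching change behind the bug can be sketched with simplified stand-ins. This is a hypothetical illustration, not torch internals: the linked sources suggest the 2.3 version cached the first result unconditionally via `lru_cache`, while the 2.4 version only caches once CUDA has been initialized, so a change to the visible devices before initialization is picked up in 2.4 but not in 2.3. All names and values below are invented for the sketch.

```python
from functools import lru_cache

_visible_devices = 2   # stand-in for the number of visible GPUs
_initialized = False   # stand-in for "has CUDA been initialized?"

# torch 2.3 style (sketch): the first result is cached unconditionally.
@lru_cache(maxsize=1)
def device_count_torch23():
    return _visible_devices

# torch 2.4 style (sketch): the result is cached only after initialization.
_cached_device_count = None

def device_count_torch24():
    global _cached_device_count
    if _cached_device_count is not None:
        return _cached_device_count
    count = _visible_devices
    if _initialized:
        _cached_device_count = count
    return count

print(device_count_torch23())  # 2
print(device_count_torch24())  # 2
_visible_devices = 4           # devices change before initialization
print(device_count_torch23())  # still 2: the stale value was cached
print(device_count_torch24())  # 4: nothing cached yet, change is picked up
```

Code that re-patched the device count between calls (as the tests here do) therefore sees different behavior across the two torch versions.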