rentruewang / koila

Prevent PyTorch's `CUDA error: out of memory` in just 1 line of code.
https://rentruewang.com/koila/
Apache License 2.0

Compatibility with Python 3.7? #7

Open nbroad1881 opened 2 years ago

nbroad1881 commented 2 years ago

I'm wondering what makes it incompatible with Python 3.7, because that's the Python version I'm using and I can't upgrade to 3.8.

rentruewang commented 2 years ago

Personally I use the walrus operator `:=` a lot, and it's only supported in 3.8 or above. I guess I just wasn't using 3.7 locally when developing this project.
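
For reference, walrus expressions can usually be rewritten for 3.7 by pulling the assignment out of the condition. A generic sketch (the `stream` / `process` names are made up for illustration, not anything from koila):

```python
import io

def process(chunk: bytes) -> None:
    print(len(chunk))

stream = io.BytesIO(b"x" * 3000)

# Python 3.8+ form, using the walrus operator:
#
#     while (chunk := stream.read(1024)):
#         process(chunk)
#
# Python 3.7 equivalent: bind before the loop and re-bind at the end.
chunk = stream.read(1024)
while chunk:
    process(chunk)
    chunk = stream.read(1024)
```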

However, supporting 3.7 would be a great idea, as PyTorch itself supports 3.6, 3.7, and 3.8. I'll work on it when I've got time.

nbroad1881 commented 2 years ago

Is there anything besides the walrus operator that makes it incompatible with 3.7?

rentruewang commented 2 years ago

There is a `@final` decorator that isn't available in 3.7. Other than that and the walrus operator, it should be compatible with 3.7.
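
One common pattern would be to fall back to the `typing_extensions` backport on 3.7 (which would add it as a dependency there). A minimal sketch, with a placeholder class rather than koila's actual code:

```python
import sys

# `typing.final` only exists on 3.8+; on 3.7 the `typing_extensions`
# backport package provides the same decorator.
if sys.version_info >= (3, 8):
    from typing import final
else:
    from typing_extensions import final

@final
class RunnerStub:  # placeholder class, purely for illustration
    pass
```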

nbroad1881 commented 2 years ago

`Protocol` in `typing` is also 3.8+.
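
The same workaround should apply there; a quick sketch (the protocol name is made up, not from koila):

```python
import sys

# `typing.Protocol` was added in 3.8; `typing_extensions` backports it to 3.7.
if sys.version_info >= (3, 8):
    from typing import Protocol
else:
    from typing_extensions import Protocol

class SupportsSize(Protocol):  # made-up protocol, just to illustrate the import
    def size(self) -> tuple:
        ...
```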

rentruewang commented 2 years ago

Oh, I didn't know that. Thanks!

nielsrolf commented 2 years ago

This would be great. Colab also runs on Python 3.7, and I imagine many people would like to use koila there.

lukeleeai commented 2 years ago

I agree. As someone who uses Colab / Kaggle notebooks, it's sad that I can't use Koila because they run Python 3.7 :( Making it compatible with 3.7 would likely bring more users to your package. I just posted an issue that pip can't find Koila on Colab / Kaggle; is that because of this version incompatibility?