aframires opened this issue 1 year ago
Thanks for letting us know. I tried your code on my side and it runs normally with the latest version (1.1.0). Could you give me more information, or could you try on CPU (not on WSL2) and see if you get the same error?
I'll try on another computer tomorrow and let you know. Meanwhile, I've tried setting `device = torch.device("cpu")` before the breaking line, and it didn't fix it. Any other relevant information I can provide in the meantime?
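For reference, this is roughly what I tried (just a sketch, my actual script is longer and follows the README usage):

```python
import torch
import laion_clap

device = torch.device("cpu")  # tried forcing CPU before the breaking line

model = laion_clap.CLAP_Module(enable_fusion=True)  # still raises the same error here
```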
I'm running version 1.1.0 as well, and my PyTorch version is 1.11.0+cu113.
I think the issue was related to a mismatch between the numba and numpy versions. This can be easily solved if you pin versions in requirements.txt, especially for numpy.
Hi, could you let us know the exact version constraints for numpy and numba? We will consider pinning the versions in the requirements, but different users might have different PyTorch versions, so it can be a bit hard.
When installing the requirements in a fresh virtual environment, pip resolves numba 0.56.4 and numpy 1.24.2. A clear message (which I had previously ignored) pops up during the installation:

```
ERROR: numba 0.56.4 has requirement numpy<1.24,>=1.18, but you'll have numpy 1.24.2 which is incompatible.
```
Installing any numpy version below 1.24 (I installed 1.23.5) should work. So if the requirements file has `numpy<1.24` instead of just `numpy`, it should work for everyone, both for fresh installs and for people with a Python environment already set up.
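For example, the numpy line in requirements.txt could look something like this (just a sketch; the bounds simply mirror what numba 0.56.4 declares in the error above):

```
# pin numpy to the range numba 0.56.4 supports (numpy<1.24,>=1.18)
numpy>=1.18,<1.24
```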
Hope it helps!
Thanks for the feedback! @RetroCirce could you please add that to the requirements? Many thanks!
Fixed!
Hi guys! I was wondering if it would be okay for me to submit a PR to update the lock on numpy? I'm trying to use ChromaDB to store my CLAP embeddings, but it has a dependency on a library that requires numpy>=1.24.2 (the last line of the output below):
```
#0 10.17 The conflict is caused by:
#0 10.17     The user requested numpy
#0 10.17     chroma-hnswlib 0.7.3 depends on numpy
#0 10.17     chromadb 0.4.15 depends on numpy>=1.22.5; python_version >= "3.8"
#0 10.17     h5py 3.10.0 depends on numpy>=1.17.3
#0 10.17     laion-clap 1.1.4 depends on numpy==1.23.5
#0 10.17     librosa 0.10.1 depends on numpy!=1.22.0, !=1.22.1, !=1.22.2 and >=1.20.3
#0 10.17     numba 0.58.1 depends on numpy<1.27 and >=1.22
#0 10.17     onnxruntime 1.16.1 depends on numpy>=1.24.2
```
The numba library added support for numpy 1.24 in version 0.57.0, and the release notes do not mention breaking changes: https://numba.readthedocs.io/en/stable/release-notes.html#version-0-57-0-1-may-2023
Maybe we could loosen the pin to numpy>=1.23.5, or just remove the locked version altogether?
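Concretely, I was imagining something like this in requirements.txt (just a sketch, the maintainers may prefer different bounds; the upper bound mirrors numba 0.58.1's declared support, and the range still lets pip pick a numpy new enough for onnxruntime's numpy>=1.24.2):

```
# sketch: loosen the pin instead of numpy==1.23.5
numpy>=1.23.5,<1.27
```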
I'm still pretty new to Python (I come from a Node background), so let me know if this package resolution would be okay.
Thank you!
Hi @uncvrd, thanks for the update! Can you (or anyone else, @RetroCirce @aframires) confirm that there is no longer a conflict between numpy and numba? If so, you can open a PR and I will approve it. Sorry, I don't have much bandwidth recently to check on my own.
Hello everyone, thanks for the great contribution that is CLAP!
I'm running into the following error when running CLAP:
This happens on line 8 (`model = laion_clap.CLAP_Module(enable_fusion=True)`) of my script. I've installed CLAP from the requirements, and it is running on WSL2.
I've tried to find similar errors on the issue list but could not find any. Sorry if this is a duplicate.
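For context, here is a minimal sketch of the relevant part of my script (the full script and traceback are longer):

```python
import laion_clap

# The error is raised when constructing the module (line 8 in my original script):
model = laion_clap.CLAP_Module(enable_fusion=True)
```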