IsItDanOrAi opened 2 weeks ago
Without more information, I can't diagnose your problem. You can try updating LLM Party again; I recently removed a number of dependencies that were prone to installation errors.
Had a couple of users report problems with new installs, the actual issue being this:
Installing llama-cpp-python...
Looking in indexes: https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu124, https://pypi.ngc.nvidia.com
ERROR: Could not find a version that satisfies the requirement llama-cpp-python (from versions: none)
ERROR: No matching distribution found for llama-cpp-python
Comfy recently went from CUDA 12.1 to 12.4. I upgraded from 12.1 to 12.4 myself and so had no such issues, but after uninstalling llama-cpp-python and re-installing LLM Party, I was able to reproduce the issue (it looks like it's coming from llama_index?).
Note also that this error doesn't occur when you run pip install -r requirements.txt, but at ComfyUI startup, where it performs some requirements checks.
This doesn't stop LLM Party from loading, and I've not yet seen what breaks without it though :)
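The startup check described above is roughly of this shape (a simplified sketch of such a check, an assumption about its structure and not ComfyUI's actual code): read the lines of requirements.txt, see which packages are missing from the environment, and attempt to install them, which is the point where the pip resolution error above surfaces.

```python
# Simplified sketch of a startup requirements check (an assumption about
# its general shape, not ComfyUI's actual code).
from importlib import metadata


def missing_requirements(requirement_lines: list[str]) -> list[str]:
    """Return requirement names from requirements.txt lines that are not installed."""
    missing = []
    for line in requirement_lines:
        # Strip comments and version specifiers to get the bare package name.
        name = line.split("#")[0].strip()
        for sep in ("==", ">=", "<=", "~=", ">", "<"):
            name = name.split(sep)[0].strip()
        if not name:
            continue
        try:
            metadata.version(name)
        except metadata.PackageNotFoundError:
            missing.append(name)
    return missing
```

Anything this reports as missing would then be handed to pip, and if no matching wheel exists on the configured index, the "No matching distribution found" error above is raised.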
I've made this dependency optional. If it fails to install, only the LVM loader will fail; it has no effect on other nodes. The reason for the error is that this dependency has not been adapted to CUDA 12.4.
Just an update. I wanted to say thank you to both of you.
Heshangtao: the node seems to install perfectly now. I uninstalled LLM Party and manually installed via git pull, as was mentioned, and it seems to work perfectly. The changes definitely helped, as I had tried this before. Thank you for the updates. Sorry if my original post didn't provide enough information to help locate the issue, but I'm glad you were able to figure it out.
NerdyRodent: just wanted to say thank you for the clarification on the error, and for all your videos. Great work, and invaluable to the community.
Now, when installing llama-cpp-python, a request for the cu124 version will automatically fall back to installing the cu122 version. I tested it, and it should be solved. If there are no further problems, I will close this issue in three days.
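The fallback described above could be sketched like this (an assumption about the shape of the fix, not the extension's exact code; the set of published tags is hypothetical):

```python
# Sketch of the cu124 -> cu122 fallback described above (assumption,
# not the extension's exact code). If the requested CUDA tag has no
# published llama-cpp-python wheel on the index, substitute cu122.
PUBLISHED_TAGS = {"cu117", "cu121", "cu122"}  # hypothetical list; cu124 is absent


def resolve_wheel_tag(requested: str) -> str:
    """Return the requested tag if a wheel exists for it, else fall back to cu122."""
    return requested if requested in PUBLISHED_TAGS else "cu122"
```

The cu122 wheels are built against a CUDA 12.2 runtime that remains compatible with a 12.4 driver, which is why the substitution works as a stopgap.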
Updated ComfyUI and it broke LLM Party. Anyone have any idea what the cause is?
Python 3.11.8
Win10
Pytorch 2.3.0+cu121
Torchvision 0.18.0+121
xformers 0.0.26.post1
numpy 1.26.4
pillow - n/a
OpenCV 4.10
transformers 4.41.1
diffusers 0.30.1
Cuda 12.1