I apologize for the confusion, but I have moved the wheels to the releases section here: https://github.com/jllllll/bitsandbytes-windows-webui/releases/tag/wheels
The wheels there have been updated with `generate_bug_report_information()` overhauled for Windows compatibility.
I have now updated the wheels in the repo itself so that people still using them aren't stuck with an outdated wheel.
`generate_bug_report_information()` now uses `where /R` on Windows systems, which is largely equivalent to Unix's `find`.
It also searches the additional paths that I added to the Windows version: https://github.com/jllllll/bitsandbytes/commit/024acbac1477d03b4011fcb4b54f3221cc7727ce
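For illustration, this is roughly the kind of search `where /R` performs (the search root and file pattern here are examples of mine, not necessarily what the function uses):

```
:: Recursively search a directory tree for CUDA runtime DLLs on Windows
where /R "%CUDA_PATH%" cudart64*.dll
```

On Linux, something like `find /usr/local/cuda -name "libcudart*"` plays the same role.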
The reason for the change is that newer wheels are larger than the repo file limit of 100 megabytes.
Additionally, if you truly want to use CUDA 12.1, then you should install the nightly version of PyTorch:
python -m pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121
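Once that finishes, a quick check along these lines (just a sanity check, not something the nightly install requires) confirms that the cu121 build is the one in use:

```
python -c "import torch; print(torch.__version__, torch.version.cuda)"
```

The reported CUDA version should be 12.1 for the nightly cu121 wheel.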
My whl did have some problems; I was using the auto-gptq project. Thanks for the addition, and thank you for your efforts to make bitsandbytes compatible on Windows. I fully understand the inconvenience caused by the repository file-size restrictions.
I solved it. The general plan is as follows: because CUDA is 12.1, torch has to be installed first, and you need to make sure the torch install actually succeeded. I used the shell command below to check the torch install:
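Something along these lines works as the check (a minimal sketch; the exact command may differ):

```
python -c "import torch; print(torch.cuda.is_available())"
```

It needs to print True before moving on to bitsandbytes.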
For bitsandbytes, I used the shell command below.
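Roughly this (the wheel filename is only a placeholder here; take the real one for your Python version from the releases page linked above):

```
python -m pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/download/wheels/<wheel-for-your-python-version>.whl
```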
However, there is a problem with this package. Although it is a package built for Windows, there are some hard-coded paths in the code on the Windows platform, and the shell command below shows the error.
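Simply importing the package is usually enough to surface it (shown only as an illustration, not necessarily the exact command I ran):

```
python -c "import bitsandbytes"
```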
So, that is the problem.
I hope this helps anyone with the same question.