oobabooga / one-click-installers

Simplified installers for oobabooga/text-generation-webui.
GNU Affero General Public License v3.0

Add check for compute support for GPTQ-for-LLaMa #104

Closed · jllllll closed this 1 year ago

jllllll commented 1 year ago

Your fork requires a minimum compute capability of 6.0. The installer will now install GPTQ-for-LLaMa from the main CUDA repo if the fork is not supported.

Also removed the cuBLAS llama-cpp-python installation in preparation for https://github.com/oobabooga/text-generation-webui/commit/4b19b74e6c8d9c99634e16774d3ebcb618ba7a18.
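For reference, the capability check can be done with PyTorch's device query. The sketch below is illustrative only, not the actual installer code; the function name and repository URLs are assumptions based on the comment above.

```python
# Minimal sketch of a compute-capability check, assuming torch is already
# installed in the environment (as the one-click installer ensures).
# The function name and repo URLs are illustrative placeholders.
import torch

def choose_gptq_repo():
    """Pick the GPTQ-for-LLaMa source based on GPU compute capability."""
    if not torch.cuda.is_available():
        return None  # CPU-only setup; skip GPTQ-for-LLaMa entirely
    major, minor = torch.cuda.get_device_capability(0)
    compute = major + minor / 10
    if compute >= 6.0:
        # The fork's kernels require compute capability >= 6.0
        return "https://github.com/oobabooga/GPTQ-for-LLaMa"
    # Fall back to the main CUDA repo for older GPUs
    return "https://github.com/qwopqwop200/GPTQ-for-LLaMa"
```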

oobabooga commented 1 year ago

Your fork requires a minimum compute capability of 6.0. The installer will now install GPTQ-for-LLaMa from the main CUDA repo if the fork is not supported.

I wasn't aware of that, well done.

I have merged the new requirements.txt into main and will now merge this update as well.