johnsmith0031 / alpaca_lora_4bit


module 'alpaca_lora_4bit.quant_cuda' has no attribute 'vecquant4recons_v2' #147

Closed: kevkid closed this issue 1 year ago

kevkid commented 1 year ago

Can't seem to run the trainer as in finetune.py. Do I need to do something special? Do I need gptq4llama installed?

module 'alpaca_lora_4bit.quant_cuda' has no attribute 'vecquant4recons_v2'

I see that there are .cpp files inside alpaca_lora_4bit.quant_cuda; how would I use/compile these?
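(Editor's note: a minimal diagnostic sketch, not from the original thread, that lists which vecquant kernels the installed alpaca_lora_4bit.quant_cuda extension actually exposes and checks for the one named in the traceback. Only the attribute name from the error message is assumed.)

```python
# Minimal diagnostic sketch: list the kernels the installed quant_cuda
# extension actually exposes, and check for the one named in the error.
import alpaca_lora_4bit.quant_cuda as quant_cuda

kernels = sorted(n for n in dir(quant_cuda) if n.startswith("vecquant"))
print("exposed kernels:", kernels or "none found")
print("has vecquant4recons_v2:", hasattr(quant_cuda, "vecquant4recons_v2"))
```

If the list is empty or missing vecquant4recons_v2, the installed build is older than the code calling it.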

johnsmith0031 commented 1 year ago

Maybe the CUDA kernel is not up to date. You can uninstall alpaca_lora_4bit first and then reinstall it.

kevkid commented 1 year ago

> Maybe the CUDA kernel is not up to date. You can uninstall alpaca_lora_4bit first and then reinstall it.

I uninstalled all packages in the conda environment running under WSL, then ran pip install -r requirements.txt in the repo's folder (I specifically removed wandb). I then proceeded to install the repo with pip install . Only the repo's requirements are installed. How can I ensure the CUDA kernel is reinstalled (or is it reinstalled via requirements.txt)?

[screenshot attached]
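(Editor's note: one hedged way to answer "how can I ensure the CUDA kernel is reinstalled", not part of the original exchange: print where Python loads the package and its compiled extension from; a path pointing at an old site-packages copy would explain the missing kernel.)

```python
# Sketch: show where the package and its compiled extension are loaded from,
# to confirm the reinstall actually replaced the old build.
import alpaca_lora_4bit
import alpaca_lora_4bit.quant_cuda as quant_cuda

print("package location:  ", alpaca_lora_4bit.__file__)
print("extension location:", getattr(quant_cuda, "__file__", "<no __file__>"))
```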

johnsmith0031 commented 1 year ago

Weird. Maybe you should try this:

python setup.py install

but it may conflict with the version that pip installed.
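(Editor's note: for context on what python setup.py install does here, a typical setup.py for this kind of package compiles the .cpp/.cu sources into the quant_cuda extension via PyTorch's build helpers. The sketch below is illustrative only; the extension name and source file names are assumptions, not copied from the repo.)

```python
# Illustrative setup.py sketch (assumed file names): roughly how the .cpp/.cu
# sources noticed above get compiled into a quant_cuda extension.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="quant_cuda",  # hypothetical name for this sketch
    ext_modules=[
        CUDAExtension(
            name="quant_cuda",
            sources=["quant_cuda.cpp", "quant_cuda_kernel.cu"],  # assumed sources
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```

Running python setup.py install in the repo rebuilds the extension against the local CUDA toolkit, which is why it can fix a kernel that an older pip-installed wheel is missing.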

kevkid commented 1 year ago

> Weird. Maybe you should try this:
>
> python setup.py install
>
> but it may conflict with the version that pip installed.

Thank you, this worked!