Closed: NanoCode012 closed this issue 1 year ago
I wonder if it's due to a push on the llama branch. Try `git checkout 68d640f7c368bcaaaecfc678f11908ebbd3d6176` in the transformers repo, then reinstall with `pip install -e .`.
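The two steps above can be consolidated into a single shell snippet. This is a minimal sketch, assuming transformers was installed from a local git clone living at `./transformers` (that path is an assumption, not from the thread):

```shell
# Pin the transformers checkout to the commit suggested above, then
# reinstall in editable mode. The ./transformers path is an assumption.
COMMIT=68d640f7c368bcaaaecfc678f11908ebbd3d6176
if [ -d transformers/.git ]; then
    cd transformers
    git checkout "$COMMIT"   # pin to the known-good revision
    pip install -e .         # reinstall the pinned checkout in editable mode
else
    echo "transformers clone not found; clone it before pinning" >&2
fi
```

Editable mode (`-e`) means the checked-out commit is used directly, so no further reinstall is needed if you later check out a different revision.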
Hello, thank you @AmericanPresidentJimmyCarter. This solved the issue.
Thanks, I have updated the README.
Hello, thank you for the repo.
I attempted to run the bot with the same prompt as the example (as well as others), but got the error below. I was wondering if you have any ideas?
Reproduce:
Additional context:
I tried cloning the latest https://github.com/qwopqwop200/GPTQ-for-LLaMa, copied the current `engine.py` from this repo, and edited the imports in `gptq.py` slightly to point at `quant.py`. The issue persists.