I updated gptq-for-llama to the latestmerge branch and KoboldAI to the latestgptq branch, and now KoboldAI gives me this error:
Traceback (most recent call last):
  File "aiserver.py", line 604, in <module>
    from modeling.inference_models.hf_torch_4bit import load_model_gptq_settings
  File "/mnt/Storage/KoboldAI/modeling/inference_models/hf_torch_4bit.py", line 37, in <module>
    from gptq.bigcode import load_quant as bigcode_load_quant
ModuleNotFoundError: No module named 'gptq.bigcode'
I also installed the latest hf_bleeding_edge.
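For reference, here is a minimal diagnostic sketch (assuming it is run with the same Python environment and working directory that KoboldAI uses) to show where the gptq package is being resolved from and whether the bigcode submodule exists in that checkout:

import importlib.util

# Where does Python find the 'gptq' package? This should point at the GPTQ-for-LLaMa checkout.
spec = importlib.util.find_spec("gptq")
print("gptq resolves to:", spec.origin if spec else "not found")

# If the parent package is found but bigcode.py is missing from it, this prints None,
# which matches the ModuleNotFoundError in the traceback above.
print("gptq.bigcode spec:", importlib.util.find_spec("gptq.bigcode"))

If gptq resolves to an older checkout or a different install than the branch I just pulled, that would explain the missing module.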