yoheinakajima / babyagi

MIT License

problem with local Wizard-Vicuna-13B-Uncensored #316

Closed aseok closed 1 year ago

aseok commented 1 year ago

Hi, I'm facing the following error with a local Wizard-Vicuna-13B-Uncensored.ggml.q4_1 model:

```
error loading model: llama.cpp: tensor 'layers.27.attention_norm.weight' is missing from model
llama_init_from_file: failed to load model
Traceback (most recent call last):
  File "babyagi.py", line 118, in <module>
    llm = Llama(
  File ".../.local/lib/python3.8/site-packages/llama_cpp/llama.py", line 161, in __init__
    assert self.ctx is not None
AssertionError
```
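For context, here is a minimal sketch of the kind of llama-cpp-python initialization that fails above. The model path, `n_ctx` value, and pre-flight file check are illustrative assumptions, not babyagi's actual settings; the point is that the `AssertionError` comes from `Llama` when llama.cpp cannot build a context from the model file (often a truncated download or an incompatible model format).

```python
# Hypothetical sketch; filename and n_ctx are assumptions, not babyagi defaults.
import os
from llama_cpp import Llama

model_path = "models/Wizard-Vicuna-13B-Uncensored.ggml.q4_1.bin"  # illustrative path

# Checking the file up front gives a clearer error than the bare
# AssertionError raised when Llama's internal context (self.ctx) is None.
if not os.path.isfile(model_path):
    raise FileNotFoundError(f"Model file not found: {model_path}")

try:
    llm = Llama(model_path=model_path, n_ctx=2048)
except Exception as exc:
    # A missing tensor such as 'layers.27.attention_norm.weight' usually
    # indicates a corrupted/partial download or a GGML file that does not
    # match the installed llama-cpp-python / llama.cpp version.
    raise RuntimeError(f"Failed to load GGML model at {model_path}") from exc
```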

aseok commented 1 year ago

It seems a reboot solved it.