abdeladim-s / pyllamacpp

Python bindings for llama.cpp
https://abdeladim-s.github.io/pyllamacpp/
MIT License

Process finished with exit code 139 (interrupted by signal 11: SIGSEGV) #8

Open · tgreenwood opened this issue 1 year ago

tgreenwood commented 1 year ago

I am not sure exactly where the issue comes from (either from the model or from pyllamacpp), so I also opened https://github.com/nomic-ai/gpt4all/issues/529

I tried with GPT4All models (for instance, https://huggingface.co/nomic-ai/gpt4all-13b-snoozy).

I am able to run this model as well as lighter models, but after about 2-4 prompts (while the model is generating an answer) the process fails with "Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)". If I provide the maximum allowed prompt (±4000 tokens), it fails on the very first request to generate a response. The behavior is the same for all GPT4All models I downloaded 2-3 days ago.

I am running on a MacBook Pro M1 (2021) with 16 GB RAM. I tried Python versions from 3.9 to 3.11, and ran it from Jupyter Lab (3.10 kernel), PyCharm, and the terminal; the result is always the same. pyllamacpp is version 2.1.3. Any ideas where the problem may come from?
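For reference, here is roughly how I call the model (a minimal sketch of my script; the model path and the exact keyword arguments such as `n_ctx`, `n_predict` and `n_threads` reflect my setup and may differ between pyllamacpp versions):

```python
from pyllamacpp.model import Model

# Converted GPT4All / llama.cpp model file (path is from my setup)
model = Model(model_path="./models/gpt4all-13b-snoozy.bin", n_ctx=512)

prompts = [
    "What is the capital of France?",
    "Summarize the plot of Moby-Dick in two sentences.",
    "Explain how a binary search works.",
    "Write a haiku about mountains.",
]

# After roughly 2-4 prompts like these, the process dies with SIGSEGV
for prompt in prompts:
    for token in model.generate(prompt, n_predict=128, n_threads=8):
        print(token, end="", flush=True)
    print()
```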

I traced the calls and found the exact line where it fails:

```
call, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:225
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:226
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:227
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:230
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:183
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:184
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:185
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:186
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:187
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:188
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:189
line, /opt/homebrew/anaconda3/envs/gpt4all-converted_conda/lib/python3.10/site-packages/pyllamacpp/model.py:185

Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)
```

As far as I can tell, this is inside the generate method, at the call into the C code: `pp.llama_eval(self._ctx, predicted_tokens, len(predicted_tokens), self._n_past, n_threads)`.
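My best guess is that the context window overflows: once `self._n_past` plus the newly predicted tokens exceed `n_ctx`, `llama_eval` writes past the end of the context buffer and the process crashes with SIGSEGV instead of raising a Python error. A guard like the following before that call would confirm it (only a sketch; `_n_ctx` is a hypothetical attribute name, I have not checked where pyllamacpp actually stores the context size):

```python
# Hypothetical guard before the llama_eval call in generate().
# _n_ctx, _n_past and predicted_tokens mirror the names visible in model.py,
# but this only illustrates the suspected overflow, it is not a patch.
if self._n_past + len(predicted_tokens) > self._n_ctx:
    raise RuntimeError(
        f"context window exceeded: n_past={self._n_past} + "
        f"{len(predicted_tokens)} new tokens > n_ctx={self._n_ctx}"
    )
pp.llama_eval(self._ctx, predicted_tokens, len(predicted_tokens),
              self._n_past, n_threads)
```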