Closed Leon-Sander closed 7 months ago
It would be useful to have more logs. You can set `verbose` to `True` in the `llama_cpp_kwargs`.
If that doesn't give you any insight, run the Python interpreter under `strace`.
You can post the logs here.
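A minimal sketch of what enabling verbose logging could look like; the model path and other kwargs below are placeholders, not values from this thread:

```python
# Hypothetical llama_cpp_kwargs dict for llama-cpp-python.
# "verbose": True makes the native llama.cpp library print its
# load/inference logs to stderr, which helps diagnose crashes.
llama_cpp_kwargs = {
    "model_path": "/models/bakllava-1.Q4_K_M.gguf",  # placeholder path
    "n_ctx": 2048,
    "verbose": True,  # emit detailed llama.cpp logs
}

# These kwargs would then be passed as Llama(**llama_cpp_kwargs).
```

If the Python-side logs are not enough, `strace -f python your_script.py 2> strace.log` shows which system call (often a failed file open) precedes the abort.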
I found the error: I had set the clip model path to just the file name as well, assuming the lookup would happen there too.
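A small sketch of how to guard against this class of crash: validating the path on the Python side before handing it to the native library surfaces a readable error instead of an opaque `Aborted (core dumped)`. The helper name and paths are illustrative, not part of llama-cpp-python:

```python
from pathlib import Path


def checked_model_path(path_str: str) -> Path:
    """Return an absolute Path to a model file, failing early if it is missing."""
    p = Path(path_str).expanduser().resolve()
    if not p.is_file():
        # Raising here keeps the error in Python instead of a native abort.
        raise FileNotFoundError(f"model file not found: {p}")
    return p


# Usage sketch: pass the CLIP projector as a full path, not a bare name.
# clip_model_path = checked_model_path("/models/mmproj-model-f16.gguf")
```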
I followed the instructions and tried to run the code presented here: https://advanced-stack.com/resources/multi-modalities-inference-using-mistral-ai-llava-bakllava-and-llama-cpp.html
I am encountering the error: `Aborted (core dumped)`
Any ideas?