advanced-stack / py-llm-core

A Pythonic library providing a lightweight interface to LLMs
MIT License

Bug: Aborted (core dumped) with llava #4

Closed: Leon-Sander closed this issue 7 months ago

Leon-Sander commented 7 months ago

I followed the instructions and tried to run the code presented here: https://advanced-stack.com/resources/multi-modalities-inference-using-mistral-ai-llava-bakllava-and-llama-cpp.html

I am encountering the error: Aborted (core dumped)

Any ideas?

paschembri commented 7 months ago

It would be useful to have more logs. You can set `verbose` to `True` in the `llama_cpp_kwargs`.

If that doesn't give you any insight, run the Python interpreter under strace.

You can post the logs here.
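For reference, a minimal sketch of turning on that logging, assuming the model is constructed the way the linked article does (the `LLaVACPPModel` name, the model file name, and the constructor arguments are taken from that article and are assumptions, not a verified py-llm-core API for every version):

```python
# Sketch: enable llama.cpp's own logging so the crash site becomes visible.
# Class name and constructor signature follow the linked advanced-stack
# article and may differ in your installed py-llm-core version.
from llm_core.llm import LLaVACPPModel

model = LLaVACPPModel(
    name="mistral-7b-instruct-v0.1.Q4_K_M.gguf",  # placeholder model file
    llama_cpp_kwargs={
        "verbose": True,  # forward llama.cpp / CLIP loading logs to stderr
    },
)
```

If the extra logging still isn't enough, running the script under strace (for example `strace -f -o trace.log python your_script.py`) records which file access or syscall immediately precedes the abort.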

Leon-Sander commented 7 months ago

I found the error: I had set the clip model path to just the file name as well, assuming the model lookup would happen there too.
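For anyone hitting the same crash: the CLIP/projector model path appears to be handed straight to llama-cpp-python rather than resolved by name, so it needs to be an actual path on disk. A minimal sketch of the working setup, using llama-cpp-python's `Llava15ChatHandler` as in the linked article (file paths are placeholders):

```python
# Sketch of the fix: give the chat handler a full path to the CLIP/projector
# model instead of just its file name. Paths below are placeholders.
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

chat_handler = Llava15ChatHandler(
    clip_model_path="/models/llava/mmproj-model-f16.gguf",  # full path, not just the name
)

llm = Llama(
    model_path="/models/llava/ggml-model-q4_k.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,
)
```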