Woolverine94 / biniou

a self-hosted webui for 30+ generative ai
GNU General Public License v3.0

ModuleNotFoundError: No module named 'llama_cpp' #26

Closed woehrer12 closed 1 week ago

woehrer12 commented 1 month ago

**Describe the bug:** I installed it with Docker on an Ubuntu 22 server. Then I get this error:

```
[biniou 🧠]: Detected TCMalloc_minimal installation : using it.
Traceback (most recent call last):
  File "/home/biniou/biniou/webui.py", line 14, in <module>
    from ressources import *
  File "/home/biniou/biniou/ressources/__init__.py", line 2, in <module>
    from .llamacpp import *
  File "/home/biniou/biniou/ressources/llamacpp.py", line 3, in <module>
    from llama_cpp import Llama
ModuleNotFoundError: No module named 'llama_cpp'
```

Woolverine94 commented 1 month ago

Hello @woehrer12,

Thanks for your feedback and your interest in biniou.

It complains about not finding the llama-cpp-python package, which was probably not installed during the build of the container image.
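A quick way to confirm the diagnosis from inside the container is to check whether Python can locate the module at all. This is a minimal sketch (the helper name is hypothetical); `llama_cpp` is the import name shown in the traceback, provided by the llama-cpp-python package:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the named module can be found in this environment."""
    return importlib.util.find_spec(name) is not None

# Prints False if llama-cpp-python is missing from the environment
print(has_module("llama_cpp"))
```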

Could you :

CPU-only :

```shell
docker build --no-cache --progress=plain -t biniou https://github.com/Woolverine94/biniou.git 2>&1 | tee build.log
```

CUDA :

```shell
docker build --no-cache --progress=plain -t biniou https://raw.githubusercontent.com/Woolverine94/biniou/main/CUDA/Dockerfile 2>&1 | tee build.log
```

and post the content of build.log here?

woehrer12 commented 1 month ago

I used this for the installation: https://github.com/Woolverine94/biniou#dockerfile

It runs virtualized, as a CT on Proxmox. It's an Ubuntu 22 system with Docker installed on it.

build.log

Woolverine94 commented 1 month ago

Thanks for your answer @woehrer12.

I don't get it. There's no error in your log.

Llama-cpp-python seems to be installed and should be working.

Building the Dockerfile inside a virtual machine can sometimes provoke errors when compiling llama-cpp-python, but that doesn't seem to be the case here.
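For reference, one common reason llama-cpp-python builds misbehave inside virtual machines is that the virtual CPU doesn't expose instruction-set extensions such as AVX2 that the default build assumes. A minimal sketch to check from inside the guest (assumes a Linux `/proc/cpuinfo` layout; the helper name is hypothetical):

```python
def has_cpu_flag(cpuinfo_text: str, flag: str) -> bool:
    """Return True if a CPU flag (e.g. 'avx2') appears in /proc/cpuinfo text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line lists every extension as a space-separated token
            return flag in line.split()
    return False

# On a Linux guest, feed it the real file:
# with open("/proc/cpuinfo") as f:
#     print(has_cpu_flag(f.read(), "avx2"))
```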

Woolverine94 commented 1 week ago

Closing this issue as there have been no updates for more than a month.

Please don't hesitate to re-open it if required.