Open vmajor opened 1 year ago
This could be related to an outdated `llama-cpp-python`, which does not support the latest quantization methods. Running `pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir` locally fixed the issue. I will see if I can rebuild the Docker container by editing `requirements.txt`.
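For anyone trying the same workaround, the rebuild would look roughly like this. This is a sketch, assuming the image installs its Python dependencies from `requirements.txt`; the version pin is illustrative, not a known-good value:

```shell
# After bumping llama-cpp-python in requirements.txt
# (e.g. an unpinned or newer entry), rebuild without cache
# so pip actually reinstalls it inside the image:
docker-compose build --no-cache
docker-compose up
```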
Trying to run babyagi with

```
docker-compose up
```

results in `AssertionError: Model can't be found.` The model path is correct on the host file system, which makes me think the dockerised babyagi is looking for the model inside the Docker image.
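If that is the case, bind-mounting the host model directory into the container should make the path resolvable. A minimal sketch of the compose change; the service name `babyagi`, both paths, and the environment variable name are assumptions, so substitute whatever this repo's compose file actually uses:

```yaml
# docker-compose.yml (fragment) — expose the host model dir to the container
services:
  babyagi:
    volumes:
      # host path on the left, path seen inside the container on the right
      - /path/on/host/models:/app/models
    environment:
      # hypothetical variable; point it at the in-container path
      - LLAMA_MODEL_PATH=/app/models/ggml-model.bin
```

With a mount like this, the model path configured for babyagi must refer to the container-side path (`/app/models/...`), not the host path.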