Closed by jmtatsch 1 year ago
We shouldn't make llama a requirement for people who are fine with OpenAI only. It's a pretty heavy install. :-(
Maybe create a separate service?
llama-cpp-python==0.1.36 should be added to requirements.txt
The babyagi service in docker-compose.yml needs its memory limits opened up: `ulimits: memlock: -1`
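A minimal sketch of what the two changes above might look like in docker-compose.yml. The service name `babyagi` comes from the thread; the image/build details are assumptions for illustration:

```yaml
# Sketch only: build context and version pin are assumptions.
services:
  babyagi:
    build: .
    ulimits:
      # -1 removes the locked-memory cap so llama.cpp can
      # mlock/mmap model weights without hitting the default limit.
      memlock: -1
```

And in requirements.txt, the pinned dependency suggested above:

```
llama-cpp-python==0.1.36
```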
I made a Docker example here (GPU support):
https://github.com/yoheinakajima/babyagi/issues/275#issuecomment-1720168614