KoboldAI / KoboldAI-Client

For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp
https://koboldai.com
GNU Affero General Public License v3.0

docker-cuda.sh issues #333

Open jason-brian-anderson opened 1 year ago

jason-brian-anderson commented 1 year ago

I worked around this by moving USER root and the apt-get package installs to the top of the Dockerfile, but I still cannot connect to the container from the host, even though docker-compose says the service is up. So I don't think I solved it correctly...
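For reference, a minimal sketch of the reordering described above. This is an illustration only, not the repo's actual docker-cuda Dockerfile: the base image tag, apt package list, and the micromamba username are assumptions (the log below suggests the home directory is /home/micromamba).

```dockerfile
# Sketch: switch to root and install system packages before the micromamba
# environment build, so apt-get does not run as the unprivileged user.
FROM mambaorg/micromamba:latest   # placeholder base image/tag

USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends git build-essential && \
    rm -rf /var/lib/apt/lists/*

# Drop back to the unprivileged user for the environment install.
USER micromamba
COPY --chown=micromamba:micromamba env.yml /home/micromamba/env.yml
RUN micromamba install -y -n base -f /home/micromamba/env.yml && \
    micromamba clean --all --yes
```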

Here was the original problem:

#0 294.4 Installing pip packages: flask-cloudflared==0.0.10, flask-ngrok, lupa==1.10, transformers==4.24.0, huggingface_hub==0.12.1, safetensors, accelerate, git+https://github.com/VE-FORBRYDERNE/mkultra
#0 294.8 ERROR: Could not open requirements file: [Errno 2] No such file or directory: '/home/micromamba/mambafKektToRoF5'
#0 295.1 critical libmamba pip failed to install packages
------
failed to solve: executor failed running [/usr/local/bin/_dockerfile_shell.sh micromamba install -y -n base -f /home/micromamba/env.yml]: exit code: 1
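For context, the failing step is micromamba solving env.yml; libmamba writes the pip entries to a temporary requirements file (the /home/micromamba/mambaf... path in the error) and then cannot open it. A rough sketch of the shape of that env.yml follows: only the pip list is taken from the log above, everything else (name, channels, Python version) is assumed.

```yaml
# Sketch of env.yml; pip packages copied from the build log, rest is placeholder.
name: base
channels:
  - conda-forge
dependencies:
  - python=3.8        # placeholder version
  - pip
  - pip:
      - flask-cloudflared==0.0.10
      - flask-ngrok
      - lupa==1.10
      - transformers==4.24.0
      - huggingface_hub==0.12.1
      - safetensors
      - accelerate
      - git+https://github.com/VE-FORBRYDERNE/mkultra
```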
jason-brian-anderson commented 1 year ago

Kind of wondering if the micromamba base is the source of the issues.
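One way to test that hypothesis is to run the same installs interactively in a bare micromamba container and see whether the pip step fails the same way. A sketch, assuming the mambaorg/micromamba image and the package list from the log above; adjust the image tag and Python version to match the actual Dockerfile:

```sh
# Reproduce the env install outside the project Dockerfile to isolate
# whether the micromamba/libmamba pip handoff itself is the problem.
docker run --rm -it mambaorg/micromamba:latest bash -lc '
  micromamba install -y -n base -c conda-forge python=3.8 pip git &&
  micromamba run -n base pip install \
    flask-cloudflared==0.0.10 flask-ngrok lupa==1.10 \
    transformers==4.24.0 huggingface_hub==0.12.1 safetensors accelerate \
    git+https://github.com/VE-FORBRYDERNE/mkultra
'
```

If this succeeds, the problem is more likely the Dockerfile ordering or the temp-file handling during micromamba install -f env.yml than the base image itself.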