KoboldAI / KoboldAI-Client

For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp
https://koboldai.com
GNU Affero General Public License v3.0

libmamba failed to execute pre/post link script for cudatoolkit #299

Open antage opened 1 year ago

antage commented 1 year ago

When I tried to build the docker-cuda image, I got the following error:

#0 122.2
#0 122.2 error    libmamba response code: -1 error message: Invalid argument
#0 122.2 critical libmamba failed to execute pre/post link script for cudatoolkit
------
failed to solve: process "/usr/local/bin/_dockerfile_shell.sh micromamba install -y -n base -f /home/micromamba/env.yml" did not complete successfully: exit code: 1

OS: Arch Linux
Nvidia driver version: 530.41.03

P.S. oobabooga/text-generation-webui and AUTOMATIC1111/stable-diffusion-webui both work fine in Docker on my system.

ParetoOptimalDev commented 10 months ago

I also get this error with:

OS: NixOS
Nvidia driver version: 545.29.06

ParetoOptimalDev commented 10 months ago

I found a related issue:

https://github.com/mamba-org/micromamba-docker/issues/368

But it was closed in favor of:

https://github.com/mamba-org/mamba/issues/2501

It recommends a workaround:

docker build --ulimit nofile=262144:262144 ...
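That flag sets the open-file (nofile) limit for the containers Docker spawns during the build, which is what the failing pre/post link scripts appear to be sensitive to according to the linked mamba issue. As a quick sanity check (a hedged sketch, not from the issue itself), the limits the current shell would otherwise pass along can be printed with:

```shell
# Print the soft and hard open-file limits for the current shell.
# `docker build --ulimit nofile=262144:262144` overrides these for
# the build containers regardless of the host values shown here.
echo "soft=$(ulimit -Sn) hard=$(ulimit -Hn)"
```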

So I'm trying:

modified   docker-cuda/docker-compose.yml
@@ -2,6 +2,10 @@ version: "3.2"
 services:
   koboldai:
     build: .
+    ulimits:
+      nofile:
+        soft: 262144
+        hard: 262144
     environment:
       - DISPLAY=${DISPLAY} 
     network_mode: "host"

And I'm re-running the build now.

ParetoOptimalDev commented 10 months ago

That didn't fix it; I get the same error.