mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference
https://localai.io
MIT License

Feature: Support and/or documentation on rootless approach local-ai docker #3677

Open DanielBoettner opened 1 month ago

DanielBoettner commented 1 month ago

Is your feature request related to a problem? Please describe.

"The problem" I encounter is that the files like images which created are created by root. When mounting the output to the host, the files are owned by root. This can be fixed with a simple "sudo chown" manually.

I played around with "userns-remap" but had no real success so far. I also tried creating my own Dockerfile that uses "localai/localai:latest-aio-gpu-nvidia-cuda-12" (in my case) as a base image and adds another user, but I ran into permission issues.
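For context, the userns-remap attempt refers to Docker's daemon-level user namespace remapping, configured roughly like this in /etc/docker/daemon.json (sketch only; with "default", Docker creates and uses a dedicated dockremap user, and the daemon has to be restarted afterwards):

{
  "userns-remap": "default"
}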

I'm not familiar enough with LocalAI to make a good guess at the effort needed to switch this to a non-root user that could be mapped to the host user.

Describe the solution you'd like

A common approach is to let the container accept a USER_ID and GROUP_ID (typically 1000 and 1000) and map them to the user that calls the Python scripts.
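As an illustration of what this could look like from the caller's side, a minimal sketch (the image tag and mount path are taken from this thread; the current image may not actually work when started as an arbitrary UID, which is exactly the gap this request is about):

docker run --user "$(id -u):$(id -g)" \
  -p 8080:8080 \
  -v "$PWD/models:/build/models" \
  localai/localai:latest-aio-gpu-nvidia-cuda-12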

Describe alternatives you've considered

I have no other ideas at the moment on how to solve this.

Additional context

Native Ubuntu 24.04 and WSL Ubuntu 24.04 (on Windows 11)

michaelwhitford commented 1 month ago

I have this working with rootless Podman on Linux Mint 22. The following systemd container file (a Podman Quadlet unit) is how I start/stop the service, using systemctl --user start localai as the unprivileged user.

~/.config/containers/systemd/localai.container:

[Unit]
Description=localai container
After=local-fs.target network-online.target

[Container]
# pass all NVIDIA GPUs into the container via CDI
AddDevice=nvidia.com/gpu=all
AutoUpdate=registry
Environment=DEBUG=true
Environment=LOCALAI_SINGLE_ACTIVE_BACKEND=true
Environment=LOCALAI_F16=true
Environment=LOCALAI_THREADS=16
Image=quay.io/go-skynet/local-ai:v2.20.1-cublas-cuda12-ffmpeg
Label=app=localai
PublishPort=5000:8080
# UserNS=keep-id is not needed for the localai container:
# everything runs as root inside the container, and with rootless podman
# container root maps back to the unprivileged host user by default
#UserNS=keep-id
# :Z relabels the volume for SELinux; it is ignored on systems without SELinux
Volume=/ai/localai/models:/build/models:Z

[Service]
TimeoutStartSec=900

[Install]
WantedBy=multi-user.target default.target
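
After placing the file, the usual systemd user-session steps apply (not specific to this setup): reload the user daemon so the Quadlet unit is generated, start it, and optionally enable lingering so the container keeps running without an active login:

systemctl --user daemon-reload
systemctl --user start localai
loginctl enable-linger "$USER"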