mudler / LocalAI

:robot: The free, Open Source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers, and many more model architectures. It can generate text, audio, video, and images, and also has voice-cloning capabilities.
https://localai.io
MIT License

Feature request: ability to upload a private SSL certificate provided by a network security solution, for downloading models #2542

Open seonghobae opened 3 weeks ago

seonghobae commented 3 weeks ago

Is your feature request related to a problem? Please describe.

I'm always frustrated when I try to download a model on a network that uses an SSL-inspection security solution (https://www.soosanint.com/contents.php?con_id=solution2_eng). This issue was discussed in https://github.com/mudler/LocalAI/discussions/2446.

Describe the solution you'd like

Describe alternatives you've considered

Additional context

Discussed in https://github.com/mudler/LocalAI/discussions/2446

Originally posted by **seonghobae** May 31, 2024

I'm encountering the following error while trying to download a model file with LocalAI:

```
Error failed to download file "/build/models/LocalAI-Llama3-8b-Function-Call-v0.2-q4_k_m.bin": Get "https://huggingface.co/mudler/LocalAI-Llama3-8b-Function-Call-v0.2-GGUF/resolve/main/LocalAI-Llama3-8b-Function-Call-v0.2-q4_k_m.bin": tls: failed to verify certificate: x509: certificate signed by unknown authority
```

Here is the **`docker-compose.yml`** file I am using:

```yaml
services:
  localai-api:
    image: localai/localai:latest-aio-gpu-nvidia-cuda-12
    # localai/localai:latest-aio-cpu
    # For a specific version:
    # image: localai/localai:v2.16.0-aio-cpu
    # For Nvidia GPUs decomment one of the following (cuda11 or cuda12):
    # image: localai/localai:v2.16.0-aio-gpu-nvidia-cuda-11
    # image: localai/localai:v2.16.0-aio-gpu-nvidia-cuda-12
    # image: localai/localai:latest-aio-gpu-nvidia-cuda-11
    # image: localai/localai:latest-aio-gpu-nvidia-cuda-12
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]
      interval: 1m
      timeout: 20m
      retries: 5
    ports:
      - 8080:8080
    environment:
      - DEBUG=true
      # ...
    volumes:
      - ./models:/build/models:cached
    restart: always
```

I suspect this issue is related to a network SSL-inspection security solution that requires the installation of a private certificate. In other Node.js projects, I have resolved similar issues by building the Dockerfile separately to include the necessary certificates. How can I resolve this issue for LocalAI?
mudler commented 3 weeks ago

@seonghobae we have a custom-ca-certs top-level folder: https://github.com/mudler/LocalAI/tree/master/custom-ca-certs. Did you try putting your certificates in there and rebuilding the container image? You should also be able to bind-mount a directory containing your certs to that path in the container.
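A lighter-weight alternative to a full rebuild is a small derived image. This is only a sketch: it assumes the published LocalAI images are Debian/Ubuntu-based (so `update-ca-certificates` is available), and `my-corporate-ca.crt` is a placeholder filename for the private CA.

```dockerfile
# Hypothetical Dockerfile: extend the published image instead of rebuilding LocalAI.
FROM localai/localai:latest-aio-gpu-nvidia-cuda-12

# Placeholder: the private CA certificate injected by the SSL-inspection appliance.
# The .crt extension matters: update-ca-certificates only picks up .crt files.
COPY my-corporate-ca.crt /usr/local/share/ca-certificates/

# Regenerate the system CA bundle so the download's TLS verification trusts the private CA.
RUN update-ca-certificates
```

After `docker build -t localai-with-ca .`, reference `localai-with-ca` as the `image:` in `docker-compose.yml` in place of the upstream tag.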

seonghobae commented 2 weeks ago

> @seonghobae we have a custom-ca-certs top-level folder: https://github.com/mudler/LocalAI/tree/master/custom-ca-certs. Did you try putting your certificates in there and rebuilding the container image? You should also be able to bind-mount a directory containing your certs to that path in the container.

Thanks, I see that, but compiling it that way is a pain in the arse. Is building the image myself really the only option?

tom-chamberlain-glitch commented 3 days ago

@seonghobae I was able to resolve this by mounting my certificates directly into the SSL folder: `-v /folder/containing/crt/files:/etc/ssl/certs`
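Applied to the `docker-compose.yml` from the original report, that mount would look something like the sketch below. The host path is illustrative, and note one caveat: a bind mount over `/etc/ssl/certs` hides the container's original CA bundle, so the host directory needs to contain the full set of trusted certificates, not only the private CA.

```yaml
services:
  localai-api:
    image: localai/localai:latest-aio-gpu-nvidia-cuda-12
    volumes:
      - ./models:/build/models:cached
      # Illustrative host path: a directory holding the CA bundle plus the
      # private CA .crt from the SSL-inspection appliance. This mount
      # replaces /etc/ssl/certs inside the container entirely.
      - /folder/containing/crt/files:/etc/ssl/certs
```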