mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Error generating images with LocalAI integrated with Nextcloud AI (CPU only) #3017

Open adripo opened 1 month ago

adripo commented 1 month ago

LocalAI version: localai/localai:master-ffmpeg-core

Environment, CPU architecture, OS, and Version:

Description:

I am encountering an error while generating images using LocalAI integrated with Nextcloud AI. The error appears in the LocalAI logs as follows:

ERR Server error error="could not load model: rpc error: code = Unknown desc = stat /build/models/stablediffusion: no such file or directory" ip=172.18.1.2 latency=2.004227305s method=POST status=500 url=/v1/images/generations

However, when I use the LocalAI web interface directly, image generation works fine.
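
To isolate whether the failure is in LocalAI itself or in the integration, the same request can be replayed directly against the API. A minimal sketch, assuming LocalAI is reachable at http://localhost:8080 and the image model is registered under the name stablediffusion-cpp (both are assumptions, adjust to your setup):

import requests  # pip install requests

# Assumed address and model name; adjust to your instance.
BASE_URL = "http://localhost:8080"
MODEL = "stablediffusion-cpp"

resp = requests.post(
    f"{BASE_URL}/v1/images/generations",
    json={"model": MODEL, "prompt": "cat", "size": "512x512"},
    timeout=300,
)
print(resp.status_code)   # 200 if the model loads, 500 with the error quoted above
print(resp.text[:300])    # truncated response body

If this direct call succeeds while the Nextcloud-initiated one fails, the difference lies in the request the integration builds rather than in the model itself.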

Steps to Reproduce:

  1. Integrate LocalAI with Nextcloud AI.
  2. Attempt to generate an image through the Nextcloud AI interface.

Attempts to Resolve:

Observations:

Expected Behavior:

Image generation should work seamlessly when using Nextcloud AI integration with LocalAI, similar to when using the LocalAI web interface directly.

moellert commented 1 month ago

I can try to reproduce this on Monday. Which Nextcloud and app versions are you using, and what are your settings for the integration? Can you get the request from the LocalAI log and not only the error message?

adripo commented 1 month ago

Sure.

I am using:

Here is the docker compose I am using:


services:
  db:
    image: postgres:16-alpine
    environment:
      - POSTGRES_DB=nextcloud
      - POSTGRES_USER=nextcloud
      - POSTGRES_PASSWORD=nextcloud
    volumes:
      - db:/var/lib/postgresql/data
    restart: unless-stopped

  app:
    image: nextcloud:29-apache
    environment:
      - POSTGRES_HOST=db
    ports:
      - 8000:80
    volumes:
      - nextcloud:/var/www/html
    restart: unless-stopped
    depends_on:
      - db

  local-ai:
    image: localai/localai:master-ffmpeg-core
    environment:
      #- REBUILD=true
      - CMAKE_ARGS=-DGGML_AVX512=OFF
    ports:
      - 8080:8080
    volumes:
      - ai_models:/build/models
    restart: unless-stopped

volumes:
  db:
  nextcloud:
  ai_models:

And I installed the stablediffusion-cpp model from the LocalAI interface.

The error I see in nextcloud is:

API request error : Server error: `POST http://local-ai:8080/v1/images/generations` resulted in a `500 Internal Server Error` response: {"error":{"code":500,"message":"could not load model: rpc error: code = Unknown desc = stat /build/models/stablediffusio (truncated...)

The configuration is simple: I just added the local URL http://local-ai:8080 in the app settings and switched the text completion endpoint selector to chat completions. It automatically set all the other fields.
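
To cross-check what the app auto-detected, the model names LocalAI exposes can be listed via its OpenAI-compatible /v1/models endpoint. A small sketch, assuming the same http://local-ai:8080 address used in the compose file above:

import requests  # pip install requests

# Same address the Nextcloud app points at; adjust if your setup differs.
resp = requests.get("http://local-ai:8080/v1/models", timeout=30)
resp.raise_for_status()

# The listing returns {"object": "list", "data": [{"id": ...}, ...]}.
for model in resp.json().get("data", []):
    print(model["id"])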

I can also confirm that all the other features work well: translation, text processing, speech-to-text.

(Two screenshots attached: 2024-07-28_01-54-47-408_brave, 2024-07-28_01-57-00-193_brave)

mudler commented 1 month ago

@adripo did you install any model in LocalAI?

adripo commented 1 month ago

@mudler Yes, as I already mentioned in my previous response, I installed stablediffusion-cpp for image generation. It works fine from the LocalAI interface, but returns that specific error when running from Nextcloud. Text generation works well with any model.

And I installed the stablediffusion-cpp model from the LocalAI interface.

moellert commented 1 month ago

I have the same versions installed, except I have local-ai:v2.19.2-ffmpeg and Nextcloud is not installed via Docker. When I run the image generation, the first time LocalAI returns:

10:38AM ERR failed starting/connecting to the gRPC service error="rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:40341: connect: connection refused\""
10:38AM DBG GRPC Service NOT ready
10:38AM ERR Server error error="grpc service not ready" ip=172.20.0.1 latency=40.353998174s method=POST status=500 url=/v1/images/generations

On the second try everything works in LocalAI, but Nextcloud does not show the result. This is the request sent from Nextcloud:

10:37AM DBG Request received: {"model":"","language":"","translate":false,"n":0,"top_p":null,"top_k":null,"temperature":null,"max_tokens":null,"echo":false,"batch":0,"ignore_eos":false,"repeat_penalty":0,"repeat_last_n":0,"n_keep":0,"frequency_penalty":0,"presence_penalty":0,"tfz":null,"typical_p":null,"seed":null,"negative_prompt":"","rope_freq_base":0,"rope_freq_scale":0,"negative_prompt_scale":0,"use_fast_tokenizer":false,"clip_skip":0,"tokenizer":"","file":"","size":"1024x1024","prompt":"cat","instruction":"","input":null,"stop":null,"messages":null,"functions":null,"function_call":null,"stream":false,"mode":0,"step":0,"grammar":"","grammar_json_functions":null,"backend":"","model_base_name":""}

It seems Nextcloud does not ask for a specific model; in the past it asked for dall-e and this was not configurable. Maybe this helps to figure out whether this is a Nextcloud or a LocalAI problem.
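
One way to test that hypothesis is to replay the captured request once with the empty model field and once with the installed model name filled in. A sketch, with stablediffusion-cpp assumed as the model name from earlier in the thread:

import requests  # pip install requests

BASE_URL = "http://localhost:8080"  # adjust to your LocalAI instance

def try_generation(model):
    # Replay the images/generations call with a given (possibly empty) model name.
    payload = {"model": model, "prompt": "cat", "size": "1024x1024"}
    resp = requests.post(f"{BASE_URL}/v1/images/generations", json=payload, timeout=300)
    print(f"model={model!r}: HTTP {resp.status_code}")

try_generation("")                     # what Nextcloud sends, per the debug log above
try_generation("stablediffusion-cpp")  # assumed name of the model installed from the gallery

If only the empty-model request fails, the problem is on the integration side (the model name it sends), not in LocalAI's backend.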

manuelkamp commented 3 weeks ago

Has anyone figured it out? Just in case it helps, here is the full error message, since the original post only contains a truncated version:

OpenAI/LocalAI's text to image generation failed with: API request error: could not load model: rpc error: code = Unknown desc = stat /models/stablediffusion: no such file or directory

edit: I made a PR which fixes it if you are running the "stablediffusion-cpp" model: https://github.com/nextcloud/integration_openai/pull/118