mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

ErrorCode 500, cannot unmarshal array into Go struct field Message.messages.content of type string #1383

Closed: monsterlady closed this issue 11 months ago

monsterlady commented 11 months ago

LocalAI version:

quay.io/go-skynet/local-ai:v1.40.0-cublas-cuda12-ffmpeg

Environment, CPU architecture, OS, and Version:

Linux c970feda3bb9 5.15.133.1-microsoft-standard-WSL2 #1 SMP Thu Oct 5 21:02:42 UTC 2023 x86_64 GNU/Linux

Describe the bug

  1. Follow the docs to set up GPT vision.
  2. Create the YAML model config file:
    backend: llama
    context_size: 4096
    f16: true
    threads: 1
    gpu_layers: 90
    mmap: true
    name: llava
    roles:
      user: "USER:"
      assistant: "ASSISTANT:"
      system: "SYSTEM:"
    parameters:
      model: ggml-model-q4_k.gguf
      temperature: 0.2
      top_k: 40
      top_p: 0.95
    template:
      chat: chat-simple
    mmproj: mmproj-model-f16.gguf
  3. docker-compose up -d
  4. Send the request via Postman or curl:

     curl --location 'http://localhost:7080/v1/chat/completions' \
       --header 'Content-Type: application/json' \
       --data '{
         "model": "llava",
         "messages": [
           {
             "role": "user",
             "content": [
               { "type": "text", "text": "What is in the image?" },
               {
                 "type": "image_url",
                 "image_url": {
                   "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"
                 }
               }
             ],
             "temperature": 0.9
           }
         ]
       }'
  5. Receive the error:

     {
       "error": {
         "code": 500,
         "message": "failed reading parameters from request:json: cannot unmarshal array into Go struct field Message.messages.content of type string",
         "type": ""
       }
     }
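For what it's worth, the 500 comes straight from Go's JSON decoding: in this release the content field of a chat message is decoded as a plain string, so the OpenAI vision-style array of content parts cannot be unmarshalled into it. A minimal Go sketch that reproduces the same error (the struct names here are illustrative stand-ins, not LocalAI's actual request types):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Simplified stand-in for the request schema in this release; the real
    // LocalAI types differ, this only illustrates the failure mode.
    type Message struct {
        Role    string `json:"role"`
        Content string `json:"content"` // plain string: an array of parts is rejected
    }

    type Request struct {
        Model    string    `json:"model"`
        Messages []Message `json:"messages"`
    }

    func main() {
        // Array-of-parts content, as sent by the vision-style request above.
        payload := []byte(`{
            "model": "llava",
            "messages": [{
                "role": "user",
                "content": [
                    {"type": "text", "text": "What is in the image?"},
                    {"type": "image_url", "image_url": {"url": "https://example.com/boardwalk.jpg"}}
                ]
            }]
        }`)

        var req Request
        if err := json.Unmarshal(payload, &req); err != nil {
            // json: cannot unmarshal array into Go struct field Message.messages.content of type string
            fmt.Println(err)
        }
    }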

To Reproduce

See the steps above.

Expected behavior

Should have returned a 200 response with a text description of the image.

Logs

Additional context

mudler commented 11 months ago

llava is supported only on master and on 2.0. Closing the issue
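For readers landing here on an older build: the array form of content only works where the message schema accepts either a plain string or a list of content parts. A rough sketch of that decoding pattern in Go (purely illustrative, not LocalAI's actual implementation):

    package schema

    import "encoding/json"

    // Hypothetical content part matching the OpenAI-style vision payload;
    // names are illustrative only.
    type ContentPart struct {
        Type     string `json:"type"`
        Text     string `json:"text,omitempty"`
        ImageURL *struct {
            URL string `json:"url"`
        } `json:"image_url,omitempty"`
    }

    type FlexibleContent struct {
        Text  string        // set when "content" is a plain string
        Parts []ContentPart // set when "content" is an array of parts
    }

    func (c *FlexibleContent) UnmarshalJSON(b []byte) error {
        // Try the plain-string form first: "content": "hello".
        var s string
        if err := json.Unmarshal(b, &s); err == nil {
            c.Text = s
            return nil
        }
        // Fall back to the array-of-parts form used by vision requests.
        return json.Unmarshal(b, &c.Parts)
    }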