lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Local Interface Server fails when the request content is empty #2

Open nosilence1994 opened 1 month ago

nosilence1994 commented 1 month ago

The Local Interface Server works well with the following example:

curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{ 
    "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
    "messages": [ 
      { "role": "system", "content": "Always answer in rhymes." },
      { "role": "user", "content": "Introduce yourself." }
    ], 
    "temperature": 0.7, 
    "max_tokens": -1,
    "stream": true
}'

However, when the content is empty, the request does not work. Setting the system message's content to an empty string produces an error:

[2024-05-04 01:10:59.564] [INFO] [LM STUDIO SERVER] Processing queued request...
[2024-05-04 01:10:59.564] [INFO] Received POST request to /v1/chat/completions with body: {
  "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
  "messages": [
    {
      "role": "system",
      "content": ""
    },
    {
      "role": "user",
      "content": "Introduce yourself."
    }
  ],
  "temperature": 0.7,
  "max_tokens": -1,
  "stream": true
}
[2024-05-04 01:10:59.565] [ERROR] [Server Error] {"title":"'messages' array must only contain objects with a 'content' field that is not empty"}

I just hope that when the system role's content is empty, the request can still work.
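
Until this is fixed, one workaround on the client side is simply to omit the system message when its content is empty. For example, this variant of the request above (with the empty system message removed) should pass the server's validation:

curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF",
    "messages": [
      { "role": "user", "content": "Introduce yourself." }
    ],
    "temperature": 0.7,
    "max_tokens": -1,
    "stream": true
}'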

ryan-the-crayon commented 1 month ago

Thanks for the report. We will investigate this soon.