wandb / openui

OpenUI lets you describe UI using your imagination, then see it rendered live.
https://openui.fly.dev
Apache License 2.0

Running docker a second time with a different API key throws an invalid model name error #185

Open zoultrex opened 2 months ago

zoultrex commented 2 months ago

After trying out OpenUI with Gemini, I decided to give it a go using Groq. But after running the command

docker run --rm --name openui -p 7878:7878 -e GROQ_API_KEY=API_KEY_HERE_ETC -e OLLAMA_HOST=http://host.docker.internal:11434 ghcr.io/wandb/openui

there were some errors in the terminal complaining that the model name passed in (gemini-1.5-pro) is invalid. Do I need to clear the container images every time I try a different API key?

  File "/app/openui/server.py", line 215, in chat_completions
    raise HTTPException(status_code=e.status_code, detail=msg)
fastapi.exceptions.HTTPException: 400: Error code: 400 - {'error': {'message': {'error': 'chat_completion: Invalid model name passed in model=gemini-1.5-pro'}, 'type': 'None', 'param': 'None', 'code': 400}}
vanpelt commented 2 months ago

You'll need to change the model to a Groq model from the settings button. That error says it attempted to use gemini-1.5-pro, which isn't available because no GEMINI API key was passed into the container. There's likely a bug where the UI caches the last model used even if it's no longer available, but simply refreshing the UI and selecting the Groq model from the settings button should do it.
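The caching bug described above could be addressed with a simple fallback when restoring the saved model. This is only a sketch of the idea, not OpenUI's actual code; `resolve_model` and its arguments are hypothetical names:

```python
def resolve_model(cached_model: str, available_models: list[str]) -> str:
    """Return the cached model if it is still served, otherwise fall back.

    If the previously selected model (e.g. gemini-1.5-pro) is no longer
    available because its API key wasn't passed to the container, pick the
    first model that is actually available instead of sending a request
    that the backend will reject with a 400.
    """
    if cached_model in available_models:
        return cached_model
    return available_models[0]


# With only Groq models available, a stale Gemini selection falls back:
print(resolve_model("gemini-1.5-pro", ["llama3-70b-8192", "llama3-8b-8192"]))
```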

zoultrex commented 2 months ago

After refreshing everything to start again and selecting llama3 Groq 70B or 8B, I now get this error message:

Error! 400 Error code: 400 - {'error': {'message': 'message[1].content must be a string', 'type': 'invalid_request_error'}}
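This second 400 suggests the request sent `content` as a list of parts (the OpenAI multimodal format) where the backend expects a plain string. A hedged sketch of a normalization step that would avoid it; `flatten_content` is a hypothetical helper, not part of OpenUI:

```python
def flatten_content(messages: list[dict]) -> list[dict]:
    """Collapse list-form message content into plain strings.

    Some OpenAI-compatible backends reject content given as a list of
    typed parts (hence "message[1].content must be a string"), so join
    the text parts into a single string before sending the request.
    """
    normalized = []
    for message in messages:
        content = message["content"]
        if isinstance(content, list):
            content = "".join(
                part.get("text", "")
                for part in content
                if part.get("type") == "text"
            )
        normalized.append({**message, "content": content})
    return normalized


messages = [
    {"role": "system", "content": "You are a UI generator."},
    {"role": "user", "content": [{"type": "text", "text": "A login form"}]},
]
print(flatten_content(messages)[1]["content"])
```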

[screenshot of the error]

The settings: [screenshot]