lmstudio-ai / lmstudio-bug-tracker

Bug tracking for the LM Studio desktop application

Support for json_object response_format #189

Open sabaimran opened 2 weeks ago

sabaimran commented 2 weeks ago

Hey folks! LM Studio offers an OpenAI API-compatible local server. The OpenAI API supports a `response_format.type` of `json_object`, but the LM Studio server doesn't recognize this type and returns an error. Here is the documentation.

Here's an error message from one of our users:

server-1    | [16:11:16.611978] DEBUG    before_sleep.py:65
server-1    |   khoj.processor.conversation.openai.utils: Retrying khoj.processor.conversation.openai.utils.completion_with_backoff in 0.31725788323339144 seconds as it raised BadRequestError: Error code: 400 - {'error': "'response_format.type' must be 'json_schema'"}.

Could you add support for json_object? Or at least allow the request to pass through without returning an error? Thanks in advance.
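As a possible client-side stopgap, the `json_schema` form that LM Studio does accept could loosely emulate `json_object` with a permissive schema. A sketch, assuming the OpenAI structured-outputs request shape (the schema name here is arbitrary):

```python
import json

# Permissive json_schema response_format that emulates json_object
# ("return any JSON object"). The name field is an arbitrary label.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "any_object",  # hypothetical, arbitrary schema name
        "schema": {
            "type": "object",
            "additionalProperties": True,  # allow any keys and values
        },
    },
}

print(json.dumps(response_format, indent=2))
```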

This is for compatibility with https://github.com/khoj-ai/khoj.