Hey folks! LMStudio offers an OpenAI-API-compatible local server. The OpenAI API offers a `response_format.type` of `json_object`, but it seems the LMStudio proxy server doesn't recognize this type and throws an error. Here is the documentation.
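For context, here is a minimal sketch of the kind of request that triggers the error. It assumes LMStudio's default local endpoint (`http://localhost:1234/v1`); the model name and API key are placeholders:

```python
from openai import OpenAI

# LMStudio's local server speaks the OpenAI chat completions API;
# the api_key value is ignored locally but required by the client.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder: whichever model is loaded in LMStudio
    messages=[{"role": "user", "content": "Reply with a JSON object."}],
    # Accepted by the OpenAI API, but rejected by LMStudio with a 400
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)
```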
Here's an error message from one of our users:
```
server-1 | [16:11:16.611978] DEBUG khoj.processor.conversation.openai.utils: Retrying khoj.processor.conversation.openai.utils.completion_with_backoff in 0.31725788323339144 seconds as it raised BadRequestError: Error code: 400 - {'error': "'response_format.type' must be 'json_schema'"}. (before_sleep.py:65)
```
Could you add support for `json_object`? Or at least allow the request to pass through without returning an error? This is for compatibility with https://github.com/khoj-ai/khoj. Thanks in advance.
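For reference, the error suggests LMStudio currently only accepts the `json_schema` form. A hedged sketch of what that looks like, with an illustrative placeholder schema (the `reply` name and `answer` field are hypothetical):

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder, as above
    messages=[{"role": "user", "content": "Reply with a JSON object."}],
    response_format={
        "type": "json_schema",  # the only type LMStudio's error says it accepts
        "json_schema": {
            "name": "reply",  # hypothetical schema name
            "schema": {
                "type": "object",
                "properties": {"answer": {"type": "string"}},
                "required": ["answer"],
            },
        },
    },
)
```

Requiring callers to supply a full schema is a heavier contract than `json_object`, which only asks for syntactically valid JSON, so clients like khoj that target the OpenAI API can't simply switch without changes.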