BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Let proxy return JSON (for hugging face) #1654

Open · danielvanmil opened this issue 7 months ago

danielvanmil commented 7 months ago

What happened?

How to test:

Give the model a system prompt like:

    You are a helpful assistant that only communicates using JSON.
    The expected output from you has to be:
        "function_call": {
            "name": {function_name},
            "args": [],
            "ai_notes": {explanation}
        }

The proxy then returns:

    {
        "content": ".....",
        "role": "assistant"
    }

instead of:

    {
        "function_call": ".....",
        "role": "assistant"
    }

But it's also possible that I don't understand this completely; in that case, an explanation would be appreciated.
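
For context, here is a minimal sketch of how one might reproduce this through the proxy with the OpenAI Python client. The proxy address, API key, and model name are assumptions for illustration, not taken from this issue:

    import openai

    # Point the OpenAI client at the litellm proxy (address is hypothetical).
    client = openai.OpenAI(
        api_key="anything",                # the proxy manages the real credentials
        base_url="http://localhost:8000",  # assumed proxy address
    )

    response = client.chat.completions.create(
        model="huggingface/mistralai/Mistral-7B-Instruct-v0.1",  # example model
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant that only communicates "
                           "using JSON. The expected output from you has to be: "
                           '"function_call": {"name": ..., "args": [], "ai_notes": ...}',
            },
            {"role": "user", "content": "Call a function to fetch the weather."},
        ],
    )

    message = response.choices[0].message
    # Reported behavior: the JSON text lands in message.content,
    # while message.function_call stays None.
    print(message.content)
    print(message.function_call)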

Relevant log output

No response


krrishdholakia commented 7 months ago

Hey @danielvanmil, I can't tell whether Hugging Face supports a JSON mode.

Are you suggesting that when the user makes a function call, we dump the response (even if it isn't valid JSON) into the function_call field?
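
If I'm reading the suggestion right, it would look roughly like the post-processing step below. This is a sketch of the proposed behavior, not litellm's actual code, and the helper name is made up:

    import json

    def to_function_call(raw_content: str) -> dict:
        """Wrap raw model output as an OpenAI-style function_call dict.

        If the output parses as JSON containing a "function_call" key,
        return that; otherwise dump the raw string into "arguments" as-is.
        """
        try:
            parsed = json.loads(raw_content)
            if isinstance(parsed, dict) and "function_call" in parsed:
                return parsed["function_call"]
        except json.JSONDecodeError:
            pass  # non-JSON output: fall through and dump it verbatim
        return {"name": "unknown", "arguments": raw_content}

    # Non-JSON output gets dumped into the arguments field unchanged:
    print(to_function_call("not valid json at all"))
    # {'name': 'unknown', 'arguments': 'not valid json at all'}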

krrishdholakia commented 7 months ago

Happy to hop on a call to talk through this - https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat