BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Allow system messages + chat messages in palm api #542

Open krrishdholakia opened 12 months ago

krrishdholakia commented 12 months ago

The Feature

PaLM allows this via separate 'context' and 'messages' fields. Use those instead of passing the whole input in as a single 'prompt'.
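A minimal sketch of the requested mapping, assuming OpenAI-format input; the helper name is hypothetical and this is not litellm's actual implementation:

def openai_to_palm_chat(messages):
    # Hypothetical helper: collect system turns into PaLM's 'context',
    # keep the remaining chat turns as PaLM-style 'messages'.
    context = "\n".join(m["content"] for m in messages if m["role"] == "system")
    palm_messages = [
        {"author": m["role"], "content": m["content"]}
        for m in messages
        if m["role"] != "system"
    ]
    return {"context": context, "messages": palm_messages}

# openai_to_palm_chat([
#     {"role": "system", "content": "You are terse."},
#     {"role": "user", "content": "Hi"},
# ])
# -> {"context": "You are terse.",
#     "messages": [{"author": "user", "content": "Hi"}]}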

Motivation, pitch

Improve the PaLM API support.


krrishdholakia commented 12 months ago

reference: https://developers.generativeai.google/api/rest/generativelanguage/models/generateMessage
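As I read that reference, the generateMessage request body nests context, examples, and messages under a 'prompt' object, with the sampling parameters alongside it. A rough sketch of the shape (field names reconstructed from the docs, so treat them as approximate):

generate_message_body = {
    "prompt": {
        "context": "High-level instructions for the model.",
        "examples": [
            {"input": {"content": "example user turn"},
             "output": {"content": "example model reply"}},
        ],
        "messages": [
            {"author": "user", "content": "Actual user message."},
        ],
    },
    "temperature": 0.7,
    "candidateCount": 1,
    "topP": 0.95,
    "topK": 40,
}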

krrishdholakia commented 12 months ago
(screenshot attached)

The same applies for Vertex AI.
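For Vertex AI, the chat models expose the same idea through start_chat. A sketch against the Vertex AI PaLM chat SDK of that era (model name and values are illustrative, and vertexai.init / credentials are assumed to be configured already):

from vertexai.language_models import ChatModel, InputOutputTextPair

chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(
    context="You are a concise assistant.",  # system-style instructions
    examples=[InputOutputTextPair(input_text="Ping", output_text="Pong")],
)
response = chat.send_message("Hello there")
print(response.text)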

adi-kmt commented 11 months ago

So you want to add it here, with 'context' and 'examples' taken from the init, right?
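For reference, in the older google.generativeai PaLM SDK both context and examples are indeed supplied up front on the chat call, roughly like this (a sketch from memory; values are placeholders):

import google.generativeai as palm

palm.configure(api_key="...")  # placeholder key
response = palm.chat(
    context="You are a concise assistant.",  # system-style instructions
    examples=[("Ping", "Pong")],             # optional few-shot pairs
    messages=["Hello there"],
    temperature=0.7,
)
print(response.last)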

SujanShilakar commented 4 months ago

@krrishdholakia can you look at my PR for this issue: https://github.com/BerriAI/litellm/pull/3718

SujanShilakar commented 3 months ago

@krrishdholakia / @ishaan-jaff fixes for #3718, with a sample request payload below:


# Sample PaLM request payload: system-level instructions go in 'context',
# while chat turns (mixed authors here for testing) stay in 'messages'.
request_payload = {
    "context": "This is a test context.",
    "messages": [
        {"author": "user", "content": "User message 1."},
        {"author": "system", "content": "System message 1."},
        {"author": "user", "content": "User message 2."}
    ],
    "temperature": 0.7,
    "candidate_count": 1,
    "top_k": 40,
    "top_p": 0.9,
    "max_output_tokens": 100
}
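For completeness, the user-facing litellm call this is meant to support would look roughly like the following; litellm.completion is the real entry point, but the "palm/chat-bison" model string and response handling are quoted from memory, so treat them as illustrative:

import litellm

# OpenAI-format input: the system turn should land in PaLM's 'context'
# and the user turns in 'messages', rather than everything being
# flattened into a single 'prompt'.
response = litellm.completion(
    model="palm/chat-bison",
    messages=[
        {"role": "system", "content": "This is a test context."},
        {"role": "user", "content": "User message 1."},
        {"role": "user", "content": "User message 2."},
    ],
    temperature=0.7,
    max_tokens=100,
)
print(response.choices[0].message.content)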