BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Add safety setting feature for PaLM model #1227

Closed: tikendraw closed this issue 10 months ago

tikendraw commented 11 months ago

The Feature

The google.generativeai API supports safety settings. Add safety settings as optional parameters to the PaLM config class.
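
For reference, the upstream setting looks roughly like the sketch below. This is a hedged example, not taken from the thread: the model string and the `HarmCategory` / `HarmBlockThreshold` enum names are assumptions based on the PaLM-era `google.generativeai` SDK and should be checked against the installed version.

```python
import google.generativeai as palm
from google.generativeai import types

palm.configure(api_key="your-palm-api-key")  # placeholder key

# Safety settings are a list of category/threshold pairs passed alongside
# the prompt; each pair sets the blocking threshold for one harm category.
response = palm.generate_text(
    model="models/text-bison-001",
    prompt="Write a short poem about the sea.",
    safety_settings=[
        {
            "category": types.HarmCategory.HARM_CATEGORY_DEROGATORY,
            "threshold": types.HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
        },
    ],
)
print(response.result)
```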

Motivation, pitch

To add safety checks.

motin commented 10 months ago

I think this already works and has been working for quite some time: if you add the safety_settings kwarg to the completion() call when using a PaLM model, it is forwarded via PalmConfig by this line: https://github.com/BerriAI/litellm/blob/d90e04b531692dcb8672909f526c234d0f828595/litellm/llms/palm.py#L119
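
A hedged sketch of what that might look like end to end (assumptions not confirmed in the thread: the `palm/chat-bison` model name, the `PALM_API_KEY` environment variable, and the exact `HARM_CATEGORY_*` / `BLOCK_*` strings accepted by google.generativeai):

```python
import os
import litellm

os.environ["PALM_API_KEY"] = "your-palm-api-key"  # placeholder key

# Per the comment above, the safety_settings kwarg is forwarded to the
# underlying google.generativeai call by the palm.py line linked there.
response = litellm.completion(
    model="palm/chat-bison",
    messages=[{"role": "user", "content": "Write a short poem about the sea."}],
    safety_settings=[
        {"category": "HARM_CATEGORY_DEROGATORY", "threshold": "BLOCK_LOW_AND_ABOVE"},
        {"category": "HARM_CATEGORY_VIOLENCE", "threshold": "BLOCK_MEDIUM_AND_ABOVE"},
    ],
)
print(response.choices[0].message.content)
```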

krrishdholakia commented 10 months ago

Closing as this is resolved then?

krrishdholakia commented 10 months ago

Although I do believe we could have better docs on this.

@tikendraw @motin we'd welcome a doc contribution here

krrishdholakia commented 10 months ago

If you can share a working code snippet, I can add it to the docs - https://docs.litellm.ai/docs/providers/palm