Closed: jerome-wego closed this issue 3 weeks ago
@jerome-wego Thanks for raising this issue! Guardrails does not currently support LiteLLM's Router API internally. To use LiteLLM routers with the current version, you will need to wrap the method as a custom LLM. See here for more details: https://www.guardrailsai.com/docs/how_to_guides/llm_api_wrappers#build-a-custom-llm-wrapper
That said, we will take adding support for this into consideration.
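As a rough illustration of the workaround the comment above describes, the snippet below wraps a LiteLLM Router in a plain callable with the `(prompt, **kwargs) -> str` shape that the linked Guardrails docs describe for custom LLM wrappers. This is a hedged sketch, not a verified integration: the wrapper signature, the `router.completion` call, and the OpenAI-style response shape are assumptions based on the docs, and `make_router_llm` is a hypothetical helper name.

```python
def make_router_llm(router, model: str):
    """Return a callable usable as a Guardrails custom LLM wrapper.

    `router` is assumed to be a litellm Router instance; the
    (prompt, **kwargs) -> str signature is the shape described in the
    Guardrails custom-LLM-wrapper guide (an assumption, not verified here).
    """

    def router_llm(prompt: str, **kwargs) -> str:
        # LiteLLM routers expose an OpenAI-style completion call;
        # the response is assumed to follow the OpenAI response schema.
        response = router.completion(
            model=model,
            messages=[{"role": "user", "content": prompt}],
            **kwargs,
        )
        return response["choices"][0]["message"]["content"]

    return router_llm
```

The resulting callable would then be passed to the guard in place of `litellm.acompletion`, e.g. something like `guard(make_router_llm(router, "gpt-3.5-turbo"), prompt=...)` (again, per the linked guide, not tested here).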
Thanks for pointing me in the right direction and for the prompt response @CalebCourier, I greatly appreciate it!
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 14 days.
This issue was closed because it has been stalled for 14 days with no activity.
Issue
Guardrails does not detect the LiteLLM Router API as a LiteLLM-compatible LLM API.
Expected usage
Where router.acompletion is treated the same as litellm.acompletion.
Error
Currently, using Guards as described above results in an error.
I suspect changes are needed here:
https://github.com/guardrails-ai/guardrails/blob/5363c0793a6b155d7f72eb85c7504079bac61c2a/guardrails/llm_providers.py#L923