Open Clad3815 opened 4 days ago
```js
const response = await openai.moderations.create({
  input: text,
  // model: "omni-moderation-latest"
  model: "text-moderation-latest",
});
```
Can I see how you set this up in your config? That would help to repro.
I'm using the Docker version, and here is my config:

```yaml
- model_name: omni-moderation-latest
  litellm_params:
    model: openai/omni-moderation-latest
- model_name: text-moderation-stable
  litellm_params:
    model: openai/text-moderation-stable
- model_name: text-moderation-latest
  litellm_params:
    model: openai/text-moderation-latest
```
What happened?
When using the moderation endpoint from OpenAI, LiteLLM seems to forward the model name with the "openai/" prefix still attached (e.g. `openai/omni-moderation-latest`) to the OpenAI API, which rejects it.
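To illustrate what I'd expect: the proxy should strip the provider prefix from the `litellm_params.model` value before calling the upstream API. This is a minimal sketch of that behavior (a hypothetical helper for illustration, not LiteLLM's actual code):

```javascript
// Hypothetical illustration: strip the "openai/" provider prefix
// before forwarding the model name to the OpenAI API.
function stripProviderPrefix(model) {
  // "openai/omni-moderation-latest" -> "omni-moderation-latest"
  const idx = model.indexOf("/");
  return idx === -1 ? model : model.slice(idx + 1);
}

console.log(stripProviderPrefix("openai/omni-moderation-latest")); // "omni-moderation-latest"
console.log(stripProviderPrefix("text-moderation-latest")); // unchanged, no prefix present
```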
Relevant log output