Closed paul-gauthier closed 3 months ago
Acknowledging this.
received_args={'response_object': {'id': None, 'choices': None, 'created': None, 'model': None, 'object': None, 'service_tier': None, 'system_fingerprint': None, 'usage': None, 'error': {'message': '{"type":"error","error":{"type":"invalid_request_error","message":"Output blocked by content filtering policy"}}', 'code': 400}}, 'model_response_object': ModelResponse(id='chatcmpl-b88ce43a-7bfc-437c-b8cc-e90d59372cfb', choices=[Choices(finish_reason='stop', index=0, message=Message(content='default', role='assistant'))], created=1719376241, model='None/anthropic/claude-3.5-sonnet', object='chat.completion', system_fingerprint=None, usage=Usage()), 'response_type': 'completion', 'stream': False, 'start_time': None, 'end_time': None, 'hidden_params': None}
Looks like it's raised when a bad request error is received.
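A common way that pattern turns into an UnboundLocalError is when the error-handling path references a local variable whose assignment was skipped by the earlier exception (here, the AssertionError). The snippet below is a hypothetical minimal sketch of that failure mode, not litellm's actual code:

```python
def convert_to_model_response(raw_response):
    # Hypothetical sketch of the bug class, not litellm's implementation.
    try:
        # Fails when the provider returned an error body instead of
        # normal completion fields (choices is None).
        assert raw_response.get("choices") is not None, "no choices in response"
        model_response = {"content": raw_response["choices"][0]}
        return model_response
    except Exception:
        # Bug: if the assert fired, model_response was never assigned,
        # so referencing it here raises UnboundLocalError and masks
        # the original AssertionError.
        print("conversion failed, partial response:", model_response)
        raise


try:
    convert_to_model_response({"choices": None, "error": {"code": 400}})
except UnboundLocalError as exc:
    print(type(exc).__name__)  # UnboundLocalError
```

This matches the "multiple errors stacked up" symptom: Python reports the UnboundLocalError as occurring "during handling of" the original exception.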
@paul-gauthier also looked at the way openrouter returns errors - added a check to raise this correctly https://github.com/BerriAI/litellm/commit/ca04244a0ab76291a819f0f9a475f5e0706d0808
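The approach in that commit can be approximated as: inspect the raw provider response for an embedded error object before converting it, and raise immediately instead of falling through to the conversion code. The helper below is a hedged sketch against the error shape in the log above; `BadRequestError` here is a local stand-in class, not litellm's actual exception:

```python
class BadRequestError(Exception):
    """Local stand-in for a 400-level client error (not litellm's class)."""
    def __init__(self, message, status_code):
        super().__init__(message)
        self.status_code = status_code


def check_response_for_error(response_object):
    # Some providers (e.g. OpenRouter) can return an HTTP 200 body that
    # carries an "error" key instead of completion fields; surface it as
    # an exception rather than continuing to build a ModelResponse.
    error = (response_object or {}).get("error")
    if error:
        raise BadRequestError(
            message=error.get("message", "unknown provider error"),
            status_code=error.get("code", 400),
        )
    return response_object


# Example using the error shape from the log output above:
resp = {
    "id": None,
    "choices": None,
    "error": {
        "message": '{"type":"error","error":{"type":"invalid_request_error",'
                   '"message":"Output blocked by content filtering policy"}}',
        "code": 400,
    },
}
try:
    check_response_for_error(resp)
except BadRequestError as e:
    print(e.status_code)  # 400
```

Raising early like this keeps the downstream conversion code from touching variables that were never assigned.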
What happened?
With litellm==1.40.26 I am seeing an UnboundLocalError coming out of litellm.completion(). There are multiple errors stacked up here, but I think it starts from an AssertionError:
Relevant log output
Twitter / LinkedIn details
No response