litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=glm-4-plus
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers
I tried to add support for other models to the project, and this part of the code throws the error above. Where in the codebase is litellm being used, and how should I resolve this issue?
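For context, the error message itself hints at the fix: LiteLLM routes requests by a provider prefix in the model string, so a bare "glm-4-plus" cannot be routed, while "provider/model" can. A minimal sketch of that idea, assuming GLM is reachable as an OpenAI-compatible endpoint (the `api_base` URL and key below are placeholders, not confirmed values):

```python
def with_provider(model: str, provider: str = "openai") -> str:
    """Prefix a bare model name with a LiteLLM provider if it lacks one."""
    return model if "/" in model else f"{provider}/{model}"

model = with_provider("glm-4-plus")  # -> "openai/glm-4-plus"

# The actual call would then look roughly like this (requires
# `pip install litellm`, a valid key, and your real endpoint URL):
# import litellm
# resp = litellm.completion(
#     model=model,
#     api_base="https://your-glm-endpoint/v1",  # placeholder, not a real URL
#     api_key="YOUR_KEY",
#     messages=[{"role": "user", "content": "hello"}],
# )
```

This is only a sketch of the prefixing scheme the error describes; the provider list at https://docs.litellm.ai/docs/providers is the authority on which prefix your model actually needs.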