SakanaAI / AI-Scientist

The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑‍🔬

litellm.BadRequestError #151

Open NAMEKDSUN opened 1 week ago

NAMEKDSUN commented 1 week ago

```
litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=glm-4-plus
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers

Tokens: 10k sent, 0 received
```

I am trying to add support for additional models (here, glm-4-plus) to the project, and this part of the code throws the error above. Where in the code is litellm being called, and how should I resolve this?
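
From the docs linked in the error, it looks like litellm only routes a request when the model name carries a provider prefix (e.g. `huggingface/starcoder`). Below is a minimal sketch of what I think should work for GLM-4-Plus served through an OpenAI-compatible endpoint; the `api_base` URL and the `ZHIPUAI_API_KEY` variable are my own assumptions, not anything taken from this repo:

```python
import os
import litellm

# Sketch only: litellm needs a provider prefix in the model name
# (e.g. "huggingface/starcoder"); a bare "glm-4-plus" raises BadRequestError.
# For an OpenAI-compatible endpoint, the "openai/" prefix plus an explicit
# api_base is one way to route the request.
response = litellm.completion(
    model="openai/glm-4-plus",                        # provider prefix + model name
    api_base="https://open.bigmodel.cn/api/paas/v4",  # assumed Zhipu OpenAI-compatible endpoint
    api_key=os.environ["ZHIPUAI_API_KEY"],            # hypothetical env var name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If this is the right direction, I still need to know where AI-Scientist builds the model string that ends up in litellm (the "Tokens: 10k sent, 0 received" line looks like aider output, and aider calls litellm under the hood), so the prefix and api_base can be set in the right place.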