potpie-ai / potpie

AI agents for your codebase
https://potpie.ai
Apache License 2.0

"LLM Provider NOT provided" Error for Non-OpenAI Models in Litellm #156

Open dhirenmathur opened 1 week ago

dhirenmathur commented 1 week ago

Description: CrewAI uses the Litellm library to route LLM requests to the appropriate model. Currently, Litellm throws an error, "LLM Provider NOT provided", whenever a request is made for a non-OpenAI model, preventing proper routing to other providers.

Expected Behavior: Litellm should handle non-OpenAI models without throwing an error, allowing CrewAI to route requests to the correct LLM as specified.
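For context on why this error occurs: Litellm infers the provider from the model string, conventionally via a `<provider>/<model>` prefix (e.g. `anthropic/claude-3-opus`), while bare OpenAI-style names default to OpenAI. When the prefix is missing or unrecognized, the request cannot be routed and the "LLM Provider NOT provided" error is raised. A minimal sketch of that convention (illustrative only — this is not Litellm's actual implementation, and the provider list here is a partial assumption):

```python
# Illustrative sketch of litellm's "<provider>/<model>" routing convention.
# KNOWN_PROVIDERS is a partial, assumed list for demonstration.
KNOWN_PROVIDERS = {"openai", "anthropic", "ollama", "azure"}

def infer_provider(model: str) -> str:
    """Infer the LLM provider from a model string."""
    if "/" in model:
        provider = model.split("/", 1)[0]
        if provider in KNOWN_PROVIDERS:
            return provider
    # Bare OpenAI model names (e.g. "gpt-4o") default to openai.
    if model.startswith("gpt-"):
        return "openai"
    raise ValueError("LLM Provider NOT provided")

infer_provider("anthropic/claude-3-opus")  # -> "anthropic"
infer_provider("gpt-4o")                   # -> "openai"
```

If this is the failure mode, passing the model name with its provider prefix (rather than a bare model name) is the usual fix when CrewAI hands the request to Litellm.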

Steps to Reproduce:

palash018 commented 5 days ago

I will try to reproduce this issue, currently facing some issues regarding running code on wsl.

palash018 commented 4 days ago

[screenshot: connection error]

I'm encountering a connection error after running `./start.sh`. It seems that `POSTGRES_SERVER=postgresql://postgres:mysecretpassword@host.docker.internal:5432/momentum` is not resolving to the proper IP in WSL. Using the non-WSL version of that variable resolves the host issue, but then I get an incorrect-password error.
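One quick way to isolate whether the host or the password is the problem is to test the same DSN with the host swapped out. A small sketch (the DSN is the one from the comment above; the `swap_host` helper is hypothetical, not part of the project):

```python
from urllib.parse import urlsplit, urlunsplit

# DSN from the POSTGRES_SERVER variable quoted above.
dsn = "postgresql://postgres:mysecretpassword@host.docker.internal:5432/momentum"

def swap_host(dsn: str, new_host: str) -> str:
    """Replace the host in a postgres DSN, keeping credentials and port."""
    parts = urlsplit(dsn)
    userinfo, _, hostport = parts.netloc.rpartition("@")
    host, _, port = hostport.partition(":")
    hostpart = f"{new_host}:{port}" if port else new_host
    netloc = f"{userinfo}@{hostpart}" if userinfo else hostpart
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

swap_host(dsn, "localhost")
# -> "postgresql://postgres:mysecretpassword@localhost:5432/momentum"
```

If the `localhost` variant connects (or fails differently), the original error is a WSL name-resolution problem with `host.docker.internal` rather than a credentials problem.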

However, when I connect with psql from the shell and force it to prompt for a password, it connects even if I type a random password.
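For what it's worth, "any password works" in psql is the behaviour of PostgreSQL's `trust` authentication method. If the local `pg_hba.conf` contains rules like the following (an assumption for illustration — the actual file has not been checked), socket connections from psql skip the password check while TCP clients such as the app hit a password-verified rule, which would explain seeing both symptoms at once:

```
# pg_hba.conf (illustrative, not confirmed from this setup)
# TYPE  DATABASE  USER  ADDRESS        METHOD
local   all       all                  trust           # psql via local socket: no password check
host    all       all   0.0.0.0/0      scram-sha-256   # TCP clients: password required
```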

is this a known issue @dhirenmathur?

dhirenmathur commented 3 days ago

Not a known issue @palash018. Is this a blocker, or were you able to proceed with the task?

palash018 commented 3 days ago

@dhirenmathur It is actually a blocker, since I cannot run the code at all; the issue appears during `./start.sh` itself.

dhirenmathur commented 3 days ago

Got it, I'll try to reproduce and get back to you.