
🐚 OpenDevin: Code Less, Make More
https://docs.all-hands.dev/
MIT License

[Feature]: Allow specification of VertexAI params through config.toml #2723

Open neubig opened 4 days ago

neubig commented 4 days ago

What problem or use case are you trying to solve?

I would like to be able to easily use VertexAI to access Gemini or Anthropic.

Describe the UX of the solution you'd like

Currently I believe it is possible to access VertexAI through environment variables, as specified here: https://docs.all-hands.dev/modules/usage/llms/googleLLMs
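For reference, the environment-variable route looks roughly like the following. This is a hedged sketch: the variable names are taken from LiteLLM's VertexAI conventions, and the project id and credentials path are placeholders.

```shell
# Sketch of the current env-var approach (names per LiteLLM's VertexAI docs;
# values below are placeholders, not real resources).
export VERTEXAI_PROJECT="my-gcp-project"
export VERTEXAI_LOCATION="us-central1"
# Standard Google auth variable pointing at a service-account key file.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```

With these set, LiteLLM-backed model strings such as `vertex_ai/...` can pick up the project and location without further configuration.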

However, these parameters cannot currently be set through config.toml, even though many other parameters (including llm_api_key) can be specified there. It would be nice if the VertexAI settings could be specified through the config file as well!
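A hypothetical sketch of what this could look like in config.toml (none of these keys exist yet; the names are illustrative only, loosely following LiteLLM's parameter naming):

```toml
# Hypothetical config.toml sketch -- these keys are a proposal, not existing options.
[llm]
model = "vertex_ai/gemini-1.5-pro"
vertex_project = "my-gcp-project"
vertex_location = "us-central1"
```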

tobitege commented 4 days ago

Didn't litellm just add Vertex support the other day? Maybe it'd already work now?

neubig commented 4 days ago

LiteLLM has supported VertexAI for a while now, and I think the doc I linked is basically relying on LiteLLM's environment variables. But it'd be nice to also be able to set them in config.toml.

tobitege commented 3 days ago

> LiteLLM has supported VertexAI for a while now, and I think that the doc I linked is basically relying on LiteLLM's environment variables (probably). But It'd be nice to also be able to use config.toml to set them as well.

Oh, I didn't connect the dots, sorry. litellm recently added a related JSON feature to their Vertex support.