neubig opened 4 days ago
Didn't litellm just add Vertex support the other day? Maybe it'd already work now?
LiteLLM has supported VertexAI for a while now, and I think the doc I linked basically relies on LiteLLM's environment variables. But it'd be nice to be able to set them through config.toml as well.
Oh, I didn't connect the dots, sorry. LiteLLM recently added a related JSON feature to their Vertex support.
What problem or use case are you trying to solve?
I would like to be able to easily use VertexAI to access Gemini or Anthropic.
Describe the UX of the solution you'd like
Currently I believe it is possible to access VertexAI through environment variables, as specified here: https://docs.all-hands.dev/modules/usage/llms/googleLLMs
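For reference, the environment-variable route presumably looks something like this. The exact variable names are my assumption based on common GCP/LiteLLM conventions; `GOOGLE_APPLICATION_CREDENTIALS` is the standard Google Cloud credentials variable, while `VERTEXAI_PROJECT` and `VERTEXAI_LOCATION` are what I believe LiteLLM reads:

```shell
# Standard GCP variable: path to a service-account key file.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/service-account.json"
# Project and region for Vertex AI (names assumed from LiteLLM's conventions).
export VERTEXAI_PROJECT="my-gcp-project"
export VERTEXAI_LOCATION="us-central1"
```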
However, it is not possible to configure them through config.toml, despite the fact that many other parameters (including `llm_api_key`) can be specified there. It would be nice if this could be specified through the config file as well!
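As a sketch of what this might look like, the VertexAI settings could live alongside the existing LLM options. The key names below are hypothetical, since this is exactly the feature that doesn't exist yet:

```toml
[llm]
model = "vertex_ai/gemini-1.5-pro"
# Hypothetical keys mirroring the environment variables above:
vertex_project = "my-gcp-project"
vertex_location = "us-central1"
google_application_credentials = "/path/to/service-account.json"
```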