anarchy-ai / LLM-VM


Vertex AI (Google) API based completion #180

Open INF800 opened 1 year ago

INF800 commented 1 year ago

Two model types are available: a text completion model and a chat model, both based on PaLM.

Docs: https://cloud.google.com/vertex-ai/docs/generative-ai/start/quickstarts/api-quickstart#summarization. The google-cloud-aiplatform package on PyPI provides the necessary helpers.

This would also ensure llm-vm does not depend on OpenAI alone.
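For reference, a minimal sketch of calling both PaLM models through google-cloud-aiplatform, following the quickstart linked above. The project ID, location, model versions, and sampling parameters are placeholders, not values from llm-vm:

```python
# Sketch only: assumes a recent google-cloud-aiplatform is installed and Google
# application-default credentials are configured for a billing-enabled project.
import vertexai
from vertexai.language_models import ChatModel, TextGenerationModel

vertexai.init(project="your-gcp-project", location="us-central1")  # placeholders

# PaLM text completion model
text_model = TextGenerationModel.from_pretrained("text-bison@001")
completion = text_model.predict(
    "Summarize the following text: ...",
    temperature=0.2,
    max_output_tokens=256,
)
print(completion.text)

# PaLM chat model
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(context="You are a helpful assistant.")
reply = chat.send_message("What is Vertex AI?")
print(reply.text)
```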

mmirman commented 1 year ago

We do use local models, but I like this; yes, we should support more endpoints.

BlackReaper333 commented 1 year ago

Vertex AI requires billing to be enabled on the user end. How do we plan to handle that?

sareensumanau commented 1 year ago

Happy to pick this up. If I understand correctly, we need to define call_vertexai in utils under the rebel agent, allow the user to supply either their OpenAI auth or Google auth, and make the corresponding changes in the affected files.
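To make that concrete, here is a rough sketch of what call_vertexai could look like. This helper does not exist in llm-vm yet; the name comes from the comment above, and the signature and defaults are assumptions for illustration:

```python
# Hypothetical sketch -- call_vertexai is not yet defined in llm-vm; the name,
# signature, and defaults below are assumptions for illustration only.
import vertexai
from vertexai.language_models import TextGenerationModel

def call_vertexai(prompt: str,
                  project: str,
                  location: str = "us-central1",
                  model_name: str = "text-bison@001",
                  **generation_params) -> str:
    """Run a text completion against Vertex AI and return the generated text."""
    # Uses Google application-default credentials rather than an OpenAI key.
    vertexai.init(project=project, location=location)
    model = TextGenerationModel.from_pretrained(model_name)
    response = model.predict(prompt, **generation_params)
    return response.text
```

On auth, the Google side could rely on application-default credentials (for example via `gcloud auth application-default login`), kept separate from the existing OpenAI key path; since those credentials point at the user's own GCP project, billing would stay on the user end, as raised above.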

Are any other changes required for this feature? @mmirman