anarchy-ai / LLM-VM

irresponsible innovation. Try now at https://chat.dev/
https://anarchy.ai/
MIT License

Vertex AI (Google) API based completion #180

INF800 opened this issue 10 months ago (status: Open)

INF800 commented 10 months ago

Two model types are available: a text completion model and a chat model, both based on PaLM.

Docs: https://cloud.google.com/vertex-ai/docs/generative-ai/start/quickstarts/api-quickstart#summarization. The google-cloud-aiplatform package from PyPI provides the necessary helpers.
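For reference, a minimal sketch of what the quickstart covers, using the vertexai namespace shipped with google-cloud-aiplatform. The project ID, model names, and parameters here are illustrative examples from the PaLM quickstart, not anything already in llm-vm:

```python
import vertexai
from vertexai.language_models import TextGenerationModel, ChatModel

# Assumes Google Cloud credentials are already configured,
# e.g. via `gcloud auth application-default login`.
vertexai.init(project="your-gcp-project", location="us-central1")

# Text completion model (PaLM)
text_model = TextGenerationModel.from_pretrained("text-bison@001")
completion = text_model.predict(
    "Summarize: Vertex AI exposes PaLM-based text and chat models.",
    temperature=0.2,
    max_output_tokens=256,
)
print(completion.text)

# Chat model (PaLM)
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat()
print(chat.send_message("What endpoints does Vertex AI offer?").text)
```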

This would also ensure llm-vm is not relying solely on OpenAI.

mmirman commented 10 months ago

We do use local models, but I like this. Yes, we should work with more endpoints.

BlackReaper333 commented 10 months ago

Vertex AI requires billing to be enabled on the user end. How do we plan to handle that?

sareensumanau commented 9 months ago

Happy to pick this up. If I understand correctly, we need to define call_vertexai in utils under the REBEL agent, allow the user to provide either OpenAI auth or Google auth, and make the corresponding changes in the affected files, roughly along the lines of the sketch below.
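A rough sketch of what such a helper might look like. The name call_vertexai comes from this thread, but the signature, module placement, and defaults are assumptions rather than existing llm-vm code:

```python
# Hypothetical helper for llm-vm's utils; names and defaults are assumptions.
import vertexai
from vertexai.language_models import TextGenerationModel


def call_vertexai(prompt, project, location="us-central1",
                  model_name="text-bison@001", **params):
    """Return a completion from a Vertex AI PaLM text model."""
    # Uses Google Application Default Credentials instead of an OpenAI key,
    # so the caller authenticates with `gcloud auth application-default login`
    # or a service-account JSON rather than passing an API key string.
    vertexai.init(project=project, location=location)
    model = TextGenerationModel.from_pretrained(model_name)
    response = model.predict(prompt, **params)
    return response.text
```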

Are any other changes required for this feature? @mmirman