AbanteAI / rawdog

Generate and auto-execute Python scripts in the CLI
Apache License 2.0

How to replace OpenAI model with Google Gemini Pro #15

Closed Abhay-404 closed 9 months ago

ancs21 commented 9 months ago

hi @Abhay-404

LiteLLM doesn't support Gemini Pro in `litellm.completion`, so you need to use proxy mode, which means running a server.

So, inspired by this repo, I created another repo here (https://github.com/ancs21/dragala) that supports the Google Gemini Pro model by default. Google provides the gemini-pro model for free, so you can try it.

Thanks.

tikendraw commented 9 months ago

Use litellm

  1. Run a local Gemini server posing as OpenAI.

     Note: export your API key first:

     export GEMINI_API_KEY=AIxxxxxx-xxxxxxxxxxxxxxxxxxxxxxx

     then:

     litellm --model gemini/gemini-pro --port 8080 --debug
  2. Edit the .rawdog/config.yaml accordingly:

     llm_api_key: AIxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxx
     llm_base_url: http://0.0.0.0:8080
     llm_custom_provider: openai
     llm_model: gemini-pro

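Once the proxy is running, rawdog (or anything else) talks to it through the standard OpenAI-compatible chat completions API that LiteLLM's proxy mimics. A minimal sketch of how you might verify the proxy by hand, assuming it is listening on http://0.0.0.0:8080 as configured above (`build_request` is a hypothetical helper, not part of rawdog or LiteLLM):

```python
import json
import urllib.request


def build_request(prompt, model="gemini-pro", base_url="http://0.0.0.0:8080"):
    """Build an OpenAI-style chat completion request aimed at the local proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # The proxy forwards auth to Gemini; use your GEMINI_API_KEY here.
            "Authorization": "Bearer AIxxxxxx",
        },
    )
    return req, payload


if __name__ == "__main__":
    # Requires the litellm proxy from step 1 to be running locally.
    req, _ = build_request("Say hello")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

If this round-trips successfully, the config in step 2 should work for rawdog as well, since it points at the same base URL.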
jakethekoenig commented 9 months ago

I updated the README, so hopefully it will be easier for future users.