jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License
6.56k stars · 519 forks

Add support for Gemini API #441

Closed — AmgadHasan closed this 4 weeks ago

AmgadHasan commented 4 months ago

The new Gemini API introduced support for function calling: you define a set of functions with their expected arguments and pass them in the tools argument.

Can we add Gemini support to instructor so it can be used with the Gemini Pro models and, later, Gemini Ultra?
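For context on what "support" would involve: instructor works by turning Pydantic models into an OpenAI-style tool definition. A minimal sketch of that shape, using plain dicts (illustrative only, not instructor's actual internals):

```python
# OpenAI-style tool definition for the same "add" function used in the
# Gemini examples below. Supporting Gemini essentially means emitting an
# equivalent structure in Gemini's format instead of this one.
add_tool = {
    "type": "function",
    "function": {
        "name": "add",
        "description": "Returns the sum of two numbers.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number"},
                "b": {"type": "number"},
            },
            "required": ["a", "b"],
        },
    },
}
```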

jxnl commented 4 months ago

can you link docs and a more complete example?

AmgadHasan commented 4 months ago

> can you link docs and a more complete example?

@jxnl Sure. Just as a heads-up, there are two versions of the Gemini API: the Google AI dev API and the GCP (Vertex AI) one.

Here are the docs for function calling using the Google AI dev API:

https://ai.google.dev/tutorials/function_calling_python_quickstart

A quick example:

import google.generativeai as genai
import google.ai.generativelanguage as glm

# Declare the tools (functions) the model is allowed to call
calculator = glm.Tool(
    function_declarations=[
      glm.FunctionDeclaration(
        name='add',
        description="Returns the sum of two numbers.",
        parameters=glm.Schema(
            type=glm.Type.OBJECT,
            properties={
                'a': glm.Schema(type=glm.Type.NUMBER),
                'b': glm.Schema(type=glm.Type.NUMBER)
            },
            required=['a', 'b']
        )
      ),
      glm.FunctionDeclaration(
        name='multiply',
        description="Returns the product of two numbers.",
        parameters=glm.Schema(
            type=glm.Type.OBJECT,
            properties={
                'a': glm.Schema(type=glm.Type.NUMBER),
                'b': glm.Schema(type=glm.Type.NUMBER)
            },
            required=['a', 'b']
        )
      )
    ])

# Note: tools are bound at model-construction time, not per request
model = genai.GenerativeModel('gemini-pro', tools=[calculator])
chat = model.start_chat()

a, b = 2312371, 234234
response = chat.send_message(f"What's {a} X {b}?")
response.candidates
[index: 0
content {
  parts {
    function_call {
      name: "multiply"
      args {
        fields {
          key: "b"
          value {
            number_value: 234234
          }
        }
        fields {
          key: "a"
          value {
            number_value: 2312371
          }
        }
      }
    }
  }
  role: "model"
}
finish_reason: STOP
] 
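The args payload above is a protobuf Struct, not a plain dict. A hedged sketch of flattening it into normal Python values, with the proto stood in by nested dicts mirroring the printed structure (the real object would need converting via the proto library first):

```python
# Flatten a Struct-shaped function_call args payload into a plain dict.
# Each "value" holds exactly one typed member (e.g. number_value).
def flatten_args(args: dict) -> dict:
    out = {}
    for field in args["fields"]:
        out[field["key"]] = next(iter(field["value"].values()))
    return out

# Dict stand-in for the proto response printed above
call = {
    "name": "multiply",
    "args": {
        "fields": [
            {"key": "b", "value": {"number_value": 234234}},
            {"key": "a", "value": {"number_value": 2312371}},
        ]
    },
}
print(flatten_args(call["args"]))  # {'b': 234234, 'a': 2312371}
```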
jxnl commented 4 months ago

lol WHY WOULD THEY DO THIS

AmgadHasan commented 4 months ago

> lol WHY WOULD THEY DO THIS

I think they want indie developers to hack with their Google AI dev API while also offering GCP (Vertex AI) to their enterprise customers.

Just like OpenAI and Azure.

AmgadHasan commented 4 months ago

@jxnl

So is there any hope for this?

jxnl commented 4 months ago

Def hope!

The nuance is that the API is slightly different: model.start_chat() and chat.send_message().

jxnl commented 4 months ago

So I want to think about how to give a good matching experience.

AmgadHasan commented 4 months ago

Yes, it's definitely annoying. You have to specify the tools when you create the model object, not when you actually send the prompt.

Why even do this? LLMs are stateless; there's no point in defining the functions when creating the model object, unless they're doing something completely different from OpenAI under the hood.

AmgadHasan commented 4 months ago

Okay, so after taking a second deep dive into their not-so-great docs, I think there's a different way of doing this:

# Create a client
from google.ai import generativelanguage as glm

client = glm.GenerativeServiceClient(
    client_options={'api_key': GOOGLE_API_KEY})  # GOOGLE_API_KEY defined elsewhere

# Create function tools
my_tool = glm.Tool(
    function_declarations=[
      glm.FunctionDeclaration(
        name='add',
        description="Returns the sum of two numbers.",
        parameters=glm.Schema(
            type=glm.Type.OBJECT,
            properties={
                'a': glm.Schema(type=glm.Type.NUMBER),
                'b': glm.Schema(type=glm.Type.NUMBER)
            },
            required=['a','b']
        )
      ),
      glm.FunctionDeclaration(
        name='multiply',
        description="Returns the product of two numbers.",
        parameters=glm.Schema(
            type=glm.Type.OBJECT,
            properties={
                'a': glm.Schema(type=glm.Type.NUMBER),
                'b': glm.Schema(type=glm.Type.NUMBER)
            },
            required=['a','b']
        )
      )
    ])

request = {
    "model": 'models/gemini-1.0-pro-001',
    "contents": [{"parts": [{"text": "Send an email to my friend Oliver wishing them a happy birthday"}], "role": "user"}],
    "tools": [my_tool],

}
response = client.generate_content(request=request)

This is somewhat similar to the conventional way of doing function calling using OpenAI's client. @jxnl
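If instructor adds support, the bulk of the work is likely translating the JSON Schema that Pydantic emits into Gemini's glm.Schema tree. A rough sketch of that mapping using plain dicts, under the assumption (per the docs linked above) that Gemini's types are uppercase variants of the JSON Schema type names; the helper name is hypothetical:

```python
# Hypothetical translator: JSON Schema (what Pydantic emits) -> the
# nested dict shape of glm.Schema. Only the basic cases are handled.
JSON_TO_GLM_TYPE = {
    "object": "OBJECT",
    "string": "STRING",
    "number": "NUMBER",
    "integer": "INTEGER",
    "boolean": "BOOLEAN",
    "array": "ARRAY",
}

def json_schema_to_glm(schema: dict) -> dict:
    out = {"type": JSON_TO_GLM_TYPE[schema["type"]]}
    if "description" in schema:
        out["description"] = schema["description"]
    if schema["type"] == "object":
        out["properties"] = {
            k: json_schema_to_glm(v)
            for k, v in schema.get("properties", {}).items()
        }
        if "required" in schema:
            out["required"] = schema["required"]
    elif schema["type"] == "array":
        out["items"] = json_schema_to_glm(schema["items"])
    return out

# The "add" parameters from the examples above, as Pydantic would emit them:
add_params = {
    "type": "object",
    "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
    "required": ["a", "b"],
}
print(json_schema_to_glm(add_params))
```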

ankrgyl commented 4 months ago

@AmgadHasan I maintain https://github.com/braintrustdata/braintrust-proxy which allows you to access Gemini models through the OpenAI format. We haven't yet translated the Gemini tool-call syntax over, but based on your code snippets, my guess is that it's just sending JSON Schema and should be easy to do.

Want to collaborate on that? Then, you could just set OPENAI_BASE_URL to "https://braintrustproxy.com/v1" and it'll work out of the box with instructor

davhin commented 3 months ago

what is the current state on this? happy to contribute!

jxnl commented 3 months ago

No work done yet; would love a contrib.

oliverbj commented 2 months ago

Hi!

What's the update on this? :)

jxnl commented 2 months ago

> Hi!
>
> What's the update on this? :)

I won't be working on this anytime soon, so it'll likely have to come from another contributor, unless you go through litellm or the braintrust proxy.

M-Gonzalo commented 1 month ago

Can we get Gemini to implement itself here? I'm only half-joking, BTW, lol. I read it's got a 1,000,000-token context window, so it can probably read the whole project in one go, and it understands Elixir...

ssonal commented 1 month ago

Managed to implement Gemini support here. Not 100% compatible with all of instructor's concepts given the mismatch in API design, but it's a start. Would love some feedback.