google-gemini / cookbook

Examples and guides for using the Gemini API
https://ai.google.dev/gemini-api/docs
Apache License 2.0

Streaming Function Calls #47

Open rodrigoGA opened 6 months ago

rodrigoGA commented 6 months ago

I think an example of function calls using streaming would be helpful, as I believe it's the most common case for a chatbot. I've put together a test script, but I'm not sure it's the correct way to do it, and several questions have come up.


import google.generativeai as genai
import google.ai.generativelanguage as glm
from google.protobuf.struct_pb2 import Struct

# find_movies, find_theaters, get_showtimes and call_function are defined
# as in the cookbook's function calling example.
functions = {
    'find_movies': find_movies,
    'find_theaters': find_theaters,
    'get_showtimes': get_showtimes,
}
instruction = "You will talk just like Yoda in Star Wars."
model = genai.GenerativeModel(
    "models/gemini-1.5-pro-latest",
    system_instruction=instruction,
    generation_config=genai.GenerationConfig(temperature=0),
    tools=functions.values()
)

def generate_response(messages):
  functions_to_call = []
  complete_response = ''

  response = model.generate_content(messages, stream=True)
  for chunk in response:
    part = chunk.candidates[0].content.parts[0]
    if part.function_call:
      functions_to_call.append(part.function_call)

    if part.text:
      print('response part:', chunk.text)
      complete_response = complete_response + chunk.text

  if len(complete_response) > 0:
    # The accumulated text is the model's turn, so store it under the 'model' role.
    messages.append({'role': 'model', 'parts': [complete_response]})

  if len(functions_to_call) > 0:
    # Record the model's function_call turn once, then append one function
    # response per call before asking the model to continue.
    messages.append({'role': 'model', 'parts': response.candidates[0].content.parts})
    for function_call in functions_to_call:
      result = call_function(function_call, functions)
      s = Struct()
      s.update({'result': result})
      # Update this after https://github.com/google/generative-ai-python/issues/243
      function_response = glm.Part(function_response=glm.FunctionResponse(name=function_call.name, response=s))
      messages.append({'role': 'user', 'parts': [function_response]})
    generate_response(messages)

messages = []

while True:
  print("_"*80)
  user_input = input()
  messages.append({'role':'user', 'parts': [user_input]},)
  generate_response(messages)

The questions are as follows:
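For comparison, when streaming is not required the SDK can run the same tool round trip through a chat session with automatic function calling. A minimal sketch, assuming the same find_movies / find_theaters / get_showtimes functions and the model configured above (the prompt is only illustrative):

# Minimal sketch: automatic function calling handles the call/response loop itself,
# but only for non-streaming chat turns.
chat = model.start_chat(enable_automatic_function_calling=True)
reply = chat.send_message("Which theaters in Mountain View show the Barbie movie?")
print(reply.text)

With this helper the SDK executes the Python functions and returns only the final text, so it does not cover the streaming case discussed here.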

casbra13 commented 3 months ago

I would also be interested in this. In my testing, streaming the structured output of a function call is not possible. Did I test incorrectly, or is it simply not supported at the moment? Are there any plans to support streaming of function calls in the future?
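For what it's worth, when a function call does appear in a streamed response it seems to arrive as one complete structured part inside a chunk, rather than with its arguments streamed incrementally. A minimal sketch for inspecting every part of every chunk, assuming the model and messages from the script above:

# Minimal sketch: walk all parts of each streamed chunk instead of only parts[0].
response = model.generate_content(messages, stream=True)
for chunk in response:
  for part in chunk.candidates[0].content.parts:
    if part.function_call:
      # The call arrives as a structured part (name + args), not as streamed text.
      print('function_call:', part.function_call.name, dict(part.function_call.args))
    elif part.text:
      print(part.text, end='')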