instructor-ai / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License
8.33k stars · 662 forks

Streaming for Parallel tool execution #922

Open MayankShah1 opened 3 months ago

MayankShah1 commented 3 months ago

**Is your feature request related to a problem? Please describe.**
I am interested in streaming responses for two parallel tool calls.

**Describe the solution you'd like**
I am building a tool that compares summary information across two entities. Data for the two entities is retrieved in parallel, and each retrieval should be streamed; the final comparison across the retrieved information should then be streamed as well. Streaming is used to keep UI latency to a minimum.
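For context, the union response model used below can be exercised on its own with plain Pydantic. This is a minimal sketch (the sample payloads are made up) of how a tool-call payload validates against `Weather | GoogleSearch`, which is the shape `PARALLEL_TOOLS` mode yields per call:

```python
from typing import Literal, Union

from pydantic import BaseModel, TypeAdapter


class Weather(BaseModel):
    location: str
    units: Literal["imperial", "metric"]


class GoogleSearch(BaseModel):
    query: str


# Validate hypothetical tool-call arguments against the union;
# each parallel tool call resolves to whichever model matches.
adapter = TypeAdapter(Union[Weather, GoogleSearch])
calls = [
    adapter.validate_python({"location": "Toronto", "units": "metric"}),
    adapter.validate_python({"query": "super bowl winner"}),
]
```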

**Describe alternatives you've considered**
I have tried adapting the existing parallel-tools example:

```python
from __future__ import annotations

from typing import Iterable, Literal

import instructor
from openai import AzureOpenAI
from pydantic import BaseModel


class Weather(BaseModel):
    location: str
    units: Literal["imperial", "metric"]


class GoogleSearch(BaseModel):
    query: str


client = AzureOpenAI(
    azure_endpoint=...,
    api_key=...,
    api_version=...,
)

client = instructor.from_openai(
    client,
    mode=instructor.Mode.PARALLEL_TOOLS,
)

function_calls = client.chat.completions.create(
    model=...,
    messages=[
        {"role": "system", "content": "You must always use tools"},
        {
            "role": "user",
            "content": "What is the weather in toronto and dallas and who won the super bowl?",
        },
    ],
    response_model=Iterable[Weather | GoogleSearch],
    stream=True,
)

for fc in function_calls:
    print(fc)
#> location='Toronto' units='metric'
#> location='Dallas' units='imperial'
#> query='super bowl winner'
```

**Additional context**
Error: `AssertionError: stream=True is not supported when using PARALLEL_TOOLS mode`
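Until `stream=True` is supported alongside `PARALLEL_TOOLS`, one workaround is to issue each tool call as its own streaming request and interleave the chunks client-side as they arrive. The sketch below shows only the concurrency pattern with plain `asyncio`; `stream_entity` is a hypothetical stand-in for a real streaming request (e.g. one partial-streaming call per entity), not instructor's API:

```python
import asyncio
from typing import AsyncIterator, Optional


async def stream_entity(name: str) -> AsyncIterator[str]:
    # Hypothetical stand-in for one streaming LLM request per entity.
    for part in (f"{name}: part 1", f"{name}: part 2"):
        await asyncio.sleep(0)  # simulate network latency between chunks
        yield part


async def merge_streams(*streams: AsyncIterator[str]) -> list[str]:
    # Drain each stream in its own task and funnel chunks through a
    # shared queue, so the consumer sees them as soon as they arrive.
    queue: asyncio.Queue[Optional[str]] = asyncio.Queue()

    async def drain(stream: AsyncIterator[str]) -> None:
        async for chunk in stream:
            await queue.put(chunk)
        await queue.put(None)  # sentinel: this stream is exhausted

    tasks = [asyncio.create_task(drain(s)) for s in streams]
    chunks: list[str] = []
    finished = 0
    while finished < len(tasks):
        item = await queue.get()
        if item is None:
            finished += 1
        else:
            chunks.append(item)
    return chunks


chunks = asyncio.run(
    merge_streams(stream_entity("entity_a"), stream_entity("entity_b"))
)
```

The final comparison call can then be issued as a third (streaming) request once both entity streams complete.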

jxnl commented 2 months ago

Not a priority; would be open to a PR.