openai / openai-python

The official Python library for the OpenAI API
https://pypi.org/project/openai/
Apache License 2.0

Stream Parameter Issue in Async Thread Creation #1252

Closed JLJuradoDeloitte closed 6 months ago

JLJuradoDeloitte commented 6 months ago

Confirm this is an issue with the Python library and not an underlying OpenAI API

Describe the bug

The bug arises when attempting to create a thread and run it with streaming in the OpenAI Python library using asynchronous functions. The `stream` parameter sent by the SDK is unrecognized by the API, which responds with a `BadRequestError` (error code 400) indicating that `stream` is not a valid parameter. As a result, the thread creation process fails and no response is streamed.

To Reproduce

To reproduce the behavior:

  1. Set up the OpenAI Python library and install all necessary dependencies.
  2. Copy the provided code into a Python script or notebook.
  3. Ensure that the environment variables API_VERSION, AZURE_ENDPOINT, and AZURE_OPENAI_KEY are set to appropriate values.
  4. Execute the script or notebook. It will attempt to create a thread with the specified messages and instructions, then run it with streaming.
  5. A BadRequestError with error code 400 is raised; the traceback indicates that the `stream` parameter is unrecognized, and the thread creation process fails.

Code snippets

from __future__ import annotations

from typing_extensions import override

import openai
from openai import AssistantEventHandler
from openai.types.beta import AssistantStreamEvent
from openai.types.beta.threads import Text, TextDelta
from openai.types.beta.threads.runs import RunStep, RunStepDelta
import os
from dotenv import load_dotenv

load_dotenv()

class EventHandler(AssistantEventHandler):
    @override
    def on_event(self, event: AssistantStreamEvent) -> None:
        if event.event == "thread.run.step.created":
            details = event.data.step_details
            if details.type == "tool_calls":
                print("Generating code to interpret:\n\n")
        elif event.event == "thread.message.created":
            print("\nResponse:\n")

    @override
    def on_text_delta(self, delta: TextDelta, snapshot: Text) -> None:
        print(delta.value, end="", flush=True)

    @override
    def on_run_step_done(self, run_step: RunStep) -> None:
        details = run_step.step_details
        if details.type == "tool_calls":
            for tool in details.tool_calls:
                if tool.type == "code_interpreter":
                    print("\n\nExecuting code...")

    @override
    def on_run_step_delta(self, delta: RunStepDelta, snapshot: RunStep) -> None:
        details = delta.step_details
        if details is not None and details.type == "tool_calls":
            for tool in details.tool_calls or []:
                if tool.type == "code_interpreter" and tool.code_interpreter and tool.code_interpreter.input:
                    print(tool.code_interpreter.input, end="", flush=True)

async def main() -> None:
    client = openai.AsyncAzureOpenAI(
        api_version=os.getenv("API_VERSION"),
        azure_endpoint=os.getenv("AZURE_ENDPOINT"),
        api_key=os.getenv("AZURE_OPENAI_KEY"),
    )

    assistant = await client.beta.assistants.create(
        name="Math Tutor",
        instructions="You are a personal math tutor. Write and run code to answer math questions.",
        tools=[{"type": "code_interpreter"}],
        model="gpt-4-1106-preview",
    )

    try:
        question = "I need to solve the equation `3x + 11 = 14`. Can you help me?"

        thread = await client.beta.threads.create(
            messages=[
                {
                    "role": "user",
                    "content": question,
                },
            ]
        )
        print(f"Question: {question}\n")

        async with client.beta.threads.runs.create_and_stream(
            thread_id=thread.id,
            assistant_id=assistant.id,
            instructions="Please address the user as Jane Doe. The user has a premium account.",
            event_handler=EventHandler(),
        ) as stream:
            await stream.until_done()  # must be awaited with the async client
            print()
    finally:
        await client.beta.assistants.delete(assistant.id)  # must be awaited with the async client

import asyncio

asyncio.run(main())  # a bare `await main()` at top level only works inside a notebook

OS

windows

Python version

Python v3.11.2

Library version

openai v1.14.1

kristapratico commented 6 months ago

Hey @JLJuradoDeloitte, the Azure OpenAI service does not support streaming with assistants yet. This is why you're seeing the 400 and the "stream parameter is unrecognized" error. You can follow the "what's new" page on the MS Learn docs for any updates on its support: https://learn.microsoft.com/en-us/azure/ai-services/openai/whats-new
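
Until streaming is supported, one non-streaming workaround is to create the run without `create_and_stream` and poll it until it reaches a terminal status. A minimal sketch of such a polling helper is below; `wait_for_run` and `TERMINAL_STATUSES` are illustrative names, not part of the SDK, and the commented usage assumes the same `client`, `thread`, and `assistant` objects as in the snippet above:

```python
import asyncio

# Statuses at which a run has stopped making progress. If your assistant uses
# tools whose outputs you must submit yourself, also handle "requires_action".
TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired"}

async def wait_for_run(retrieve, interval: float = 1.0, max_polls: int = 120):
    """Call the async `retrieve()` callable until the run it returns
    reaches a terminal status, sleeping `interval` seconds between polls."""
    for _ in range(max_polls):
        run = await retrieve()
        if run.status in TERMINAL_STATUSES:
            return run
        await asyncio.sleep(interval)
    raise TimeoutError("run did not reach a terminal status in time")

# Rough usage with the async client from the snippet above:
#
#   run = await client.beta.threads.runs.create(
#       thread_id=thread.id, assistant_id=assistant.id
#   )
#   run = await wait_for_run(
#       lambda: client.beta.threads.runs.retrieve(
#           thread_id=thread.id, run_id=run.id
#       )
#   )
#   messages = await client.beta.threads.messages.list(thread_id=thread.id)
```

This trades the token-by-token output of streaming for a single response once the run finishes, but it works against endpoints that reject the `stream` parameter.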