Closed — anuj203 closed this issue 8 months ago
You say it's not working, but there's no stacktrace. What happens if you try this?

```python
async for chunk in response:
    print(chunk)
```

And does it work if you remove `stream=True`?
It works when we use `model_dump_json` on each chunk instead of printing the chunk directly with `stream=True`:

```python
async for chunk in response:
    print(chunk.model_dump_json(indent=2))
```

You can close the issue. Thanks!
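For anyone landing here later: with `stream=True` the awaited call returns an async stream of chunks that must be iterated with `async for` — the stream object itself is not a single completion. A minimal self-contained sketch of that pattern, where `FakeChunk` and `fake_stream` are hypothetical stand-ins for the real chunk objects and the streamed `create(...)` call:

```python
import asyncio
import json

# Hypothetical stand-in for the chunk objects a streamed completion yields;
# the real chunks are pydantic models, so they expose model_dump_json().
class FakeChunk:
    def __init__(self, content):
        self.content = content

    def model_dump_json(self, indent=None):
        return json.dumps({"content": self.content}, indent=indent)

async def fake_stream():
    # Stands in for: await client.chat.completions.create(..., stream=True)
    for piece in ["Chat", "GPT", " is", " a", " model"]:
        yield FakeChunk(piece)

async def consume():
    response = fake_stream()
    pieces = []
    # Iterate the stream; call model_dump_json() on each chunk,
    # not on the stream object itself.
    async for chunk in response:
        print(chunk.model_dump_json(indent=2))
        pieces.append(chunk.content)
    return "".join(pieces)

text = asyncio.run(consume())
print(text)
```

The same `async for` loop works unchanged against the real client; only the stand-ins above are invented for the sake of a runnable example.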
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[8], line 36
     32 async for chunk in response:
     33     print(chunk.model_dump_json(indent=2))
---> 36 asyncio.run(get_response('What is chatgpt?'))

File /opt/homebrew/Cellar/python@3.10/3.10.13_2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/runners.py:33, in run(main, debug)
      9 """Execute the coroutine and return the result.
     10
     11 This function runs the passed coroutine, taking care of
    (...)
     30     asyncio.run(main())
     31 """
     32 if events._get_running_loop() is not None:
---> 33     raise RuntimeError(
     34         "asyncio.run() cannot be called from a running event loop")
     36 if not coroutines.iscoroutine(main):
     37     raise ValueError("a coroutine was expected, got {!r}".format(main))

RuntimeError: asyncio.run() cannot be called from a running event loop
```
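For reference, that `RuntimeError` is raised whenever `asyncio.run()` is called while an event loop is already running, which is exactly the situation inside a Jupyter/IPython cell. A minimal self-contained sketch reproducing the error, plus the usual fix of awaiting the coroutine directly:

```python
import asyncio

async def work():
    return "done"

async def outer():
    # Calling asyncio.run() from inside a running event loop raises
    # RuntimeError — the same error as in the traceback above.
    try:
        asyncio.run(work())
        error = ""
    except RuntimeError as e:
        error = str(e)
    # Inside an async context (or a notebook cell), await the coroutine directly.
    result = await work()
    return error, result

error, result = asyncio.run(outer())
print(error)
print(result)
```

In a notebook you can therefore write `await get_response('What is chatgpt?')` at the top level of a cell instead of wrapping it in `asyncio.run(...)`.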
Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug

```python
import os
import asyncio
from openai import AsyncAzureOpenAI

azure_openai_client = AsyncAzureOpenAI(
    azure_endpoint="",
    api_key="some-key",
    api_version="2023-07-01-preview",
)

async def get_response(message):
    response = await azure_openai_client.chat.completions.create(
        model='GPT35',
        temperature=0.4,
        messages=[{"role": "user", "content": message}],
        stream=True,
    )
    print(response.model_dump_json(indent=2))  # -> no response

asyncio.run(get_response('What is chatgpt?'))
```
To Reproduce

```python
import os
import asyncio
from openai import AsyncAzureOpenAI

azure_openai_client = AsyncAzureOpenAI(
    azure_endpoint="",
    api_key="some-key",
    api_version="2023-07-01-preview",
)

async def get_response(message):
    response = await azure_openai_client.chat.completions.create(
        model='GPT35',
        temperature=0.4,
        messages=[{"role": "user", "content": message}],
        stream=True,
    )
    print(response.model_dump_json(indent=2))  # -> no response

asyncio.run(get_response('What is chatgpt?'))
```
Code snippets
OS
Windows 10 Enterprise
Python version
Python 3.8.10
Library version
1.3.9 or 1.6.0