run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: AttributeError in class OpenAIAgentWorker(BaseAgentWorker) #14315

Closed: navneteberi30 closed this 5 days ago

navneteberi30 commented 3 months ago

Bug Description

There is an AttributeError in the _get_async_stream_ai_response method of OpenAIAgentWorker in LlamaIndex's llama-index-agent-openai package: it accesses the attribute _is_function_false_event on StreamingAgentChatResponse, which does not exist. The correct attribute name is is_function_false_event.

Version

0.10.48.post1

Steps to Reproduce

Instantiate an agent and exercise the async streaming chat path:

    agent = OpenAIAgent.from_tools(
        [Tool1, Tool1],
        llm=Settings.llm,
        max_function_calls=10,
        system_prompt=system_prompt,
        verbose=True,
    )
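
Filled out as a runnable reproduction sketch (the add tool, prompt, and question below are illustrative placeholders, not from the report; an OpenAI API key is assumed in the environment):

    # Reproduction sketch; the tool and prompts are placeholders.
    import asyncio

    from llama_index.agent.openai import OpenAIAgent
    from llama_index.core import Settings
    from llama_index.core.tools import FunctionTool

    def add(a: int, b: int) -> int:
        """Add two integers."""
        return a + b

    agent = OpenAIAgent.from_tools(
        [FunctionTool.from_defaults(fn=add)],
        llm=Settings.llm,
        max_function_calls=10,
        system_prompt="You are a helpful assistant.",
        verbose=True,
    )

    async def main() -> None:
        # astream_chat is the path that reaches _get_async_stream_ai_response,
        # where the AttributeError is raised.
        response = await agent.astream_chat("What is 2 + 3?")
        async for token in response.async_response_gen():
            print(token, end="")

    asyncio.run(main())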

Affected method in class OpenAIAgentWorker(BaseAgentWorker):

from llama_index.agent.openai import OpenAIAgent

    async def _get_async_stream_ai_response(
        self, task: Task, **llm_chat_kwargs: Any
    ) -> StreamingAgentChatResponse:
        chat_stream_response = StreamingAgentChatResponse(
            achat_stream=await self._llm.astream_chat(**llm_chat_kwargs),
            sources=task.extra_state["sources"],
        )
        # create task to write chat response to history
        asyncio.create_task(
            chat_stream_response.awrite_response_to_history(
                task.extra_state["new_memory"],
                on_stream_end_fn=partial(self.finalize_task, task),
            )
        )
        chat_stream_response._ensure_async_setup()

        # wait until openAI functions stop executing
        await chat_stream_response._is_function_false_event.wait()  # Issue is here

        # return response stream
        return chat_stream_response

Since _is_function_false_event doesn't exist on StreamingAgentChatResponse, the reference should be changed to is_function_false_event.

Relevant Logs/Tracebacks

    AttributeError: 'StreamingAgentChatResponse' object has no attribute '_is_function_false_event'
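
A quick way to confirm which event name the installed core package actually defines (a diagnostic sketch; the import path assumes llama-index-core 0.10.x):

    # Diagnostic sketch: inspect the installed class source for the two names.
    import inspect
    import re

    from llama_index.core.chat_engine.types import StreamingAgentChatResponse

    src = inspect.getsource(StreamingAgentChatResponse)
    print("private _is_function_false_event:", "_is_function_false_event" in src)
    print("public is_function_false_event:", bool(re.search(r"(?<!_)is_function_false_event", src)))
    # If only the public name is present, the installed agent package is out of
    # step with llama-index-core, which produces the AttributeError above.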
dosubot[bot] commented 3 months ago

To fix the AttributeError raised in the _get_async_stream_ai_response method of OpenAIAgentWorker in LlamaIndex's OpenAI agent package, you need to change the attribute _is_function_false_event to is_function_false_event.

Here is the corrected method:

async def _get_async_stream_ai_response(
    self, task: Task, **llm_chat_kwargs: Any
) -> StreamingAgentChatResponse:
    chat_stream_response = StreamingAgentChatResponse(
        achat_stream=await self._llm.astream_chat(**llm_chat_kwargs),
        sources=task.extra_state["sources"],
    )
    # create task to write chat response to history
    asyncio.create_task(
        chat_stream_response.awrite_response_to_history(
            task.extra_state["new_memory"],
            on_stream_end_fn=partial(self.finalize_task, task),
        )
    )
    chat_stream_response._ensure_async_setup()

    # wait until openAI functions stop executing
    await chat_stream_response.is_function_false_event.wait()  # Corrected attribute

    # return response stream
    return chat_stream_response

This change ensures that the correct attribute name is_function_false_event is used instead of _is_function_false_event [1][2][3][4].

To continue talking to Dosu, mention @dosu.
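
If upgrading is not immediately possible, one stopgap sketch (assuming the installed core creates the public event inside _ensure_async_setup while the agent worker still reads the private name) is to mirror the event under the private name the worker awaits:

    # Stopgap sketch; prefer upgrading both packages and remove this afterwards.
    from llama_index.core.chat_engine.types import StreamingAgentChatResponse

    _orig_ensure_async_setup = StreamingAgentChatResponse._ensure_async_setup

    def _patched_ensure_async_setup(self) -> None:
        _orig_ensure_async_setup(self)
        if not hasattr(self, "_is_function_false_event"):
            # Alias the public event under the private name the worker awaits.
            self._is_function_false_event = self.is_function_false_event

    StreamingAgentChatResponse._ensure_async_setup = _patched_ensure_async_setup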

logan-markewich commented 3 months ago

@navneteberi30 it works fine for me. Maybe update:

    pip install -U llama-index-core llama-index-agent-openai
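
If the error persists after upgrading, it is worth confirming that both packages actually moved to compatible versions, for example:

    pip show llama-index-core llama-index-agent-openai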