run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: openai assistant "from_existing" ignores Organization ID #11299

Closed benke-rl closed 4 months ago

benke-rl commented 8 months ago

Bug Description

Pretty sure this is a bug, but it might be a feature request? If you have multiple OpenAI organizations, the "from_existing" method on llama_index.agent.openai.OpenAIAssistantAgent currently depends on your assistants living in the default organization, because it ignores the organization setting when initializing the API client.

Version

0.10.11

Steps to Reproduce

Create a new organization with a new assistant in it in OpenAI. Try to reference that organization as follows:

client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"], organization=os.environ["OPENAI_ORG_ID"])
agent = OpenAIAssistantAgent.from_existing(assistant_id="[replace with your assistant id]")
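Note that the client created on the first line is never passed to from_existing, so its organization setting has no effect. The stdlib-only sketch below (with made-up stand-in names FakeClient, Agent, "org-abc123", "asst_xxx" — not the real library classes) shows why any externally configured client is simply never consulted when a classmethod builds its own client internally:

```python
class FakeClient:
    """Stand-in for openai.OpenAI (hypothetical, for illustration only)."""

    def __init__(self, organization=None):
        self.organization = organization


class Agent:
    """Stand-in for OpenAIAssistantAgent."""

    def __init__(self, client):
        self.client = client

    @classmethod
    def from_existing(cls, assistant_id, api_key=None):
        # Mirrors the buggy pattern: a fresh client is created here,
        # and no organization is ever passed to it.
        client = FakeClient()
        return cls(client)


# The caller configures a client with an organization...
configured = FakeClient(organization="org-abc123")

# ...but from_existing never sees it; the internal client has no org.
agent = Agent.from_existing("asst_xxx")
assert agent.client.organization is None
```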

Relevant Logs/Tracebacks

Traceback (most recent call last):
  File "~/dev/ai-model-validation/main.py", line 38, in <module>
    agent = OpenAIAssistantAgent.from_existing(assistant_id=latest_assistant)
  File "~/dev/ai-model-validation/venv/lib/python3.9/site-packages/llama_index/agent/openai/openai_assistant_agent.py", line 280, in from_existing
    assistant = client.beta.assistants.retrieve(assistant_id)
  File "~/dev/ai-model-validation/venv/lib/python3.9/site-packages/openai/resources/beta/assistants/assistants.py", line 154, in retrieve
    return self._get(
  File "~/dev/ai-model-validation/venv/lib/python3.9/site-packages/openai/_base_client.py", line 1143, in get
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "~/dev/ai-model-validation/venv/lib/python3.9/site-packages/openai/_base_client.py", line 889, in request
    return self._request(
  File "~/dev/ai-model-validation/venv/lib/python3.9/site-packages/openai/_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': "No assistant found with id 'xxx'.", 'type': 'invalid_request_error', 'param': None, 'code': None}}

Process finished with exit code 1
dosubot[bot] commented 8 months ago

πŸ€–

Hey @benke-rl! Great to see you back, hope you've been doing awesome! πŸ˜„πŸ‘‹

Thank you for bringing this to our attention. You're correct, the current implementation of the from_existing method in the OpenAIAssistantAgent class does not take into account the organization ID when initializing the OpenAI client. This is indeed a bug and not a feature.

The OpenAI client is initialized with the provided API key, but no organization is specified. This means that the client will default to using the default organization associated with the API key, even if the assistant belongs to a different organization.

Here is the relevant code:

@classmethod
def from_existing(
    cls,
    assistant_id: str,
    tools: Optional[List[BaseTool]] = None,
    thread_id: Optional[str] = None,
    instructions_prefix: Optional[str] = None,
    run_retrieve_sleep_time: float = 0.1,
    callback_manager: Optional[CallbackManager] = None,
    api_key: Optional[str] = None,
    verbose: bool = False,
) -> "OpenAIAssistantAgent":
    """From existing assistant id.

    Args:
        assistant_id: id of assistant
        tools: list of BaseTools Assistant can use
        thread_id: thread id
        run_retrieve_sleep_time: run retrieve sleep time
        instructions_prefix: instructions prefix
        callback_manager: callback manager
        api_key: OpenAI API key
        verbose: verbose

    """
    from openai import OpenAI

    # initialize client
    client = OpenAI(api_key=api_key)

    # get assistant
    assistant = client.beta.assistants.retrieve(assistant_id)
    # assistant.tools is incompatible with BaseTools so have to pass from params

    return cls(
        client,
        assistant,
        tools=tools,
        callback_manager=callback_manager,
        thread_id=thread_id,
        instructions_prefix=instructions_prefix,
        run_retrieve_sleep_time=run_retrieve_sleep_time,
        verbose=verbose,
    )

As you can see, the OpenAI client is initialized with only the provided API key (client = OpenAI(api_key=api_key)); the method signature offers no way to pass an organization through, so the client falls back to the default organization associated with that key.
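A minimal sketch of the likely fix is to thread an organization argument through to the client constructor. The parameter name organization is an assumption (it matches openai.OpenAI's keyword but has not been confirmed as the eventual fix), and a stand-in client class is used so the snippet runs without the openai package:

```python
from typing import Optional


class OpenAIClient:
    """Stand-in for openai.OpenAI (illustration only)."""

    def __init__(self, api_key: Optional[str] = None,
                 organization: Optional[str] = None):
        self.api_key = api_key
        self.organization = organization


def make_client(api_key: Optional[str] = None,
                organization: Optional[str] = None) -> OpenAIClient:
    # The fix: forward the organization instead of silently dropping it,
    # as from_existing would if it accepted an organization parameter.
    return OpenAIClient(api_key=api_key, organization=organization)


client = make_client(api_key="sk-test", organization="org-abc123")
assert client.organization == "org-abc123"
```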

This issue has been noted and will be addressed in a future update of LlamaIndex. In the meantime, a possible workaround would be to manually specify the organization ID when initializing the OpenAI client; however, doing that inside from_existing would require modifying the OpenAIAssistantAgent class, which may not be ideal.
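Since from_existing ultimately just retrieves the assistant and hands client and assistant to the constructor (see the snippet above), another possible workaround is to perform those steps yourself with an organization-scoped client and bypass from_existing entirely. The sketch below uses stdlib stand-ins (Client, Agent, and the ids "org-abc123" and "asst_xxx" are all made up) so it runs anywhere; with the real libraries the analogous calls would be openai.OpenAI(...), client.beta.assistants.retrieve(...), and OpenAIAssistantAgent(client, assistant, ...) — a sketch, not verified against every llama_index version:

```python
class Client:
    """Stand-in for openai.OpenAI (illustration only)."""

    def __init__(self, api_key=None, organization=None):
        self.api_key = api_key
        self.organization = organization

    def retrieve_assistant(self, assistant_id):
        # Stand-in for client.beta.assistants.retrieve(assistant_id);
        # returns a dict rather than a real Assistant object.
        return {"id": assistant_id}


class Agent:
    """Stand-in for OpenAIAssistantAgent, whose constructor accepts
    the client and assistant directly, as from_existing shows."""

    def __init__(self, client, assistant, tools=None):
        self.client = client
        self.assistant = assistant


# Configure the client yourself, including the non-default organization...
client = Client(api_key="sk-test", organization="org-abc123")
assistant = client.retrieve_assistant("asst_xxx")

# ...then construct the agent directly, bypassing from_existing.
agent = Agent(client, assistant)
assert agent.client.organization == "org-abc123"
```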

We appreciate your patience and understanding as we work to resolve this issue. If you have any further questions or concerns, please don't hesitate to let us know.

