langchain-ai / langchainjs

πŸ¦œπŸ”— Build context-aware reasoning applications πŸ¦œπŸ”—
https://js.langchain.com/docs/
MIT License

Cannot receive threadId when using OpenAI assistant as agent #3520

Open dkhizhniakov opened 7 months ago

dkhizhniakov commented 7 months ago

Hi, I'm using the latest langchain version 0.0.200 with Node 18.17.1, and I'm facing an issue trying to get an OpenAI assistant to remember conversation context. I want to make calls to an existing OpenAI assistant and track them in LangSmith, but I can't get it to work.

I tried two approaches. The first uses AgentExecutor:

        import { OpenAIAssistantRunnable } from "langchain/experimental/openai_assistant";
        import { AgentExecutor } from "langchain/agents";

        const assistant = new OpenAIAssistantRunnable({
          assistantId: process.env.OPENAI_ASSISTANT_ID,
          asAgent: true,
        });

        const agentExecutor = AgentExecutor.fromAgentAndTools({
          agent: assistant,
          tools: [],
          verbose: true,
        });
        const assistantResponse = await agentExecutor.invoke(
          {
            content: req.body.message,
          },
          {
            metadata: {
              userId,
            },
          }
        );

In this case I can see my request tracked in the LangSmith project (I have the process.env variables set as per the documentation), but I don't get a threadId in the response, and the assistant therefore seems to have no conversation context (I asked it to remember a number and then tell it back to me; this works in the OpenAI playground, but not in my implementation).

The other approach I tried was to invoke the assistant directly:

        const assistant = new OpenAIAssistantRunnable({
          assistantId: process.env.OPENAI_ASSISTANT_ID,
          asAgent: true,
        });

        const assistantResponse = await assistant.invoke(
          {
            content: req.body.message,
          },
          {
            metadata: {
              userId,
            },
          }
        );

In this case I do get a threadId from assistantResponse and can use it just fine, but nothing shows up in LangSmith (with the same process.env settings). I also tried initializing tracing directly with the Client from langsmith and a LangChainTracer, but that didn't work either.

Small note: TypeScript tells me that Client from the langsmith package (0.0.49) isn't assignable to the client property of LangChainTracer (from langchain/callbacks v0.0.200).
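
For reference, the direct tracer setup I tried looked roughly like this (a sketch from memory, not my exact code); the assignability error shows up on the client property:

        import { Client } from "langsmith";
        import { LangChainTracer } from "langchain/callbacks";

        // Explicit tracer instead of relying only on the LANGCHAIN_* env vars.
        const tracer = new LangChainTracer({
          projectName: process.env.LANGCHAIN_PROJECT,
          client: new Client({
            apiUrl: process.env.LANGCHAIN_ENDPOINT,
            apiKey: process.env.LANGCHAIN_API_KEY,
          }), // TS complains that this Client isn't assignable here
        });

        const assistantResponse = await assistant.invoke(
          {
            content: req.body.message,
          },
          {
            callbacks: [tracer],
            metadata: {
              userId,
            },
          }
        );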

So, is there a way to make requests to an OpenAI assistant without losing the conversation context while still having the traces show up in LangSmith?

dosubot[bot] commented 7 months ago

πŸ€–

Hello @dkhizhniakov!

I'm Dosu, a bot here to assist you. I can help you troubleshoot issues, answer your questions, and guide you on becoming a contributor. Let's work together on this while we wait for a human maintainer to join us. How can I assist you today?

It seems like you're facing a couple of issues here. Let's try to address them one by one.

Firstly, regarding the threadId not being returned when using AgentExecutor: the AgentExecutor class is designed to handle the execution of an agent and its tools. It doesn't manage the conversation context or threads itself; that is handled by the OpenAIAssistantRunnable class, which AgentExecutor uses internally.

The OpenAIAssistantRunnable class maintains the conversation context by keeping track of the conversation thread. If there's no threadId in the input, it creates a new thread and run. If there's a threadId but no runId, it creates a new message in the thread and a new run. If both threadId and runId are present, it submits tool outputs to the existing run. After these operations, it gets the response from the run.

Here is the relevant code from the invoke method of the OpenAIAssistantRunnable class:

async invoke(
    input: RunInput,
    _options?: RunnableConfig
  ): Promise<ExtractRunOutput<AsAgent>> {
    let run: OpenAIClient.Beta.Threads.Run;
    if (this.asAgent && input.steps && input.steps.length > 0) {
      const parsedStepsInput = await this._parseStepsInput(input);
      run = await this.client.beta.threads.runs.submitToolOutputs(
        parsedStepsInput.threadId,
        parsedStepsInput.runId,
        {
          tool_outputs: parsedStepsInput.toolOutputs,
        }
      );
    } else if (!("threadId" in input)) {
      const thread = {
        messages: [
          {
            role: "user",
            content: input.content,
            file_ids: input.fileIds,
            metadata: input.messagesMetadata,
          },
        ],
        metadata: input.threadMetadata,
      };
      run = await this._createThreadAndRun({
        ...input,
        thread,
      });
    } else if (!("runId" in input)) {
      await this.client.beta.threads.messages.create(input.threadId, {
        content: input.content,
        role: "user",
        file_ids: input.file_ids,
        metadata: input.messagesMetadata,
      });
      run = await this._createRun(input);
    } else {
      // Submitting tool outputs to an existing run, outside the AgentExecutor
      // framework.
      run = await this.client.beta.threads.runs.submitToolOutputs(
        input.runId,
        input.threadId,
        {
          tool_outputs: input.toolOutputs,
        }
      );
    }

    return this._getResponse(run.id, run.thread_id);
  }

You can find this code in the langchain/src/experimental/openai_assistant/index.ts file.
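
Based on that branching logic, one way to keep the conversation context when calling the runnable directly is to pass the thread id from the first response back in on later calls. Here is a rough, untested sketch; it assumes the returned object exposes threadId the way you described:

// First call: no threadId in the input, so a new thread and run are created.
const first = await assistant.invoke({
  content: "Please remember the number 42.",
});

// Later calls: pass the threadId back so the same thread (and its context)
// is reused instead of a new one being created.
// `first.threadId` is an assumption based on your report that the direct
// call returns a threadId; adjust it to wherever the id actually lives.
const second = await assistant.invoke({
  content: "What number did I ask you to remember?",
  threadId: first.threadId,
});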

Regarding the issue with the Client from the langsmith package not being assignable to the client property of LangChainTracer, it seems like there might be a version mismatch between the langsmith package and the langchain package that provides langchain/callbacks. Could you please ensure that you're using the latest versions of both packages?

If you're still facing issues, could you please provide more details about the error messages you're seeing and the steps you're taking to reproduce the issues? This will help me better understand the problems and provide more accurate solutions.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a πŸ‘ if this is helpful and πŸ‘Ž if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dkhizhniakov commented 7 months ago

Unfortunately, that doesn't help. I can get the threadId from a direct call like this, but it isn't tracked in LangSmith.

AdelDima commented 5 months ago

Did you find any solution?

dkhizhniakov commented 2 months ago

No, I ended up using the OpenAI SDK directly and tracking into LangSmith with the langsmith client. Maybe something has changed by now.
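
For anyone landing here, the workaround looked roughly like this. It's a sketch, not my exact code: the askAssistant helper, the polling loop, and the LangSmith run fields are illustrative, so double-check them against the openai and langsmith client docs:

        import OpenAI from "openai";
        import { Client } from "langsmith";
        import { v4 as uuidv4 } from "uuid";

        const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
        const langsmith = new Client(); // picks up LANGCHAIN_API_KEY / LANGCHAIN_ENDPOINT

        async function askAssistant(threadId: string | null, message: string) {
          // Reuse the thread when we already have one so the assistant keeps its context.
          const thread = threadId
            ? { id: threadId }
            : await openai.beta.threads.create();

          await openai.beta.threads.messages.create(thread.id, {
            role: "user",
            content: message,
          });

          let run = await openai.beta.threads.runs.create(thread.id, {
            assistant_id: process.env.OPENAI_ASSISTANT_ID!,
          });

          // Poll until the run finishes (tool calls are not handled in this sketch).
          while (run.status === "queued" || run.status === "in_progress") {
            await new Promise((resolve) => setTimeout(resolve, 1000));
            run = await openai.beta.threads.runs.retrieve(thread.id, run.id);
          }

          // The messages list returns the newest message first by default.
          const messages = await openai.beta.threads.messages.list(thread.id);
          const answer = messages.data[0];

          // Log the exchange to LangSmith by hand with the langsmith client.
          const lsRunId = uuidv4();
          await langsmith.createRun({
            id: lsRunId,
            name: "openai-assistant",
            run_type: "llm",
            inputs: { message, threadId: thread.id },
            start_time: Date.now(),
          });
          await langsmith.updateRun(lsRunId, {
            outputs: { answer },
            end_time: Date.now(),
          });

          return { threadId: thread.id, answer };
        }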