langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

TimeoutError: The operation timed out #2743

Closed LeoMartinDev closed 8 months ago

LeoMartinDev commented 1 year ago

When using the Ollama LLM with document-based chains, I receive a TimeoutError: The operation timed out after about 60 seconds. Is there any way to increase the timeout value? I tried something like this, but without success:

const response = await chain.call({
  input_documents: documents,
  timeout: 1000 * 60 * 10, // 10 minutes
});
dosubot[bot] commented 1 year ago

🤖

Hello,

The TimeoutError: The operation timed out error you're encountering is likely due to the operation taking longer than the default timeout allows. In the chain.call() function, you can indeed specify a timeout parameter to increase this value; note that the value should be in milliseconds.

From your code, it seems like you're already doing this correctly:

const response = await chain.call({
  input_documents: documents,
  timeout: 1000 * 60 * 10, // 10 minutes
});

This should set the timeout to 10 minutes. If this isn't working, there may be an issue with how the timeout is handled in the chain.call() function.

The chain.call() function does support a timeout parameter. This is evident from the _separateRunnableConfigFromCallOptions method in the BaseLLM class, which checks whether the timeout option is provided and, if so, creates an AbortSignal from that value. Here is the relevant code:

protected _separateRunnableConfigFromCallOptions(
  options?: Partial<CallOptions>
): [RunnableConfig, this["ParsedCallOptions"]] {
  const [runnableConfig, callOptions] =
    super._separateRunnableConfigFromCallOptions(options);
  if (callOptions?.timeout && !callOptions.signal) {
    callOptions.signal = AbortSignal.timeout(callOptions.timeout);
  }
  return [runnableConfig, callOptions as this["ParsedCallOptions"]];
}

This method is used in the generate and _streamIterator methods, which are in turn used by the call method. Therefore, the timeout parameter is supported in the call method.
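
If the timeout you pass to chain.call() seems to be ignored, one thing worth trying (a minimal sketch, assuming the chain forwards call options such as signal to the underlying LLM in the same way it would forward timeout) is to pass an explicit AbortSignal and see whether the behaviour changes:

// Sketch only: supply an explicit AbortSignal instead of relying on `timeout`.
// If the ~60 s error still appears, the abort is probably coming from another
// layer (e.g. the HTTP client), not from this signal.
const signal = AbortSignal.timeout(1000 * 60 * 10); // 10 minutes

const response = await chain.call({
  input_documents: documents,
  signal,
});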

If you're still encountering the issue, could you please provide more details about your setup and the exact steps you're taking when you encounter the error? This will help in diagnosing the problem.

Best regards, Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

jacoblee93 commented 1 year ago

CC @jmorganca?

karat121 commented 1 year ago

Could you share a minimal reproducible example @LeoMartinDev? The timeout might not be coming from the abort handler, based on the error message you've shown.

If you decrease the timeout to 10 ms, you'll notice a different error (Error: AbortError).
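
For example (a quick sketch, reusing the chain, docs, and question from the snippet below), a deliberately tiny timeout should fail fast with an AbortError rather than with the ~60 s TimeoutError:

// Sketch: with a 10 ms timeout the call should abort almost immediately,
// which would confirm that LangChain's own timeout/AbortSignal path works.
await chain.call({
  input_documents: docs,
  question,
  timeout: 10, // milliseconds
});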

I've tried the following example with a very long document and didn't get a timeout after ~2 minutes, so I'm assuming the timeout is coming from somewhere else:

import { loadQARefineChain } from "langchain/chains";
import { Ollama } from "langchain/llms/ollama";
import { TextLoader } from "langchain/document_loaders/fs/text";

const model = new Ollama({
  baseUrl: "http://localhost:11434", // Default value
  model: "llama2", // Default value
});

const chain = loadQARefineChain(model);
const loader = new TextLoader("./file.txt");
const docs = await loader.loadAndSplit();
const question = "What did the president say about Justice Breyer";

// Call the chain
const res = await chain.call({
  input_documents: docs,
  question,
});

console.log(res);
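
One way to narrow it down further (a rough sketch, assuming model.call() accepts the same timeout call option) is to bypass the chain and call the model directly with a generous timeout; if the failure still shows up after ~60 seconds, the cut-off is likely coming from the underlying HTTP/fetch layer rather than from chain.call():

// Sketch: hit the Ollama model directly, outside the chain.
// A ~60 s failure here despite the 10 minute timeout would point at the
// HTTP layer rather than at LangChain's abort handling.
const text = docs.map((d) => d.pageContent).join("\n\n");
const out = await model.call(`Answer based on the text below:\n${text}\n\n${question}`, {
  timeout: 1000 * 60 * 10, // 10 minutes
});
console.log(out);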
dosubot[bot] commented 8 months ago

Hi, @LeoMartinDev,

I'm helping the langchainjs team manage their backlog and am marking this issue as stale. From what I understand, you encountered a TimeoutError: The operation timed out when using the Ollama LLM with document-based chains. You attempted to increase the timeout value using a specific code snippet, but it did not work. I provided a detailed response explaining how to set the timeout value and suggesting potential areas to investigate if the issue persists. Additionally, karat121 requested a minimal reproducible example from you to further investigate the issue.

Could you please confirm whether this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close it yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and cooperation.