langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License
12.29k stars · 2.08k forks

While running the LLM I am getting this error: **[llm/error] [1:llm:ChatOpenAI] [163.61s] LLM run errored with error: "Network Error"** #3858

Closed punit-mistry closed 8 months ago

punit-mistry commented 8 months ago

This is the code:

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { PineconeStore } from "langchain/vectorstores/pinecone";
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { getVectorStore } from "./vector-store";
import { getPineconeClient } from "./pinecone-client";
import { formatChatHistory } from "./utils";

const CONDENSE_TEMPLATE = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.

Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`;

const QA_TEMPLATE = `You are an enthusiastic AI assistant. Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say you don't know. DO NOT try to make up an answer.
If the question is not related to the context, politely respond that you are tuned to only answer questions that are related to the context.

{context}

Question: {question}
Helpful answer in markdown:`;

function makeChain(
  vectorstore: PineconeStore,
  writer: WritableStreamDefaultWriter
) {
  // Create encoder to convert tokens (strings) to Uint8Array
  const encoder = new TextEncoder();

  // The TransformStream writer is passed in by the caller
  // const writer = transformStream.writable.getWriter();

  const streamingModel = new ChatOpenAI({
    modelName: "gpt-3.5-turbo",
    streaming: true,
    temperature: 0,
    verbose: true,
    callbacks: [
      {
        async handleLLMNewToken(token) {
          await writer.ready;
          await writer.write(encoder.encode(token));
        },
        async handleLLMEnd() {
          console.log("LLM end called");
        },
      },
    ],
  });

  const nonStreamingModel = new ChatOpenAI({
    modelName: "gpt-3.5-turbo",
    verbose: true,
    temperature: 0,
  });

  const chain = ConversationalRetrievalQAChain.fromLLM(
    streamingModel,
    vectorstore.asRetriever(),
    {
      qaTemplate: QA_TEMPLATE,
      questionGeneratorTemplate: CONDENSE_TEMPLATE,
      returnSourceDocuments: true, // default 4
      questionGeneratorChainOptions: {
        llm: nonStreamingModel,
      },
    }
  );
  return chain;
}
```
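As a side note, the `TextEncoder` + `WritableStreamDefaultWriter` plumbing used by `handleLLMNewToken` can be exercised on its own, with no LLM or network involved, to rule out a bug in the streaming path. The sketch below is not from the issue; it is a minimal, self-contained reproduction of the same pattern, assuming a Node 18+ runtime where `TransformStream`, `TextEncoder`, and `TextDecoder` are globals:

```typescript
// Minimal sketch of the token-streaming pattern used in makeChain:
// strings go in through the writer, bytes come out through the reader.
async function streamTokens(tokens: string[]): Promise<string> {
  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();

  // Producer side: what handleLLMNewToken does for each generated token.
  const producer = (async () => {
    for (const token of tokens) {
      await writer.ready;
      await writer.write(encoder.encode(token));
    }
    await writer.close();
  })();

  // Consumer side: what the HTTP response handler does with the readable half.
  const reader = readable.getReader();
  const decoder = new TextDecoder();
  let output = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    output += decoder.decode(value, { stream: true });
  }
  await producer;
  return output;
}
```

For example, `await streamTokens(["Hello", ", ", "world"])` resolves to `"Hello, world"`. If this works but the chain still fails, the problem is in the network layer, not the stream wiring.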

```typescript
type callChainArgs = {
  question: string;
  chatHistory: [string, string][];
  transformStream: TransformStream;
};

export async function callChain({
  question,
  chatHistory,
  transformStream,
}: callChainArgs) {
  try {
    // OpenAI recommendation
    const sanitizedQuestion = question.trim().replaceAll("\n", " ");
    const pineconeClient = await getPineconeClient();
    const vectorStore = await getVectorStore(pineconeClient);

    // Create encoder to convert tokens (strings) to Uint8Array
    const encoder = new TextEncoder();
    const writer = transformStream.writable.getWriter();
    const chain = makeChain(vectorStore, writer);
    const formattedChatHistory = formatChatHistory(chatHistory);

    // Question using chat history
    // Reference: https://js.langchain.com/docs/modules/chains/popular/chat_vector_db#externally-managed-memory
    chain
      .call({
        question: sanitizedQuestion,
        chat_history: formattedChatHistory,
      })
      .then(async (res) => {
        const sourceDocuments = res?.sourceDocuments;
        const firstTwoDocuments = sourceDocuments.slice(0, 2);
        const pageContents = firstTwoDocuments.map(
          ({ pageContent }: { pageContent: string }) => pageContent
        );
        const stringifiedPageContents = JSON.stringify(pageContents);
        await writer.ready;
        await writer.write(encoder.encode("tokens-ended"));
        // Send it in the next event-loop tick
        setTimeout(async () => {
          await writer.ready;
          await writer.write(encoder.encode(`${stringifiedPageContents}`));
          await writer.close();
        }, 100);
      });

    // Return the readable stream
    return transformStream?.readable;
  } catch (e) {
    console.error(e);
    throw new Error("Call chain method failed to execute successfully!");
  }
}
```

Package.json:

```json
{
  "name": "langchain-text-ai",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint",
    "prepare:data": "tsx -r dotenv/config ./src/scripts/pinecone-prepare-docs.ts"
  },
  "dependencies": {
    "@pinecone-database/pinecone": "^0.1.6",
    "@radix-ui/react-accordion": "^1.1.2",
    "@radix-ui/react-dropdown-menu": "^2.0.5",
    "@radix-ui/react-icons": "^1.3.0",
    "@radix-ui/react-scroll-area": "^1.0.4",
    "@radix-ui/react-slot": "^1.0.2",
    "@types/node": "20.4.9",
    "@types/react": "18.2.19",
    "@types/react-dom": "18.2.7",
    "autoprefixer": "10.4.14",
    "class-variance-authority": "^0.7.0",
    "clsx": "^2.0.0",
    "dotenv": "^16.3.1",
    "eslint": "8.46.0",
    "eslint-config-next": "13.4.13",
    "langchain": "^0.0.126",
    "lucide-react": "^0.265.0",
    "next": "13.4.12",
    "next-themes": "^0.2.1",
    "pdf-parse": "^1.1.1",
    "postcss": "8.4.27",
    "react": "18.2.0",
    "react-dom": "18.2.0",
    "react-markdown": "^8.0.7",
    "react-wrap-balancer": "^1.0.0",
    "tailwind-merge": "^1.14.0",
    "tailwindcss": "3.3.3",
    "tailwindcss-animate": "^1.0.6",
    "typescript": "5.1.6",
    "zod": "^3.21.4"
  },
  "devDependencies": {
    "@types/pdf-parse": "^1.1.1",
    "tsx": "^3.12.7"
  }
}
```

Error Message

[llm/start] [1:llm:ChatOpenAI] Entering LLM run with input: { "messages": [ [ { "lc": 1, "type": "constructor", "id": [ "langchain", "schema", "HumanMessage" ], "kwargs": { "content": "You are an enthusiastic AI assistant. Use the following pieces of context to answer the question at the end.\nIf you don't know the answer, just say you don't know. DO NOT try to make up an answer.\nIf the question is not related to the context, politely respond that you are tuned to only answer questions that are related to the context.\n\n\u0018\u0018\nFree eBooks at \nPlanet eBook.com\nChapter 3\nT\nhere was music from my neighbor’s house through the \nsummer nights. In his blue gardens men and girls came \nand went like moths among the whisperings and the cham\n-\npagne and the stars. At high tide in the afternoon I watched \nhis guests diving from the tower of his raft or taking the \nsun on the hot sand of his beach while his two motor-boats \nslit the waters of the Sound, drawing aquaplanes over cat\n-\naracts of foam. On week-ends his Rolls-Royce became an \nomnibus, bearing parties to and from the city, between \nnine in the morning and long past midnight, while his sta\n-\ntion wagon scampered like a brisk yellow bug to meet all \ntrains. And on Mondays eight servants including an extra \ngardener toiled all day with mops and scrubbing-brushes \nand hammers and garden-shears, repairing the ravages of \nthe night before.\nEvery Friday five crates of oranges and lemons arrived\n\n11\nFree eBooks at \nPlanet eBook.com\nthe whip and snap of the curtains and the groan of a pic\n-\nture on the wall. Then there was a boom as Tom Buchanan \nshut the rear windows and the caught wind died out about \nthe room and the curtains and the rugs and the two young \nwomen ballooned slowly to the floor.\nThe younger of the two was a stranger to me. 
She was \nextended full length at her end of the divan, completely \nmotionless and with her chin raised a little as if she were \nbalancing something on it which was quite likely to fall. If \nshe saw me out of the corner of her eyes she gave no hint of \nit—indeed, I was almost surprised into murmuring an apol\n-\nogy for having disturbed her by coming in.\nThe other girl, Daisy, made an attempt to rise—she \nleaned slightly forward with a conscientious expression—\nthen she laughed, an absurd, charming little laugh, and I \nlaughed too and came forward into the room.\n\n\u0018\u0018\nFree eBooks at \nPlanet eBook.com\nin the saucer of milk all afternoon. Meanwhile Tom brought \nout a bottle of whiskey from a locked bureau door.\nI have been drunk just twice in my life and the second \ntime was that afternoon so everything that happened has a \ndim hazy cast over it although until after eight o’clock the \napartment was full of cheerful sun. Sitting on Tom’s lap \nMrs. Wilson called up several people on the telephone; then \nthere were no cigarettes and I went out to buy some at the \ndrug store on the corner. When I came back they had disap\n-\npeared so I sat down discreetly in the living room and read \na chapter of ‘Simon Called Peter’—either it was terrible stuff \nor the whiskey distorted things because it didn’t make any \nsense to me.\nJust as Tom and Myrtle—after the first drink Mrs. Wil\n-\nson and I called each other by our first names—reappeared, \ncompany commenced to arrive at the apartment door.\n\n\u0018\u0018\nFree eBooks at \nPlanet eBook.com\n‘Oh, I’ve been in several things,’ he corrected himself. ‘I \nwas in the drug business and then I was in the oil business. \nBut I’m not in either one now.’ He looked at me with more \nattention. 
‘Do you mean you’ve been thinking over what I \nproposed the other night?’\nBefore I could answer, Daisy came out of the house and \ntwo rows of brass buttons on her dress gleamed in the sun\n-\nlight.\n‘That huge place THERE?’ she cried pointing.\n‘Do you like it?’\n‘I love it, but I don’t see how you live there all alone.’\n‘I keep it always full of interesting people, night and day. \nPeople who do interesting things. Celebrated people.’\nInstead of taking the short cut along the Sound we went \ndown the road and entered by the big postern. With en\n-\nchanting murmurs Daisy admired this aspect or that of the \nfeudal silhouette against the sky, admired the gardens, the \nsparkling odor of jonquils and the frothy odor of hawthorn\n\nQuestion: summarize the pdf\nHelpful answer in markdown:", "additional_kwargs": {} } } ] ] } [llm/error] [1:llm:ChatOpenAI] [163.61s] LLM run errored with error: "Network Error"

dosubot[bot] commented 8 months ago

🤖

Hello @punit-mistry! Nice to meet you, I'm Dosu, a friendly bot here to lend a hand while you await a human maintainer. I'm all set to help you squash bugs, answer questions, and even guide you on contributing. So, don't hesitate to ask away whenever you hit a roadblock. Let's get started, shall we?

Based on the information you've provided, it seems like you're encountering a "Network Error" when executing the callChain function using the ChatOpenAI and PineconeStore from the "langchain" library version "0.0.126". This error appears to originate from the axios-fetch-adapter.js file.

There have been similar issues reported in the langchainjs repository. Here are a few that might be relevant to your situation:

1. Next.js v13.4.2 and Azure Open AI - Network Error with Axios
2. Network error in ConversationRetrievalQAChain
3. Network Error in ConversationChain

The "Network Error" could be due to a variety of reasons. It could be that the request was aborted, the request timed out, or there was a network issue. Here's the relevant code from the axios-fetch-adapter.js file that handles these cases:

```javascript
try {
  stageOne = await fetch(request);
} catch (e) {
  if (e && e.name === "AbortError") {
    return createError("Request aborted", config, "ECONNABORTED", request);
  }
  if (e && e.name === "TimeoutError") {
    return createError("Request timeout", config, "ECONNABORTED", request);
  }
  return createError("Network Error", config, "ERR_NETWORK", request);
}
```

To resolve this issue, you could try checking your network connection, ensuring the server you're making the request to is up and running, increasing the timeout limit if the request is taking too long to respond, or ensuring the request isn't being manually aborted from somewhere else in your code.
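For transient network failures, one option (not from the bot's reply, just a common pattern) is to wrap the chain invocation in a small retry helper with exponential backoff. The sketch below is self-contained and illustrative; `withRetry` is a hypothetical helper, not a langchain API. Recent langchainjs versions also accept `timeout` and `maxRetries` options on `ChatOpenAI`, which is worth verifying against your installed version:

```typescript
// A minimal retry-with-backoff sketch: retries any async call on failure,
// which can soften transient "Network Error" failures around chain.call().
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (e) {
      lastError = e;
      // Exponential backoff between attempts: 500 ms, 1 s, 2 s, ...
      if (i < attempts - 1) {
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  // All attempts failed: surface the last error to the caller.
  throw lastError;
}
```

Usage would look like `await withRetry(() => chain.call({ question, chat_history }))`. Note that retrying a streaming call may re-emit tokens, so with the streaming setup above a retry is safest around the non-streaming question-generator path.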

Unfortunately, I wasn't able to find any changes made to the axios-fetch-adapter.js file in the "langchain" library since version "0.0.126". However, it's worth noting that the latest version of langchainjs is Release 0.0.213. It might be beneficial to update to the latest version, as it could contain fixes for the issue you're experiencing.

If you need further assistance, please provide more details about your setup and the exact steps you're taking when the error occurs. This will help in diagnosing the problem more accurately.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

jacoblee93 commented 8 months ago

Sounds like the Dosu answer was helpful - please reopen if you have further questions.