langchain-ai / langchainjs

πŸ¦œπŸ”— Build context-aware reasoning applications πŸ¦œπŸ”—
https://js.langchain.com/docs/
MIT License

Error : Cannot get final name for export 'getRuntime' of ./node_modules/openai/_shims/auto/runtime-node.mjs #5161

Open · jiveshkalra opened this issue 2 months ago

jiveshkalra commented 2 months ago


### Example Code

```javascript
const chain = ConversationalRetrievalQAChain.fromLLM(
  model,
  vectorStore.asRetriever()
);

// Format the messages
const formattedChatPrompt = await chatPrompt.formatMessages({
  text: question,
});

const res = await chain.invoke({
  question,
  chat_history: formattedChatPrompt,
});
```

### Error Message and Stack Trace (if applicable)

```
Failed to compile.

./node_modules/@langchain/openai/index.js + 66 modules
Cannot get final name for export 'getRuntime' of ./node_modules/openai/_shims/auto/runtime-node.mjs
```

### Description

I am trying to use LangChain with Next.js to host my chatbot. It works fine on the development server, but when I compile the code for deployment on Vercel, I get the error above.

### System Info

Langchain info -> 
```
langchain@0.1.34 | MIT | deps: 17 | versions: 265
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.1.34.tgz
.shasum: 3c28a6bde69c5f4ac3931376d34187de0e44e43a
.integrity: sha512-4TVH2mzhITrLPOeyfGucfqo1+360DJMl4+ItG56ApxVY1krO4hV0EkqcRLAxW06W6sIue8mIU70WbyhBYH68Ug==
.unpackedSize: 4.1 MB

dependencies:
@anthropic-ai/sdk: ^0.9.1     binary-extensions: ^2.2.0     langchainhub: ~0.0.8          p-retry: 4                    zod: ^3.22.4
@langchain/community: ~0.0.47 js-tiktoken: ^1.0.7           langsmith: ~0.1.7             uuid: ^9.0.0
@langchain/core: ~0.1.56      js-yaml: ^4.1.0               ml-distance: ^4.0.0           yaml: ^2.2.1
@langchain/openai: ~0.0.28    jsonpointer: ^5.0.1           openapi-types: ^12.1.3        zod-to-json-schema: ^3.22.3   

maintainers:
- nfcampos <nuno@boringbits.io>
- jacoblee93 <jacoblee93@gmail.com>
- andrewnguonly <andrewnguonly@gmail.com>
- davidduong <david@duong.cz>
- hwchase17 <hw.chase.17@gmail.com>
- basproul <braceasproul@gmail.com>

dist-tags:
latest: 0.1.34     next: 0.1.29-rc.0  

published 2 days ago by jacoblee93 <jacoblee93@gmail.com>
```

Platform -> Windows 10
Node -> v20.9.0
dosubot[bot] commented 2 months ago

The error you're encountering with `getRuntime` during the build for Vercel appears to be related to how Next.js resolves modules for the LangChain package. To address it, you can tell Webpack not to enforce fully specified module resolution for the `langchain` package, which is likely what breaks the `getRuntime` export. Adjust your `next.config.js` like this:

```javascript
// next.config.js
const nextConfig = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      config.resolve.fullySpecified = false;
    }

    // Relax strict ESM resolution for the langchain package only
    config.module.rules.push({
      test: /node_modules\/langchain/,
      resolve: {
        fullySpecified: false,
      },
    });

    return config;
  },
};

module.exports = nextConfig;
```

This configuration change tells Webpack to be more flexible with module resolution for the langchain package, potentially resolving the export issue you're facing.

Additionally, ensure all dependencies are compatible and up-to-date, and consider customizing the Next.js Webpack configuration to handle or transpile Node.js modules more effectively, especially if runtime-node.mjs is not essential for your application in the browser environment. Testing your application locally with next build and next start can help simulate the production environment before deploying to Vercel.
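The local production check mentioned above boils down to two commands (assuming the standard `build` and `start` scripts generated by `create-next-app`):

```shell
# Produce the optimized production build; webpack export errors surface here
npm run build

# Serve the production build locally to simulate Vercel before deploying
npm run start
```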

If these steps do not resolve the issue, providing more details about your setup and the steps leading to the error could help in diagnosing and solving the problem.

#### Details

**Sources**
- [Cannot get final name for export 'PromptTemplate' of ./node_modules/langchain/dist/prompts/index.js](https://github.com/langchain-ai/langchainjs/issues/2335)
- [Make edge compatible version](https://github.com/langchain-ai/langchainjs/issues/212)
- [Installation of 0.0.49 on next.js 13 requires installing all type of dependencies and issue 602 does not help](https://github.com/langchain-ai/langchainjs/issues/656)
- [environment_tests/test-exports-vercel/package.json](https://github.com/langchain-ai/langchainjs/blob/main/environment_tests/test-exports-vercel/package.json)
- [langchain-core/src/utils/tests/enviroment.test.ts](https://github.com/langchain-ai/langchainjs/blob/main/langchain-core/src/utils/tests/enviroment.test.ts)
- [langchain-core/src/utils/env.ts](https://github.com/langchain-ai/langchainjs/blob/main/langchain-core/src/utils/env.ts)
**About Dosu** This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a πŸ‘ if this is helpful and πŸ‘Ž if it is irrelevant.

To continue the conversation, mention @dosu.

jiveshkalra commented 2 months ago

@dosu I tried doing this

```javascript
/** @type {import('next').NextConfig} */
const nextConfig = {
  transpilePackages: ['html-to-text'],
  webpack: (config) => {
    config.externals = [
      ...config.externals,
      "hnswlib-node",
      "closevector-hnswlib-node",
      "langchain",
      "openai",
      "@langchain",
      "@langchain/community",
      "@langchain/groq",
    ];

    return config;
  },
};

export default nextConfig;
```

and now my OpenAI error is somehow gone, but a new one appears for `@langchain/core`:

```
./node_modules/@langchain/openai/index.js + 15 modules
Cannot get final name for export 'Tool' of ./node_modules/@langchain/core/tools.js

> Build failed because of webpack errors
```
This error doesn't go away even if I add `@langchain` or similar entries to my externals.
dosubot[bot] commented 2 months ago

To address the new error with `@langchain/core/tools.js`, you might need to adjust your Webpack configuration in `next.config.js`. Here's a more targeted approach:

1. **Custom Webpack configuration:** Modify your `next.config.js` to mark `@langchain/core/tools` as external. This prevents Webpack from attempting to bundle this module, which seems to be causing the issue.

   ```javascript
   // next.config.js
   module.exports = {
     webpack: (config, { isServer }) => {
       if (!isServer) {
         config.externals.push('@langchain/core/tools');
       }
       return config;
     },
   };
   ```
2. **Dynamic imports with no SSR:** If the module is client-side only, consider using Next.js's dynamic imports with SSR disabled for components that use `@langchain/core/tools`.

   ```javascript
   import dynamic from 'next/dynamic';

   const ToolComponentWithNoSSR = dynamic(
     () => import('../path/to/your/ComponentUsingTool'),
     { ssr: false }
   );
   ```

Ensure your Langchain and Next.js versions are compatible and consult Langchain's documentation or support for any additional configuration steps required for Next.js and Vercel.

Additionally, a similar issue was resolved in a previous pull request that enabled support for various environments, including Next.js. Reviewing it might provide further insights.

Let me know if this helps or if you encounter any more issues!

#### Details

**Sources**
- [Installation of 0.0.49 on next.js 13 requires installing all type of dependencies and issue 602 does not help](https://github.com/langchain-ai/langchainjs/issues/656)
- [environment_tests/test-exports-vercel/src/entrypoints.js](https://github.com/langchain-ai/langchainjs/blob/main/environment_tests/test-exports-vercel/src/entrypoints.js)
- [langchain-core/package.json](https://github.com/langchain-ai/langchainjs/blob/main/langchain-core/package.json)