langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Getting [ERR_PACKAGE_PATH_NOT_EXPORTED]: No "exports" main defined #185

Closed liorshk closed 1 year ago

liorshk commented 1 year ago

We are getting the following error:

Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: No "exports" main defined in langchain\package.json

It started happening in versions 0.0.13 and up; 0.0.12 works fine.

evad1n commented 1 year ago

Same issue here. The exports in the package.json have switched from

    "./embeddings": {
      "import": "./embeddings.mjs",
      "default": "./embeddings.js"
    },

to

    "./embeddings": {
      "types": "./embeddings.d.ts",
      "import": "./embeddings.js"
    },

nfcampos commented 1 year ago

The langchain package is now ESM only. You can fix that by adding "type": "module" to your package.json. You can find more information here https://hwchase17.github.io/langchainjs/docs/getting-started/
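
For reference, a minimal sketch of that package.json change; the package name and version below are placeholders:

    {
      "name": "my-app",
      "type": "module",
      "dependencies": {
        "langchain": "^0.0.13"
      }
    }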

irl-dan commented 1 year ago

The langchain package is now ESM only. You can fix that by adding "type": "module" to your package.json. You can find more information here https://hwchase17.github.io/langchainjs/docs/getting-started/

@nfcampos I'm not sure I understand why this requirement exists. Is this on purpose or will this be changed? Happy to open up a PR with the fix if that would be accepted.

nfcampos commented 1 year ago

@irl-dan we converted the package to ESM so that we can support other environments outside of Node, which are ESM only, discussion here https://github.com/hwchase17/langchainjs/discussions/152

You also have the option to use it without type: module by using dynamic import as mentioned here https://hwchase17.github.io/langchainjs/docs/getting-started/#commonjs-in-nodejs
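
As a rough sketch, the dynamic-import approach from a CommonJS file looks like the following (the API key and prompt are placeholders). Note that if you compile TypeScript with module: commonjs, the import() call may be transpiled to require(), which is exactly the problem discussed below.

    // Sketch: calling the ESM-only package from CommonJS via dynamic import().
    async function run() {
      const { OpenAI } = await import('langchain');
      const model = new OpenAI({ openAIApiKey: process.env.OPENAI_API_KEY, temperature: 0 });
      const res = await model.call('Say hello');
      console.log(res);
    }

    run().catch(console.error);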

sboudouk commented 1 year ago

Hello.

Using dynamic import on my side produces the same error.

const { OpenAI } = await import('langchain');

and

const { AnalyzeDocumentChain, loadSummarizationChain } = await import('langchain/chains');

tsconfig.json:

{
  "compilerOptions": {
    "module": "commonjs",
    "declaration": true,
    "removeComments": true,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "allowSyntheticDefaultImports": true,
    "target": "es2017",
    "sourceMap": true,
    "outDir": "./dist",
    "baseUrl": "./",
    "incremental": true,
    "skipLibCheck": true,
    "strictNullChecks": false,
    "noImplicitAny": false,
    "strictBindCallApply": false,
    "forceConsistentCasingInFileNames": false,
    "noFallthroughCasesInSwitch": false,
    "resolveJsonModule": true
  }
}

node version:

v18.12.1

Any idea?

Note that changing es2017 to es2020 does not make any difference.

nfcampos commented 1 year ago

Some more information on your issue here https://stackoverflow.com/questions/65265420/how-to-prevent-typescript-from-transpiling-dynamic-imports-into-require

A solution for you appears to be "moduleResolution": "node16", see here https://github.com/microsoft/TypeScript/issues/43329#issuecomment-1315512792
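
For reference, a minimal tsconfig sketch with that setting (all other options omitted). With module and moduleResolution set to node16, TypeScript preserves dynamic import() calls in CommonJS files instead of transpiling them to require():

    {
      "compilerOptions": {
        "module": "node16",
        "moduleResolution": "node16",
        "target": "es2020"
      }
    }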

sboudouk commented 1 year ago

Thanks for the fast reply.

Doing so, with:

    const { OpenAI } = await import_('langchain');
    const { AnalyzeDocumentChain, loadSummarizationChain } = await import_('langchain/chains');
    const { RecursiveCharacterTextSplitter } = await import_('langchain/text_splitter');

    const text = this.speakerRecognitionPromptFormatting(
      JSON.parse((conversation as any).toString()).prediction,
      'ok',
    );

    // text is a string.

    const model = new OpenAI({
      temperature: 0,
      openAIApiKey: ...,
    });
    /** Load the summarization chain. */
    const chain = loadSummarizationChain(model);
    const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
    const docs = await textSplitter.createDocuments([text]);
    const res = await chain.call({
      input_document: docs,
    });

This results in an Error: Document key input_documents not found. error. I also tried AnalyzeDocumentChain from the documentation snippet.

Do you want me to create a separate issue?

nfcampos commented 1 year ago

I think the issue might be a missing s on input_document on the last line; it should be input_documents
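
That is, keeping everything else from the snippet above, the last call becomes:

    const res = await chain.call({
      input_documents: docs,
    });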

sboudouk commented 1 year ago

Yep, it is. Thanks. Can I submit a PR to the documentation for this single, simple correction?

nfcampos commented 1 year ago

Yes, thank you

sboudouk commented 1 year ago

Unfortunately it might not be only a documentation issue. Running:

    const text = this.speakerRecognitionPromptFormatting(
      JSON.parse((conversation as any).toString()).prediction,
      'oki',
    );
    console.log('text :', text);
    const model = new OpenAI({
      temperature: 0,
      openAIApiKey: ...,
    });
    /** Load the summarization chain. */
    const combineDocsChain = loadSummarizationChain(model);
    /** Pass this into the AnalyzeDocumentChain. */
    const chain = new AnalyzeDocumentChain({
      combineDocumentsChain: combineDocsChain,
    });
    const res = await chain.call({
      input_documents: text,
    });

With input_documents versus input_document (the s change) I get different errors. When I try to match the code, I get hit by TypeError: currentDocs.map is not a function, occurring in combine_docs_chain.js.

I feel like I might be doing something wrong. Note that the text I'm sending is just a string, not coming from any file.

nfcampos commented 1 year ago

Ah, input_documents is expected to be an array of Document. You can get an array of Documents from e.g. a TextSplitter or a Loader; see e.g. the second code block on this page: https://hwchase17.github.io/langchainjs/docs/modules/chains/summarization
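
Putting that together with the earlier snippet, a sketch of the corrected flow:

    const { RecursiveCharacterTextSplitter } = await import_('langchain/text_splitter');

    // Split the raw string into an array of Document objects.
    const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
    const docs = await textSplitter.createDocuments([text]);

    // input_documents must be Document[], not a plain string.
    const res = await chain.call({
      input_documents: docs,
    });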

ztratar commented 1 year ago

When will langchain support CommonJS? Turning my backend package into a module is problematic for many other packages that rely on CommonJS patterns.

sboudouk commented 1 year ago

If it helps anyone running CommonJS (pretty common, to be fair), you can import the langchain modules dynamically, without changing any config, by using this import package:

https://www.npmjs.com/package/@brillout/import

sethgw commented 1 year ago

I was struggling with this in a NextJS application. Fortunately, I'm in a monorepo environment and was able to get it to work by creating a new "package" with type: "module" and then having NextJS transpile that package.

My interface to langchain goes through the new package and NextJS is happy.

next.config.js

module.exports = {
  transpilePackages: ["my-ai-package-w-langchain"],
};

tmgauss commented 1 year ago

I was able to get around it with the following code (without type: "module" in package.json):

// tsconfig.json

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "nodenext"
  }
}
// main.ts

const main = async () => {
  const { OpenAI } = await import('langchain');

  const llm = new OpenAI({
    openAIApiKey: '-----',
  });

  const res = await llm.call(
    'your prompt'
  );
  console.log(res);
};

main()
  .then(() => {
    console.log('[done]');
  })
  .catch((e) => {
    console.error(e);
  });
% ts-node path/to/main.ts

nsbradford commented 1 year ago

Just hitting this now, +1 to CommonJS support. The only workaround I found that "worked" is brillout, mentioned above:

import { import_ } from '@brillout/import';

async function main() {
  const { OpenAI } = await import_('langchain');
  const model = new OpenAI({ openAIApiKey: process.env.OPENAI_API_KEY!, temperature: 0.9 });
  const res = await model.call(
    'What would be a good company name a company that makes colorful socks?',
  );
  console.log(res);
}

Granted, I cannot get it to actually send requests...

ReferenceError: Headers is not defined
    at createRequest (file:///Users/nickbradford/dev/node_modules/langchain/src/util/axios-fetch-adapter.js:273:19)
    at fetchAdapter (file:///Users/nickbradford/dev/node_modules/langchain/src/util/axios-fetch-adapter.js:196:19)
    at dispatchRequest (/Users/nickbradford/dev/node_modules/axios/lib/core/dispatchRequest.js:58:10)
    at Axios.request (/Users/nickbradford/dev/node_modules/axios/lib/core/Axios.js:108:15)
    at Function.wrap [as request] (/Users/nickbradford/dev/node_modules/axios/lib/helpers/bind.js:9:15)
    at /Users/nickbradford/dev/node_modules/openai/dist/common.js:149:22
    at /Users/nickbradford/dev/node_modules/openai/dist/api.js:1738:133

Penggeor commented 1 year ago

@nfcampos I'm encountering the same issue...

Penggeor commented 1 year ago

@nfcampos I found the reason: we're missing the polyfill for node-fetch:

import { import_ } from '@brillout/import'
import { Logger } from '@nestjs/common'

export const openaiTranslate = async (text: string, target: string[]) => {
  // Polyfill for node-fetch
  const  { Headers, Request, Response } = await import_('node-fetch')
  const fetch = await import_('node-fetch').then((mod) => mod.default)
  if (!globalThis.fetch) globalThis.fetch = fetch
  if (!globalThis.Headers) globalThis.Headers = Headers
  if (!globalThis.Request) globalThis.Request = Request
  if (!globalThis.Response) globalThis.Response = Response

  const { OpenAI } = await import_('langchain')
  const OPENAI_API_KEY = 'sk-xxx'
  const model = new OpenAI({ openAIApiKey: OPENAI_API_KEY, temperature: 0.9 })

  const res = await model.call(
    'What would be a good company name a company that makes colorful socks?'
  )

  Logger.log(res)

  return res
}

kdawgwilk commented 1 year ago

I think where this gets confusing is that the docs tell CommonJS users to use a dynamic import, but CommonJS modules are not allowed to await at the top level of a module. This makes the docs feel a bit counterintuitive.
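
For what it's worth, the usual workaround is to defer the import into an async function rather than awaiting at the top level of the module; a rough sketch (the API key and prompt are placeholders):

    // CommonJS cannot await at the top level, so wrap the dynamic import in an async function.
    async function getModel() {
      const { OpenAI } = await import('langchain');
      return new OpenAI({ openAIApiKey: process.env.OPENAI_API_KEY, temperature: 0 });
    }

    getModel()
      .then((model) => model.call('Say hello'))
      .then(console.log)
      .catch(console.error);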

dangell7 commented 1 year ago

@irl-dan we converted the package to ESM so that we can support other environments outside of Node, which are ESM only, discussion here #152

You also have the option to use it without type: module by using dynamic import as mentioned here https://hwchase17.github.io/langchainjs/docs/getting-started/#commonjs-in-nodejs

Page not found.

nfcampos commented 1 year ago

Updated link https://js.langchain.com/docs/getting-started/install

imrank1 commented 1 year ago

Would there be interest in exporting both CommonJS and ESM builds? I'd be interested in taking that on.

nfcampos commented 1 year ago

@imrank1 the reason we made the library ESM-only is that it allows us to use ESM-only dependencies. To add a CommonJS export you'd have to review every existing dependency to confirm that it isn't ESM-only.

imrank1 commented 1 year ago

I see, thanks.

Curious if anyone here has a working NestJS example to look at. The options mentioned here don't work for me; using the await method breaks other areas of NestJS. I'll post what I have in a bit.

nfcampos commented 1 year ago

@imrank1 see an example here https://github.com/sullivan-sean/chat-langchainjs/

Or maybe better to look at my branch updating it to latest langchain version https://github.com/nfcampos/chat-langchainjs/tree/nc/lc0037

imrank1 commented 1 year ago

@imrank1 see an example here https://github.com/sullivan-sean/chat-langchainjs/

Or maybe better to look at my branch updating it to latest langchain version https://github.com/nfcampos/chat-langchainjs/tree/nc/lc0037

Thanks, the first reference worked for me.

Your fork off of the latest version threw an error for me. I followed the exact steps in the README for both.

yarn && yarn ingest 
yarn install v1.22.19
[1/4] 🔍  Resolving packages...
[2/4] 🚚  Fetching packages...
[3/4] 🔗  Linking dependencies...
[4/4] 🔨  Building fresh packages...
✨  Done in 5.35s.
yarn run v1.22.19
$ tsx -r dotenv/config ingest.ts
Loader created.
Docs splitted.
Creating vector store...
/Users/imrankhawaja/chat-langchainjs/node_modules/langchain/dist/vectorstores/hnswlib.js:44
        return new HierarchicalNSW(args.space, args.numDimensions);
               ^

Error: Wrong space name, expected "l2" or "ip".
    at Function.getHierarchicalNSW (/Users/imrankhawaja/chat-langchainjs/node_modules/langchain/dist/vectorstores/hnswlib.js:44:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at HNSWLib.initIndex (/Users/imrankhawaja/chat-langchainjs/node_modules/langchain/dist/vectorstores/hnswlib.js:51:26)
    at HNSWLib.addVectors (/Users/imrankhawaja/chat-langchainjs/node_modules/langchain/dist/vectorstores/hnswlib.js:70:9)
    at Function.fromDocuments (/Users/imrankhawaja/chat-langchainjs/node_modules/langchain/dist/vectorstores/hnswlib.js:147:9)
    at run (/Users/imrankhawaja/chat-langchainjs/ingest.ts:77:23)
    at <anonymous> (/Users/imrankhawaja/chat-langchainjs/ingest.ts:82:3)

Let me know if I should log an issue in your fork.

nfcampos commented 1 year ago

@imrank1 Ah, thanks for letting me know. I've pushed a commit fixing that; the version of hnswlib-node needed to be updated.

imrank1 commented 1 year ago

@imrank1 Ah, thanks for letting me know. I've pushed a commit fixing that; the version of hnswlib-node needed to be updated.

Verified it works!

brishin commented 1 year ago

Also found this problematic, given that top-level awaits aren't allowed. It prevents extending any of the base classes, like below:

const { BaseDocumentLoader } = await import('langchain/document_loaders');

export default class SanitizeHtmlLoader extends BaseDocumentLoader {
}
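
One possible workaround (a sketch, not something from the docs) is to move the class declaration inside an async factory so the import can be awaited before the subclass is defined; the load() body here is illustrative only:

    // Sketch: declare the subclass after awaiting the ESM imports.
    export async function createSanitizeHtmlLoader(html: string) {
      const { BaseDocumentLoader } = await import('langchain/document_loaders');
      const { Document } = await import('langchain/document');

      class SanitizeHtmlLoader extends BaseDocumentLoader {
        constructor(private readonly html: string) {
          super();
        }

        // Real HTML sanitization omitted; return the text as a single Document.
        async load() {
          return [new Document({ pageContent: this.html })];
        }
      }

      return new SanitizeHtmlLoader(html);
    }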

nfcampos commented 1 year ago

Hi folks, we have an open PR adding CJS support: #626. If you want to give it a try, you can install the prerelease version with npm i langchain@next. Can you let me know if it works for you, before we announce it for everyone?

imrank1 commented 1 year ago

@nfcampos Brilliant! I had to uninstall and reinstall d3-dsv (I think because of a version change), but this works well with the following in my tsconfig.json:

{
  "compilerOptions:" {
    // ...
    "module": "commonjs",
    "moduleResolution": "node",
    // ...
  }
}

Thanks a bunch, this makes life a lot easier.

I can confirm this worked for me as well. Thanks!

nfcampos commented 1 year ago

Hi, we've released the new version 0.0.49, which adds a CJS build and fixes this issue.
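
With the CJS build in place, a plain require from a CommonJS file should also work; a minimal sketch (the API key and prompt are placeholders):

    // Sketch: plain CommonJS usage once the CJS build (0.0.49+) is installed.
    const { OpenAI } = require('langchain');

    const model = new OpenAI({ openAIApiKey: process.env.OPENAI_API_KEY, temperature: 0 });
    model.call('Say hello').then(console.log).catch(console.error);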

ashburnham commented 1 year ago

I really can't get this to work. I'm using 0.0.50; it seems to have issues with TypeScript.

{
  "include": [
    "src/**/*",
  ],
  "compilerOptions": {
    "target": "es2020", 
    "module": "commonjs",         /* I need this for other modules */
    "outDir": "./dist",
    "strict": true, 
    "baseUrl": "./src",
    "skipLibCheck": true,        /* Remove this when langchain type check works. */
    "typeRoots": [
      "node_modules/@types"
    ], 
    "types": [
      "node",
      "jest"
    ],
    "lib":  [
      "es2020",
      "dom",
    ],
    "esModuleInterop": true,      /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
    "inlineSourceMap": true,
    "resolveJsonModule": true,
    "moduleResolution": "node",
    "allowJs": true
  }
}

Anyone else have the same issues?

SantoshKumarRavi commented 10 months ago

@nfcampos I tried with langchain 0.0.49 as you mentioned. The path error is solved, but I'm not able to use new packages: module not found (see the attached screenshot).

TonyH0401 commented 3 months ago

Hello, I encountered this same error. I deleted my node_modules and package-lock.json, changed to "type": "module", and ran npm i to re-download everything, and it seems to be working.

suryakun commented 2 months ago

I added "type": "module" and got the same kind of error:

Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './document_loaders/base' is not defined by "exports" in /Users/suryasurakhman/Documents/langchain/node_modules/@langchain/core/package.json imported from /Users/suryasurakhman/Documents/langchain/node_modules/@langchain/community/node_modules/langchain/dist/document_loaders/base.js

Here is the simple code

import 'pdf-parse'
import { PDFLoader } from "@langchain/community/document_loaders/fs/pdf";
const loader = new PDFLoader('./file.pdf')

const docs = await loader.load();

console.log(docs.length)

Anyone have the same issue?

suryakun commented 2 months ago

Hello, I encountered this same error. I deleted my node_modules and package-lock.json, changed to "type": "module", and ran npm i to re-download everything, and it seems to be working.

Seems this doesn't work for me. I am using Node v22.