withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level
https://node-llama-cpp.withcat.ai
MIT License
829 stars · 80 forks

Error [ERR_REQUIRE_ESM]: require() of ES Module #58

Closed gbourgeat closed 11 months ago

gbourgeat commented 11 months ago

Issue description

The project fails to launch with an error when using the lib.

Expected Behavior

The lib should be usable in an environment with NestJS and LangChainJS.

Actual Behavior

/srv/app/node_modules/.pnpm/langchain@0.0.159_node-llama-cpp@2.5.1/node_modules/langchain/dist/llms/llama_cpp.cjs:4
const node_llama_cpp_1 = require("node-llama-cpp");
                         ^

Error [ERR_REQUIRE_ESM]: require() of ES Module /srv/app/node_modules/.pnpm/node-llama-cpp@2.5.1/node_modules/node-llama-cpp/dist/index.js from /srv/app/node_modules/.pnpm/langchain@0.0.159_node-llama-cpp@2.5.1/node_modules/langchain/dist/llms/llama_cpp.cjs not supported.
Instead change the require of index.js in /srv/app/node_modules/.pnpm/langchain@0.0.159_node-llama-cpp@2.5.1/node_modules/langchain/dist/llms/llama_cpp.cjs to a dynamic import() which is available in all CommonJS modules.
    at Object.<anonymous> (/srv/app/node_modules/.pnpm/langchain@0.0.159_node-llama-cpp@2.5.1/node_modules/langchain/dist/llms/llama_cpp.cjs:4:26)
    at Object.<anonymous> (/srv/app/node_modules/.pnpm/langchain@0.0.159_node-llama-cpp@2.5.1/node_modules/langchain/llms/llama_cpp.cjs:1:18)
    at Object.<anonymous> (/srv/app/dist/services/predict/llama-2/llama2-7B-predict.service.js:5:21)
    at Object.<anonymous> (/srv/app/dist/app.module.js:13:37)
    at Object.<anonymous> (/srv/app/dist/main.js:4:22)

Steps to reproduce

Add the lib and try to use it in a project with NestJS.

My Environment

| Dependency | Version |
| --- | --- |
| Operating System | ubuntu 22.04 |
| Node.js | v20.8.0 |
| TypeScript | 5.2.2 |
| node-llama-cpp | 2.5.1 |

package.json dependencies:

    "@nestjs/common": "^10.2.7",
    "@nestjs/core": "^10.2.7",
    "@nestjs/platform-express": "^10.2.7",
    "cmake-js": "^7.2.1",
    "langchain": "^0.0.159",
    "node-llama-cpp": "^2.5.1",
    "reflect-metadata": "^0.1.13",
    "rxjs": "^7.8.1"

devDependencies:

    "@nestjs/cli": "^10.1.18",
    "@nestjs/schematics": "^10.0.2",
    "@nestjs/testing": "^10.2.7",
    "@types/express": "^4.17.18",
    "@types/jest": "^29.5.5",
    "@types/node": "^20.8.2",
    "@types/supertest": "^2.0.14",
    "@typescript-eslint/eslint-plugin": "^6.7.4",
    "@typescript-eslint/parser": "^6.7.4",
    "eslint": "^8.50.0",
    "eslint-config-prettier": "^9.0.0",
    "eslint-plugin-prettier": "^5.0.0",
    "jest": "^29.7.0",
    "prettier": "^3.0.3",
    "source-map-support": "^0.5.21",
    "supertest": "^6.3.3",
    "ts-jest": "^29.1.1",
    "ts-loader": "^9.4.4",
    "ts-node": "^10.9.1",
    "tsconfig-paths": "^4.2.0",
    "typescript": "^5.2.2"

Additional Context

No response

Relevant Features Used

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, but I don't know how to start. I would need guidance.

giladgd commented 11 months ago

@gbourgeat node-llama-cpp is an ESM module and thus cannot be loaded with require() the way CommonJS modules can. You can either convert your project to an ESM module, or dynamically load node-llama-cpp like the following:

async function myLogic() {
    const {LlamaModel, LlamaContext, LlamaChatSession} = await import("node-llama-cpp");

    const model = new LlamaModel({ ... });
}

myLogic();
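For the first option (converting the consuming project to ESM), the core change is a one-line package.json setting; this is a minimal sketch, and a TypeScript setup like the one above would also need a matching tsconfig `"module"` setting such as `"NodeNext"` (exact settings vary by toolchain):

```json
{
  "type": "module"
}
```

With `"type": "module"` set, Node treats `.js` files in the project as ES modules, so top-level `import ... from "node-llama-cpp"` works directly without the dynamic-import workaround.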
v4lentin1879 commented 10 months ago

This does not work for me. I get the following error message:

"UnhandledPromiseRejectionWarning: Error [ERR_REQUIRE_ESM]: require() of ES Module ... Instead change the require of index.js in ... to a dynamic import() which is available in all CommonJS modules."

I'm not sure whether I can change my whole project to be an ESM module, since I'm using Electron, which doesn't quite support ESM. @giladgd Are there any other possibilities?

giladgd commented 10 months ago

@v4lentin1879 I think the issue you're facing is that you have a transpiler that transforms all of your import statements into require calls. Since the entire Node ecosystem is moving to ESM, it is generally a good idea to switch to ESM.

To work around this issue with your current setup, you can do this instead:

async function myLogic() {
    const {LlamaModel, LlamaContext, LlamaChatSession} = await Function('return import("node-llama-cpp")')();

    const model = new LlamaModel({ ... });
}

myLogic();
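The Function-constructor trick works because a transpiler targeting CommonJS (e.g. TypeScript with `"module": "commonjs"`) statically rewrites `import()` expressions into `require()` calls, but it cannot see inside a string passed to `Function`, so the native dynamic `import()` survives into the compiled output. A runnable sketch of the same idea, using the built-in `node:path` module instead of node-llama-cpp so it can execute anywhere (the `dynamicImport` helper is illustrative, not part of any library):

```javascript
// Build the dynamic import at runtime so a CommonJS transpiler
// cannot rewrite it into a require() call.
const dynamicImport = new Function("specifier", "return import(specifier)");

async function demo() {
    // Stand-in for `await dynamicImport("node-llama-cpp")` from the thread;
    // `.default` holds the CJS exports of the imported module namespace.
    const pathModule = await dynamicImport("node:path");
    return pathModule.default.join("models", "llama-2-7b.gguf");
}

demo().then((result) => console.log(result));
```

The same helper can be reused for any ESM-only package that a CommonJS build needs to load lazily.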
v4lentin1879 commented 10 months ago

@giladgd What an easy solution. Spent quite some time on this yesterday. Thank you so much! :)

github-actions[bot] commented 10 months ago

:tada: This issue has been resolved in version 2.7.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket: