withcatai / node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level
https://node-llama-cpp.withcat.ai
MIT License

support commonjs #100

Closed: mainfraame closed this issue 9 months ago

mainfraame commented 9 months ago

Feature Description

This library is currently integrated into langchain, and because it only supports ES modules, it is limiting the adoption of both langchain and this library in the ecosystem.

It's also important to note that you can't just switch a NestJS project's package.json to "type": "module"; most of the tooling and the @nestjs modules themselves are not compatible.

The Solution

The easiest approach would be to modify the build and package.json to ship both CommonJS and ES module builds, as sketched below. It might also be possible to keep the package ESM-only if the output file extension is set explicitly to .mjs.
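
A rough sketch of the kind of dual-entry package.json I have in mind (the dist file names here are illustrative, not the library's actual build output):

{
  "main": "./dist/index.cjs",
  "module": "./dist/index.mjs",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}

With conditional exports like these, require("node-llama-cpp") and import "node-llama-cpp" would each resolve to the matching build.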

Considered Alternatives

There are no alternatives. In its current state, you cannot use langchain or this library at all with NestJS. I've also spent a considerable amount of time trying workarounds for both webpack and non-webpack builds with NestJS.

Additional Context

This issue was raised previously and was closed on the grounds that the Node ecosystem is moving towards ES modules. That is true, but it is still a long way off. The Node working group specifically created the ability to support both ES modules and CommonJS from a single package as a bridge to ESM. By not providing that other build, you're severely limiting the reach of this library as well as langchain.


Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

mainfraame commented 9 months ago

An attempt to work around this issue by bundling langchain and this library with webpack:

[screenshot]

It still has compilation issues:

[screenshot]
mainfraame commented 9 months ago

Changing main to module in package.json causes this in webpack:

[screenshot]
mainfraame commented 9 months ago

Building with ts-loader:

[Nest] 67238  - 11/18/2023, 6:02:52 PM   ERROR [ExceptionHandler] node_llama_cpp_1.LlamaModel is not a constructor
TypeError: node_llama_cpp_1.LlamaModel is not a constructor
    at createLlamaModel (/Users/dayne/Sites/chat/packages/server/dist/webpack:/node_modules/langchain/dist/util/llama_cpp.cjs:13:1)
    at new LlamaCpp (/Users/dayne/Sites/chat/packages/server/dist/webpack:/node_modules/langchain/dist/llms/llama_cpp.cjs:71:1)
    at LlamaService.<instance_members_initializer> (/Users/dayne/Sites/chat/packages/server/dist/webpack:/@hermes/server/src/llama/llama.service.ts:27:11)
    at new LlamaService (/Users/dayne/Sites/chat/packages/server/dist/webpack:/@hermes/server/src/llama/llama.service.ts:26:8)
    at Injector.instantiateClass (/Users/dayne/Sites/chat/packages/server/dist/webpack:/node_modules/@nestjs/core/injector/injector.js:365:1)
    at callback (/Users/dayne/Sites/chat/packages/server/dist/webpack:/node_modules/@nestjs/core/injector/injector.js:65:1)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)

node_modules/langchain/dist/util/llama_cpp.cjs:

const node_llama_cpp_1 = require("node-llama-cpp");
function createLlamaModel(inputs) {
    const options = {
        gpuLayers: inputs?.gpuLayers,
        modelPath: inputs.modelPath,
        useMlock: inputs?.useMlock,
        useMmap: inputs?.useMmap,
        vocabOnly: inputs?.vocabOnly,
    };
    return new node_llama_cpp_1.LlamaModel(options);
}
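
If I'm reading the failure correctly (my interpretation, not verified against webpack internals), the root cause is that node-llama-cpp only ships an ESM entry point, so langchain's CJS build pulling it in via require() can't work as written:

// Illustrative sketch of the failure mode (my interpretation):
const node_llama_cpp_1 = require("node-llama-cpp"); // ESM-only package
// In plain Node this require() throws ERR_REQUIRE_ESM. Under webpack's
// interop it resolves instead, but the named export is undefined:
console.log(node_llama_cpp_1.LlamaModel); // undefined
// hence "node_llama_cpp_1.LlamaModel is not a constructor" at call time.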
giladgd commented 9 months ago

@mainfraame This module is only supported in Node.js and isn't meant to be used in the browser, so I wonder why you're trying to use it together with webpack.

I do not plan to change this library to use CommonJS instead of ESM, as that would make it much more complicated to maintain and develop, and ESM is the official way forward in Node.js.

Many modules have already made the switch to ESM, and importing an ESM module from inside a CommonJS module is a pain, so doing that for many modules is not something I'd like to do (see chalk, for example). CommonJS also brings a myriad of incompatibility issues with the tooling people use, since it requires transpilation much more often, which leads to far more problems around consuming a module than ESM does.

If you'd like to import this library from inside a CommonJS project without changing the project to be an ES module, you can do this:

async function myLogic() {
    const {LlamaModel, LlamaContext, LlamaChatSession} = await import("node-llama-cpp");

    const model = new LlamaModel({ ... });
}

myLogic();
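
(This works because Node.js allows the dynamic import() expression inside CommonJS files; it's only top-level import statements and require() of an ESM-only package that fail there.)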

And if your tsconfig.json is configured to transpile import into require automatically, you can do this instead:

async function myLogic() {
    const {LlamaModel, LlamaContext, LlamaChatSession} = await Function('return import("node-llama-cpp")')();

    const model = new LlamaModel({ ... });
}

myLogic();
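
The Function wrapper matters because when TypeScript compiles to CommonJS it rewrites import() expressions into require() calls, which breaks for an ESM-only package; building the import inside Function(...) hides it from the compiler, so a native dynamic import still runs at runtime.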