Closed · jacoblee93 closed this issue 5 months ago
@jacoblee93 `node-llama-cpp` is an ESM-only module. The issue you mentioned here happens when trying to use `require(...)` to load an ESM module instead of using `import`; in this case, the user code does `require("node-llama-cpp")` instead of using `import` for it.
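As a concrete sketch of the workaround for CommonJS callers, dynamic `import()` is available even in CommonJS files and can load ES modules, while a plain `require()` of an ESM-only package fails. The built-in `node:url` module stands in for `node-llama-cpp` here (an assumption made purely for illustration):

```javascript
// Sketch: loading an ES module from CommonJS code with dynamic import().
// The built-in "node:url" module stands in for an ESM-only package such
// as node-llama-cpp (an assumption made purely for illustration).
async function loadModule() {
    // import() works in CommonJS files; it returns a promise for the
    // module's namespace object, so the caller must await it.
    const { fileURLToPath } = await import("node:url");
    return fileURLToPath("file:///tmp/model.gguf");
}

loadModule().then((p) => console.log(p)); // "/tmp/model.gguf" on POSIX
```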
Upon investigating `@langchain/community`, I found that it ships two compiled versions of the code: an ESM one and a CommonJS one. You can inspect it here; under the `dist/utils` directory there are both a `llama_cpp.js` file (ESM) and a `llama_cpp.cjs` file (CommonJS).
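Shipping both variants like this is typically wired up through conditional `exports` in `package.json`, so that Node picks the right file per consumer. A sketch with illustrative paths, not necessarily `@langchain/community`'s actual configuration:

```json
{
    "exports": {
        "./utils/llama_cpp": {
            "import": "./dist/utils/llama_cpp.js",
            "require": "./dist/utils/llama_cpp.cjs"
        }
    }
}
```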
When a user uses the ESM version of `@langchain/community`, `@langchain/community` then uses `import` with `node-llama-cpp`, so everything works fine. But when a user uses the CommonJS version of `@langchain/community` (by doing `require("@langchain/community")`), `@langchain/community` then uses `require` for `node-llama-cpp`, hence this issue.
Some users might think they're using `import` while their TypeScript compiler actually transpiles their code to use `require`.
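Whether `import` statements survive compilation is controlled by the `module` option in `tsconfig.json`. A sketch of the two relevant settings (these are standard TypeScript compiler options, not anything specific to these packages):

```jsonc
{
    "compilerOptions": {
        // Emits require() calls - import statements are transpiled away,
        // so loading an ESM-only package fails at runtime:
        "module": "commonjs"

        // Alternatively, preserves import/export syntax; package.json
        // "type" and file extensions then decide the module format:
        // "module": "nodenext"
    }
}
```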
From my experience, supporting both ESM and CommonJS in a Node module is painful because of exactly these kinds of issues that stem from module dependencies. Since some of the modules used in `node-llama-cpp` are ESM-only and the entire Node ecosystem is migrating to ESM, I've decided to make this module ESM-only.
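For reference, a package declares itself ESM-only through its `package.json`; the paths below are illustrative, not `node-llama-cpp`'s actual layout:

```json
{
    "type": "module",
    "main": "./dist/index.js",
    "exports": "./dist/index.js"
}
```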
Here are a few suggestions for how you can solve the issue that users are facing:

1. `import` the `@langchain/community` module instead of using `require` when using `node-llama-cpp`. This is not always easy when users have their TypeScript compiler configured to transpile `import` statements into `require` statements. They can use the hack in the next suggestion, but it would make their code more cumbersome.
2. Use a dynamic `import` in the CommonJS code inside `@langchain/community` that uses `node-llama-cpp`. To make sure that TypeScript doesn't transpile this specific code to use `require(...)`, you can do something like this:

    ```typescript
    export async function getModel(...) {
        const {LlamaModel} = await Function('return import("node-llama-cpp")')();

        const model = new LlamaModel({ ... });
        return model;
    }
    ```

    The con of this approach is that it forces you to change the interface to expose only async functions, which would degrade the developer experience for both CommonJS and ESM users.
3. Adopt the same approach as `node-llama-cpp` and make `@langchain/community` ESM-only. If you notice that people encounter similar issues with other module integrations, doing this will lower the maintenance burden. Although this may be a good idea in the long term, it is not a simple solution, as it is a breaking change.
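The `Function`-constructor trick described above can be exercised with a built-in module standing in for `node-llama-cpp` (an assumption purely for illustration). Because the `import()` expression lives inside a dynamically created function body, the TypeScript compiler never sees it and therefore cannot rewrite it into `require()`:

```javascript
// Sketch: hiding a dynamic import() from the TypeScript compiler by
// building the importing function at runtime. The built-in "node:os"
// module stands in for node-llama-cpp (assumption for illustration).
async function getPlatform() {
    // The function body is just a string here, so the compiler cannot
    // see the import() call and cannot transpile it into require().
    const os = await Function('return import("node:os")')();
    return os.platform();
}

getPlatform().then((p) => console.log("platform:", p));
```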
Closing due to inactivity.
If you still encounter issues with `node-llama-cpp`, let me know and I'll try to help.
Ah got a bit sidetracked. But thank you for the response, will try that out!
### Feature Description

Hey folks, first of all thanks for the fantastic work!

I'm a LangChain.js maintainer, and we've had a few folks struggle to use `node-llama-cpp` through our integration with ESM (for example: https://github.com/langchain-ai/langchainjs/issues/4181). They get errors like the ones in the linked issue. I think it may be possible to fix this using a solution like the one above, but it would be nice to support ESM natively.
### The Solution

Configure the package to export both `.mjs` and `.cjs` files.

### Considered Alternatives
Some kind of dynamic import within the LangChain wrapper, but adding this would help others use `node-llama-cpp` directly.

### Additional Context
I can try to have a look at this as well, but can't promise any timeline.
### Related Features to This Feature Request

### Are you willing to resolve this issue by submitting a Pull Request?
No, I don’t have the time and I’m okay to wait for the community / maintainers to resolve this issue.