withcatai / node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level
https://node-llama-cpp.withcat.ai
MIT License
829 stars 80 forks

Not working as intended. #79

Closed arthurwolf closed 10 months ago

arthurwolf commented 10 months ago

Issue description

Following the instructions as-is just doesn't work.

Expected Behavior

Working.

Actual Behavior

Not working.

Steps to reproduce

I follow the exact instructions at https://www.npmjs.com/package/node-llama-cpp (npm install and copy/paste the example code into a .js file).

When I run it (changing only the .gguf line to point at a file on my hard drive), I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:43:11
╰─⠠⠵ node structure.js
(node:2540491) Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.
(Use `node --trace-warnings ...` to show where the warning was created)
/home/arthur/dev/ai/llmi/src/structure.js:2
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";
^^^^^^

SyntaxError: Cannot use import statement outside a module
    at internalCompileFunction (node:internal/vm:73:18)
    at wrapSafe (node:internal/modules/cjs/loader:1153:20)
    at Module._compile (node:internal/modules/cjs/loader:1197:27)
    at Module._extensions..js (node:internal/modules/cjs/loader:1287:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47

Google suggests I change it to:

//import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";
const { LlamaModel, LlamaContext, LlamaChatSession } = require('node-llama-cpp');

So I do that (am I wrong, or is it impossible for the example given in the README to work as-is...?), and I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:43:15
╰─⠠⠵ node structure.js
/home/arthur/dev/ai/llmi/src/structure.js:15
const a1 = await session.prompt(q1);
           ^^^^^

SyntaxError: await is only valid in async functions and the top level bodies of modules
    at internalCompileFunction (node:internal/vm:73:18)
    at wrapSafe (node:internal/modules/cjs/loader:1153:20)
    at Module._compile (node:internal/modules/cjs/loader:1197:27)
    at Module._extensions..js (node:internal/modules/cjs/loader:1287:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47

Node.js v20.5.1

So I put the awaits inside an async wrapper:

// Async hell.
(async () => {

    const q1 = "Hi there, how are you?";
    console.log("User: " + q1);

    const a1 = await session.prompt(q1);
    console.log("AI: " + a1);

    const q2 = "Summarize what you said";
    console.log("User: " + q2);

    const a2 = await session.prompt(q2);
    console.log("AI: " + a2);

})();
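(Incidentally, once the file is an ES module, this wrapper becomes unnecessary, because top-level await is valid there. A minimal sketch of that, using a hypothetical `fakePrompt` stand-in for `session.prompt` so the snippet runs without loading a model:)

```javascript
// In an ES module (.mjs file, or "type": "module" in package.json),
// top-level await is valid, so no async IIFE is needed.
// `fakePrompt` is a hypothetical stand-in for session.prompt so this
// runs without a loaded model.
const fakePrompt = async (q) => "echo: " + q;

const q1 = "Hi there, how are you?";
console.log("User: " + q1);

const a1 = await fakePrompt(q1);
console.log("AI: " + a1);
```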

Now I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:44:31
╰─⠠⠵ node structure.js
/home/arthur/dev/ai/llmi/src/structure.js:3
const { LlamaModel, LlamaContext, LlamaChatSession } = require('node-llama-cpp');
                                                       ^

Error [ERR_REQUIRE_ESM]: require() of ES Module /home/arthur/dev/ai/llmi/src/node_modules/node-llama-cpp/dist/index.js from /home/arthur/dev/ai/llmi/src/structure.js not supported.
Instead change the require of index.js in /home/arthur/dev/ai/llmi/src/structure.js to a dynamic import() which is available in all CommonJS modules.
    at Object.<anonymous> (/home/arthur/dev/ai/llmi/src/structure.js:3:56) {
  code: 'ERR_REQUIRE_ESM'
}

Node.js v20.5.1
╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:44:51
╰─⠠⠵

At that point I just give up...

(Note: this is after nearly an hour of trying to get this module to work with ts-node and utterly failing, despite trying dozens of suggestions from Google and ChatGPT. I use thousands of npm modules in TypeScript projects, and this is the first time I've had this much trouble, which is why I fell back to running it with plain node instead of ts-node to simplify the issue. As you can see above, even that fails...)

I'm at a loss...

Any help welcome.

My Environment

Latest Ubuntu, Node 20.5.1

Additional Context

No response

Relevant Features Used

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

paul-oms commented 10 months ago

It's failing because you're using import statements outside of an ES module. Nothing to do with this project. The Node error message describes the fix: Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.

Therefore, rename your original file that uses import statements from .js to .mjs, run node filename.mjs, and it will work.
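(The rename fix can be checked in isolation; a small sketch using a built-in module so nothing needs installing, assuming `node` is on PATH:)

```shell
# Show that an import statement works once the file has the .mjs
# extension, with no package.json changes at all.
cat > /tmp/esm-demo.mjs <<'EOF'
import { basename } from "node:path";
console.log(basename("/tmp/esm-demo.mjs"));
EOF
node /tmp/esm-demo.mjs   # prints "esm-demo.mjs"
```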

giladgd commented 10 months ago

@arthurwolf node-llama-cpp is an ES module, so you can only import it; you cannot use it with require. For the import to work, your project has to be an ES module as well.

Just as the error you got says:

(node:2540491) Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.

You should add "type": "module" to the package.json of your project, and then the code will work for you.
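(A minimal sketch of that change; with this field present, every .js file in the project is treated as an ES module, so the README's import example runs as-is:)

```json
{
  "type": "module"
}
```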

I'll add a section in the getting started guide to explain this better.

github-actions[bot] commented 10 months ago

:tada: This issue has been resolved in version 2.7.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket: