withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Naming the `LlamaModel` anything but `model` makes the app crash when creating a new `LlamaContext` #120

Closed: stduhpf closed this issue 8 months ago

stduhpf commented 8 months ago

Issue description

As the title says, I can't create a `LlamaContext` for a model variable that isn't named `model`.

Expected Behavior

I should be able to name my variables however I want.

Actual Behavior

I can't.

Steps to reproduce

```js
import "dotenv/config";

import {
  LlamaModel,
  LlamaContext,
} from "node-llama-cpp";

import path from "path";

const llama_root = process.env.LLAMA_MODEL_PATH;

const llamaModel = new LlamaModel({
  modelPath: path.join(llama_root, "models", "7B", "ggml-model_Q5_K_M"),
});

// Shorthand property: this passes { llamaModel: ... }, not { model: ... }
new LlamaContext({ llamaModel, threads: 4 });

console.log("Here!");
```

Crashes with:

```
file:///.../llamabot/node_modules/node-llama-cpp/dist/llamaEvaluator/LlamaContext.js:13
    constructor({ model, prependBos = true, grammar, seed = model._contextOptions.seed, contextSize = model._contextOptions.contextSize, batchSize = model._contextOptions.batchSize, logitsAll = model._contextOptions.logitsAll, embedding = model._contextOptions.embedding, threads = model._contextOptions.threads }) {
                                                                  ^
TypeError: Cannot read properties of undefined (reading '_contextOptions')
    at new LlamaContext (file:///.../llamabot/node_modules/node-llama-cpp/dist/llamaEvaluator/LlamaContext.js:13:67)
    at ...
```

but

```js
import "dotenv/config";

import {
  LlamaModel,
  LlamaContext,
} from "node-llama-cpp";

import path from "path";

const llama_root = process.env.LLAMA_MODEL_PATH;

const model = new LlamaModel({
  modelPath: path.join(llama_root, "models", "7B", "ggml-model_Q5_K_M"),
});

// Shorthand works here because the variable is named `model`
new LlamaContext({ model, threads: 4 });

console.log("Here!");
```

prints `Here!` as expected and exits without a crash.
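
If I read the stack trace right, this is just ES2015 shorthand property semantics: `{ llamaModel }` is sugar for `{ llamaModel: llamaModel }`, so the constructor's destructured `model` parameter is `undefined` and default-value expressions like `model._contextOptions.seed` throw. A minimal sketch of the same failure mode (`makeContext` and `defaultSeed` are made-up names, just to illustrate):

```js
// Stand-in for the LlamaContext constructor's destructured options
function makeContext({ model, seed = model.defaultSeed }) {
  return { model, seed };
}

const llamaModel = { defaultSeed: 42 };

makeContext({ model: llamaModel }); // ok: explicit key matches the parameter name
makeContext({ llamaModel });        // TypeError: shorthand creates a `llamaModel` key,
                                    // so `model` is undefined when the default evaluates
```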

My Environment

| Dependency | Version |
| --- | --- |
| Operating System | Windows 10 |
| CPU | Ryzen 9 5900X |
| Node.js version | 1.1.9 |
| TypeScript version | doesn't matter here; this is a runtime issue |
| node-llama-cpp version | 2.8.2 |

Additional Context

No response

Relevant Features Used

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, but I don't know how to start. I would need guidance.

stduhpf commented 8 months ago

I mean, I can just specify `model: llamaModel`. I only realized this after submitting the issue... Should I close it already?
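
For anyone who finds this later, the explicit property name is the whole workaround; no other change is needed:

```js
// Map the local variable onto the `model` option the constructor expects
new LlamaContext({ model: llamaModel, threads: 4 });
```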