withcatai / node-llama-cpp

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level
https://withcatai.github.io/node-llama-cpp/
MIT License

docs: cannot read properties of undefined (reading '_chatGrammar') on penalty example #130

Closed · scenaristeur closed this issue 6 months ago

scenaristeur commented 7 months ago

What was unclear or otherwise insufficient?

First, using plain JavaScript (not TypeScript), running the code from

https://withcatai.github.io/node-llama-cpp/guide/chat-session#repeat-penalty-customization I get:

const context = new LlamaContext({model});
      ^

SyntaxError: Identifier 'context' has already been declared

Next, using ctx in place of context, like this:

const ctx = new LlamaContext({model});
const session = new LlamaChatSession({
    ctx
});

I get:

file:///dev/node-llama/node_modules/node-llama-cpp/dist/llamaEvaluator/LlamaChatSession.js:61
    async prompt(prompt, { onToken, signal, maxTokens, temperature, topK, topP, grammar = this.context._chatGrammar, trimWhitespaceSuffix = false, repeatPenalty } = {}) {
                                                                                                       ^

TypeError: Cannot read properties of undefined (reading '_chatGrammar')
    at LlamaChatSession.prompt (file:///node-llama/node_modules/node-llama-cpp/dist/llamaEvaluator/LlamaChatSession.js:61:104)
    at file:///dev/node-llama/penalty.js:21:26
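
(A note on why this throws: the shorthand {ctx} passes an option literally named ctx, so LlamaChatSession presumably never receives a context option and this.context stays undefined when the prompt default reads this.context._chatGrammar. Assuming the v2.x constructor that appears in the stack trace, the option has to be named context, for example:

const session = new LlamaChatSession({context: ctx});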

Recommended Fix

Something does not work.

Additional Context

No response

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, but I don't know how to start. I would need guidance.

Oh, I got it: you should remove this line (an accidentally auto-added import of context from esbuild, which conflicts with the const context declaration and caused the first error):

import {context} from "esbuild";
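
For reference, a minimal working sketch of the penalty example in plain JavaScript, assuming the v2.x API visible in the stack trace (LlamaModel, LlamaContext, LlamaChatSession) and a placeholder model path:

import path from "path";
import {fileURLToPath} from "url";
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const model = new LlamaModel({
    // placeholder path - point this at a GGUF model you have locally
    modelPath: path.join(__dirname, "models", "model.gguf")
});

// keep the variable named "context" (and make sure nothing else,
// such as an auto-added esbuild import, already uses that identifier)
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

const answer = await session.prompt("Write a short greeting.", {
    // repeat penalty options as described in the linked guide
    repeatPenalty: {
        lastTokens: 24,
        penalty: 1.12
    }
});
console.log(answer);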

github-actions[bot] commented 6 months ago

:tada: This issue has been resolved in version 2.8.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

github-actions[bot] commented 6 months ago

:tada: This issue has been resolved in version 3.0.0-beta.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket: