withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Error Creating Jinja Template Chat Wrapper in node-llama-cpp Latest Beta Version #275

Closed · sooraj007 closed this issue 1 month ago

sooraj007 commented 1 month ago

Issue description

When attempting to run llama 3.1 using the node-llama-cpp latest beta version, an error occurs during the creation of the Jinja template chat wrapper. The specific error message indicates a problem with the sanity test for the provided Jinja template.

Expected Behavior

The Jinja template chat wrapper should be created without errors, allowing llama 3.1 to run successfully.

Actual Behavior

The process fails with an error indicating an issue with the sanity test for the Jinja template.

Error

Error: The provided Jinja template failed that sanity test: Error: Expected iterable type in for loop: got UndefinedValue
    at JinjaTemplateChatWrapper._runSanityTest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/generic/JinjaTemplateChatWrapper.ts:424:19)
    at new JinjaTemplateChatWrapper (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/generic/JinjaTemplateChatWrapper.ts:128:47)
    at resolveChatWrapper (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/utils/resolveChatWrapper.ts:228:24)
    at new LlamaChat (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/evaluator/LlamaChat/LlamaChat.ts:303:17)
    at new LlamaChatSession (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/evaluator/LlamaChatSession/LlamaChatSession.ts:303:22)
    at G:\personaldev\node-api-boilerplae-creator\node_llama\src\index.ts:18:17
    at ViteNodeRunner.runModule (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:362:5)
    at ViteNodeRunner.directRequest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:346:5)
    at ViteNodeRunner.cachedRequest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:189:14)
    at ViteNodeRunner.executeFile (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:161:12)
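For context on where this throws: judging by the stack trace, the JinjaTemplateChatWrapper constructor sanity-renders the template before accepting it, and a template that loops over a variable the wrapper never supplies fails that render. A minimal sketch of this failure mode (the {template} option shape and the placeholder template string are illustrative assumptions, not the model's real template):

import {JinjaTemplateChatWrapper} from "node-llama-cpp";

// Hypothetical stand-in for the chat template embedded in the model's GGUF
// metadata; a for loop over a variable the wrapper doesn't define is enough
// to trigger the error
const bundledChatTemplate = "{% for tool in tools %}{{ tool }}{% endfor %}";

try {
    // The constructor renders the template with a dummy conversation as a
    // sanity test (JinjaTemplateChatWrapper.ts:128 -> _runSanityTest in the
    // trace above); the undefined loop variable surfaces as
    // "Expected iterable type in for loop: got UndefinedValue"
    const wrapper = new JinjaTemplateChatWrapper({template: bundledChatTemplate});
    console.log("Template passed the sanity test", wrapper);
} catch (err) {
    console.error(err);
}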

Steps to reproduce

npm run start

The project was created using the template command provided in the documentation.
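For reference, the entry point generated by the template is roughly the following (a sketch reconstructed from the stack trace, which points at new LlamaChatSession in src/index.ts; the exact file layout and model filename are assumptions):

import path from "path";
import {fileURLToPath} from "url";
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    // The model file downloaded by the models:pull script
    modelPath: path.join(__dirname, "..", "models", "Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf")
});
const context = await model.createContext();

// The error above is thrown from this constructor while it resolves a chat
// wrapper for the model's bundled Jinja template
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

console.log(await session.prompt("Hi there, how are you?"));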

My Environment

Node.js Version: 21.x
Operating System: Windows
node-llama-cpp Version: 3.0.0-beta.39 (latest beta)

Additional Context

It's working correctly, but the error above is shown as a warning.

Relevant Features Used

Are you willing to resolve this issue by submitting a Pull Request?

No, I don’t have the time and I’m okay to wait for the community / maintainers to resolve this issue.

giladgd commented 1 month ago

Please provide a link to the model you used so I can investigate. From my tests, this model works as expected.

sooraj007 commented 1 month ago

"models:pull": "node-llama-cpp pull --dir ./models \"https://huggingface.co/bartowski/Meta-Llama-3.1-8B-Instruct-GGUF/resolve/main/Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf\"",

This is what I am using, and I created the project with npm create --yes node-llama-cpp@beta

sooraj007 commented 1 month ago

I haven't customized the code at all; I just created the project and tried to run it, and got those errors. The model is able to run, but those error messages still appear.

giladgd commented 1 month ago

I see that the model was updated since I added it to the recommendation list, and it now includes a non-standard chat template that's incompatible with generic Jinja handlers. I'll release a new beta version that uses a different source for Llama 3.1 model recommendations that includes a standard chat template, and seems to work as expected in my tests.

Thanks for reporting this issue :)
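In the meantime, a possible workaround for anyone hitting this is to skip the generic Jinja wrapper and pass an explicit chat wrapper to the session. A sketch (assuming the beta you are on exports Llama3ChatWrapper and that LlamaChatSession accepts a chatWrapper option):

import {getLlama, LlamaChatSession, Llama3ChatWrapper} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "./models/Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf"
});
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    // Use the built-in Llama 3 wrapper directly, so the model's bundled
    // Jinja template is never parsed and its sanity test never runs
    chatWrapper: new Llama3ChatWrapper()
});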

github-actions[bot] commented 1 month ago

:tada: This issue has been resolved in version 3.0.0-beta.40 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

sooraj007 commented 1 month ago

Hey bro you are great, love your commitment, thanks


sooraj007 commented 1 month ago

@giladgd Hi again, I'm facing the same problem. I pulled the latest beta and tested, and the same error is thrown.

{ "name": "node_llama", "private": true, "version": "0.0.0", "main": "./dist/index.js", "type": "module", "types": "./dist/index.d.ts", "files": [ "dist/", "package.json", "README.md" ], "exports": { ".": { "import": "./dist/index.js", "node": "./dist/index.js", "types": "./dist/index.d.ts", "default": "./dist/index.js" } }, "engines": { "node": ">=18.0.0" }, "scripts": { "postinstall": "npm run models:pull", "models:pull": "node-llama-cpp pull --dir ./models \"https://huggingface.co/bartowski/Meta-Llama-3.1-8B-Instruct-GGUF/resolve/main/Meta-Llama-3.1-8B-Instruct-Q5_K_M.gguf\"", "start": "vite-node ./src/index.ts", "start:build": "node ./dist/index.ts", "prebuild": "rimraf ./dist ./tsconfig.tsbuildinfo", "build": "tsc --build tsconfig.json --force", "lint": "npm run lint:eslint", "lint:eslint": "eslint --ext .js --ext .ts --report-unused-disable-directives .", "format": "npm run lint:eslint -- --fix", "clean": "rm -rf ./node_modules ./dist ./tsconfig.tsbuildinfo ./models" }, "dependencies": { "chalk": "^5.3.0", "node-llama-cpp": "^3.0.0-beta.40" }, "devDependencies": { "@types/node": "^20.14.2", "@typescript-eslint/eslint-plugin": "^7.12.0", "@typescript-eslint/parser": "^7.12.0", "eslint": "^8.46.0", "eslint-plugin-import": "^2.29.1", "eslint-plugin-jsdoc": "^46.9.0", "eslint-plugin-n": "^17.8.1", "rimraf": "^5.0.7", "tslib": "^2.6.3", "typescript": "^5.4.5", "vite-node": "^1.4.0" } }

G:\personaldev\node-api-boilerplae-creator\node_llama>npm run start

> node_llama@0.0.0 start
> vite-node ./src/index.ts

[node-llama-cpp] Error creating Jinja template chat wrapper. Falling back to resolve other chat wrappers.
Error: Error: The provided Jinja template failed that sanity test: Error: Expected iterable type in for loop: got UndefinedValue
    at JinjaTemplateChatWrapper._runSanityTest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/generic/JinjaTemplateChatWrapper.ts:424:19)
    at new JinjaTemplateChatWrapper (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/generic/JinjaTemplateChatWrapper.ts:128:47)
    at resolveChatWrapper (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/chatWrappers/utils/resolveChatWrapper.ts:228:24)
    at new LlamaChat (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/evaluator/LlamaChat/LlamaChat.ts:303:17)
    at new LlamaChatSession (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/node-llama-cpp/src/evaluator/LlamaChatSession/LlamaChatSession.ts:303:22)
    at G:\personaldev\node-api-boilerplae-creator\node_llama\src\index.ts:17:17
    at ViteNodeRunner.runModule (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:362:5)
    at ViteNodeRunner.directRequest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:346:5)
    at ViteNodeRunner.cachedRequest (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:189:14)
    at ViteNodeRunner.executeFile (file:///G:/personaldev/node-api-boilerplae-creator/node_llama/node_modules/vite-node/dist/client.mjs:161:12)

Screenshot attached below: [image]

giladgd commented 1 month ago

@sooraj007 I’ve updated the model recommendations to use a different source for Llama 3.1. Your code still refers to the older model.

Run this command again to generate a new project with the new model:

npm create --yes node-llama-cpp@beta
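If you would rather keep the existing project instead of regenerating it, it should also be enough to point the models:pull script in your package.json at the newly recommended model source and re-run npm run models:pull (the URL below is a placeholder; copy the real one from a freshly generated project):

"models:pull": "node-llama-cpp pull --dir ./models \"<new recommended Llama 3.1 GGUF URL>\""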