Closed tobiasgrossmann closed 1 year ago
No error, and the callback method is never called.
```javascript
await dalai.request({
  // model: `llama.13B`,
  model: `${app.locals.configuration.model_type}.${app.locals.configuration.model}`,
  prompt: "Hallo World:",
  skip_end: true,
}, (token) => {
  console.debug("token", token);
  process.stdout.write(token);
  res.write(token);
})
```
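For reference, here is a minimal, runnable sketch of the same token-streaming pattern with the dalai call stubbed out. The `fakeRequest` function below is hypothetical (it is not part of dalai); it only stands in for `dalai.request` so the callback wiring can be tested in isolation:

```javascript
// Hypothetical stub standing in for dalai.request: it invokes the token
// callback once per generated token, the way dalai does when the llama
// binary produces output, then resolves.
function fakeRequest(config, onToken) {
  const tokens = ["Hallo", " ", "World", "!"];
  for (const token of tokens) onToken(token);
  return Promise.resolve();
}

// Collect tokens the same way the handler in the issue writes them out.
const received = [];
fakeRequest(
  { model: "llama.13B", prompt: "Hallo World:", skip_end: true },
  (token) => {
    process.stdout.write(token); // stream to the shell
    received.push(token);        // stand-in for res.write(token)
  }
);
console.log("\nreceived", received.length, "tokens");
```

If this stub fires the callback but the real `dalai.request` does not, the problem is below the JavaScript layer (the spawned `main` binary never emitting tokens), which matches the shell output shown next.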
Output in shell:

repo/robot/llama/main --seed -1 --threads 8 --n_predict 128 --model models/13B/ggml-model-q4_0.bin -p "Hallo World:" in repo/robot/llama
The path models/13B/ggml-model-q4_0.bin appears to be correct, and the files are there.
```json
"dependencies": {
  "dalai": "^0.3.1",
  "express": "^4.18.2",
  "express-compression": "^1.0.2",
  "helmet": "^6.1.5",
  "llama": "^1.3.1",
  "morgan": "^1.10.0",
  "node-cron": "^3.0.2",
  "rss-parser": "^3.13.0",
  "serve": "^14.2.0"
}
```
Worked after a reboot. My guess is it was a memory issue.
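If memory pressure really was the cause (the 13B q4_0 weights are roughly 8 GB on disk and need about that much free RAM to load without swapping), a quick check before starting the server can save a reboot. A minimal sketch, assuming a Linux host where /proc/meminfo is available:

```shell
# Print available memory in GB; the 13B ggml-model-q4_0.bin needs roughly
# 8 GB of free RAM to load comfortably (approximate figure, not from dalai docs).
avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
echo "available: $((avail_kb / 1024 / 1024)) GB"
```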