iimez / lllms


path error on windows computer: "path: 'C:\\C:\\Users\\avnig\\www\\lllms\\dist\\engines\\node-llama-cpp\\grammars'" #2

Closed avnigashi closed 3 months ago

avnigashi commented 3 months ago

System: Windows 11, AMD CPU/GPU. Node: v20.15.0, installed natively via nvm, no WSL subsystem.

This happens when I start examples\express-openai.js after building:

$ node .\examples\express-openai.js
Initializing models...
[2024-07-04 9:46:03.02][erro] dolphin:9NtdpIGv Failed to load instance: Error: ENOENT: no such file or directory, scandir 'C:\C:\Users\avnig\www\lllms\dist\engines\node-llama-cpp\grammars'
    at Object.readdirSync (node:fs:1507:26)
    at readGrammarFiles (file:///C:/Users/avnig/www/lllms/dist/engines/node-llama-cpp/util.js:91:10)
    at Module.createInstance (file:///C:/Users/avnig/www/lllms/dist/engines/node-llama-cpp/engine.js:95:28)
    at async ModelInstance.load (file:///C:/Users/avnig/www/lllms/dist/instance.js:61:35)
    at async ModelPool.spawnInstance (file:///C:/Users/avnig/www/lllms/dist/pool.js:229:9)
    at async Promise.allSettled (index 0)
    at async ModelPool.init (file:///C:/Users/avnig/www/lllms/dist/pool.js:70:9)
    at async Promise.all (index 1)
    at async ModelServer.start (file:///C:/Users/avnig/www/lllms/dist/server.js:96:9)
    at async file:///C:/Users/avnig/www/lllms/examples/express-openai.js:41:3 {
  errno: -4058,
  code: 'ENOENT',
  syscall: 'scandir',
  path: 'C:\\C:\\Users\\avnig\\www\\lllms\\dist\\engines\\node-llama-cpp\\grammars'
}
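
For context, the doubled drive letter in the ENOENT path is the typical symptom of deriving a filesystem path from a file:// URL via URL.pathname instead of url.fileURLToPath(). A minimal repro of the mechanism on Windows (this is an illustration, not the actual lllms code; it assumes the current drive is C:):

// repro.mjs - run on Windows with node repro.mjs
import path from 'node:path'
import fs from 'node:fs'

// URL.pathname keeps a leading slash in front of the drive letter on Windows.
const grammarsUrl = new URL('./grammars', import.meta.url)
console.log(grammarsUrl.pathname)
// -> /C:/Users/avnig/www/lllms/dist/engines/node-llama-cpp/grammars

// path.resolve treats the leading slash as "root of the current drive",
// so the drive letter gets prepended a second time.
const resolved = path.resolve(grammarsUrl.pathname)
console.log(resolved)
// -> C:\C:\Users\avnig\www\lllms\dist\engines\node-llama-cpp\grammars

fs.readdirSync(resolved)
// -> ENOENT: no such file or directory, scandir 'C:\C:\...'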

Found the bug: src/engines/node-llama-cpp/engine.ts:153
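If that line builds the grammars directory from import.meta.url, the usual Windows-safe fix is to convert the URL with url.fileURLToPath() before joining. A hedged sketch of such a fix (the actual code at engine.ts:153 may look different):

// Windows-safe conversion of a module-relative URL into a filesystem path.
// Sketch only; not the actual code in src/engines/node-llama-cpp/engine.ts.
import path from 'node:path'
import { fileURLToPath } from 'node:url'

// Before (breaks on Windows): new URL('./grammars', import.meta.url).pathname
// After: fileURLToPath handles the file:// scheme and drive letter correctly.
const grammarsDir = path.join(
	path.dirname(fileURLToPath(import.meta.url)),
	'grammars',
)
// -> C:\Users\avnig\www\lllms\dist\engines\node-llama-cpp\grammars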