Atome-FE / llama-node

We believe in AI democratization. llama-node is a Node.js library backed by llama-rs, llama.cpp and rwkv.cpp; it runs locally on your laptop CPU and supports llama/alpaca/gpt4all/vicuna/rwkv models.
https://llama-node.vercel.app/
Apache License 2.0

Llama.cpp Typescript: Cannot find name 'LoadModel' #91

Closed synw closed 1 year ago

synw commented 1 year ago

There is an error when compiling Typescript code using Llama.cpp:

```
$ tsc -p .
node_modules/@llama-node/llama-cpp/index.d.ts:137:31 - error TS2304: Cannot find name 'LoadModel'.

137   static load(params: Partial<LoadModel>, enableLogger: boolean): Promise<LLama>
```

I found that changing in llama-cpp/index.d.ts line 137 static load(params: Partial<LoadModel> by static load(params: Partial<ModelLoad> works. It is probably a typo. I found another reference to LoadModel in llama-cpp/src/lib.rs, so unfortunately as I don't know Rust I can not suggest a valid fix in a PR.