Atome-FE / llama-node

We believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; runs locally on your laptop CPU and supports llama/alpaca/gpt4all/vicuna/rwkv models.
https://llama-node.vercel.app/
Apache License 2.0

Support for the new k-quant methods in Llama.cpp #95

Open · synw opened this issue 1 year ago

synw commented 1 year ago

It would be great to have this support so we can use the new models quantized with these k-quant methods.
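
For context, once the bundled llama.cpp backend supports the k-quant formats, loading one of the new files (e.g. a q4_K_M `.bin` produced by llama.cpp's quantize tool) should presumably work the same way as loading any other GGML model today. A rough sketch, assuming the current `LLamaCpp` load/completion API carries over unchanged; the model filename and config values below are placeholders, not a confirmed setup:

```typescript
import path from "path";
import { LLM } from "llama-node";
// llama.cpp backend shipped with llama-node
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";

// Hypothetical k-quant model file (q4_K_M variant) for illustration only.
const modelPath = path.resolve(process.cwd(), "./models/ggml-vicuna-7b-q4_K_M.bin");

const llama = new LLM(LLamaCpp);

const run = async () => {
    // Config fields mirror the existing LLamaCpp load config; values are illustrative.
    await llama.load({
        modelPath,
        enableLogging: true,
        nCtx: 1024,
        seed: 0,
        f16Kv: false,
        logitsAll: false,
        vocabOnly: false,
        useMlock: false,
        embedding: false,
        useMmap: true,
        nGpuLayers: 0,
    });

    await llama.createCompletion(
        {
            prompt: "### Human: Hello!\n### Assistant:",
            nThreads: 4,
            nTokPredict: 256,
            topK: 40,
            topP: 0.1,
            temp: 0.2,
            repeatPenalty: 1.1,
        },
        (response) => process.stdout.write(response.token)
    );
};

run();
```

If the k-quant support lands purely on the native llama.cpp side, nothing in the JS API should need to change beyond pointing `modelPath` at one of the new quantized files.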