Atome-FE / llama-node

We believe in AI democratization. llama-node is a llama library for Node.js backed by llama-rs, llama.cpp, and rwkv.cpp. It runs locally on your laptop CPU and supports llama/alpaca/gpt4all/vicuna/rwkv models.
https://llama-node.vercel.app/
Apache License 2.0

GGUF support? #122


nildotdev commented 12 months ago

Are there any plans for GGUF support? AFAIK GGUF models do not work as of right now, so I'm curious when I can use them.
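For context, here is a minimal sketch of how a model is loaded today with the llama.cpp backend, following the usage pattern in the project docs. The model path is hypothetical, and the exact config and completion-parameter fields should be checked against the library's type definitions; they are shown here as assumptions. With the current backend, only GGML-format files are expected to load, since the bundled llama.cpp predates the GGUF format.

```typescript
import path from "path";
import { LLM } from "llama-node";
// llama.cpp backend; the rwkv.cpp and llama-rs backends are wired up the same way
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";

const llama = new LLM(LLamaCpp);

// Hypothetical model path: a GGML-format file works with the current backend,
// whereas pointing this at a .gguf file is expected to fail to load.
const modelPath = path.resolve(process.cwd(), "./vicuna-7b-q4_0.bin");

const run = async () => {
    await llama.load({
        modelPath,
        enableLogging: true,
        nCtx: 2048,       // context window size
        seed: 0,
        f16Kv: false,
        logitsAll: false,
        vocabOnly: false,
        useMlock: false,
        embedding: false,
        useMmap: true,    // memory-map the model file instead of reading it fully
        nGpuLayers: 0,    // CPU-only inference
    });

    // Stream tokens to stdout as they are generated.
    await llama.createCompletion(
        { prompt: "Hello,", nTokPredict: 32, temp: 0.2 },
        (response) => process.stdout.write(response.token),
    );
};

run();
```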