cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai
13.09k stars · 1.43k forks

Support for Llama-2? #462

Open g1sbi opened 11 months ago

g1sbi commented 11 months ago

Is support for llama-2 planned?

Daemon-Devarshi commented 11 months ago

Much needed request!

recool903 commented 11 months ago

Waiting and hoping

Alex-Kondakov commented 11 months ago

A must-have feature

mirek190 commented 11 months ago

lol

Stop using this ancient, dead project and switch to llama.cpp or koboldcpp. Also, download GGML versions of models from https://huggingface.co/TheBloke
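For anyone following this suggestion, here is a minimal sketch of the llama.cpp route. The model repository and filename below are illustrative examples only (check TheBloke's Hugging Face pages for current names), and note that newer llama.cpp builds use the GGUF format rather than the older GGML format mentioned above:

```shell
# Build llama.cpp from source (requires git and a C/C++ toolchain)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Download a quantized Llama-2 chat model from TheBloke's Hugging Face page.
# The exact repo and filename are examples -- browse the model page for
# the available quantization levels (Q4_K_M is a common middle ground).
wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q4_K_M.gguf

# Run an interactive completion against the downloaded model
./main -m llama-2-7b-chat.Q4_K_M.gguf -p "Hello, Llama-2!" -n 128
```

The 7B Q4_K_M quantization runs on a machine with roughly 8 GB of RAM and needs no GPU, which is the same use case dalai targeted.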