cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai
13.1k stars 1.42k forks

Possible solution for Windows users - LLama not working. #450

Open Gary6780 opened 1 year ago

Gary6780 commented 1 year ago

Hello!

I had an issue where, after installation, Llama didn't respond to any input, while Alpaca worked fine. I found out that I had to copy the three files from C:\Users\USERNAME\dalai\llama\build\bin\Release to the directory C:\Users\USERNAME\dalai\llama\build\Release. After restarting the server, Llama worked.

Posting this in case someone runs into the same problem.

neinja007 commented 1 year ago

I have the completely opposite issue: Alpaca doesn't respond when I click "GO", and I haven't tried Llama on its own (I don't know how).

mirek190 commented 1 year ago

Stop using this ancient, dead project and switch to llama.cpp or koboldcpp. Also, download models (GGML versions) from https://huggingface.co/TheBloke.

junxian428 commented 1 year ago

Mine works, but maybe because my model is 7B the output isn't desirable.
