cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai

Alpaca doesn't respond #464

Open neinja007 opened 11 months ago

neinja007 commented 11 months ago

Hey!

I installed Alpaca and LLaMA (after trying about 3000 times), but it doesn't seem to work. It doesn't work in Firefox at all, and when I tried Chrome, a different problem appeared: when I click "go", the button just freezes and nothing happens.

Images included

I'm on the latest versions of Python and Node.js.

[screenshot attached]

[EDIT] The "threads" input was set to 4; I only changed it while experimenting to figure out why it might not be working.

Using Google Chrome

mirek190 commented 11 months ago

omg ... stop using that ancient software and switch to llama.cpp or koboldcpp. Also, download the GGML versions of the models from https://huggingface.co/TheBloke.
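
For reference, here is a minimal sketch of running one of those downloaded models outside of dalai, using the llama-cpp-python bindings (`pip install llama-cpp-python`). The model filename is a placeholder for whichever file you grab from TheBloke's repos; note that recent llama.cpp builds expect GGUF files, so older GGML files may require an older, compatible version of the bindings.

```python
# Minimal sketch using the llama-cpp-python bindings.
# Assumption: the package is installed and a model file has been
# downloaded from https://huggingface.co/TheBloke (path below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-7b.Q4_0.gguf",  # placeholder filename
    n_ctx=2048,    # context window size
    n_threads=4,   # CPU threads, same idea as dalai's "threads" input
)

output = llm(
    "### Instruction:\nWhat is the capital of France?\n\n### Response:\n",
    max_tokens=64,
    stop=["###"],
)
print(output["choices"][0]["text"])
```

Running the model this way also sidesteps the dalai web UI entirely, so a frozen "go" button in the browser is no longer part of the picture.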