ItsPi3141 / alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
MIT License

Works extremely slowly #15

Closed PierrotLaLune closed 1 year ago

PierrotLaLune commented 1 year ago

I run Windows 10 with an i5, 16 GB Ram and an SSD.

I get one word every couple of seconds in response to the question "Can you explain quantum computing in simple terms?" CPU is at 100%, with 98.4% used by alpaca-electron.

Doc1.pdf

ItsPi3141 commented 1 year ago

Oh, I recently made a change to make it use all threads. It's hurting performance, so I'll revert it in the next update.
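Using every logical core is often counterproductive here: the generation threads contend with each other and with the Electron UI for CPU time. A minimal sketch of a saner default, assuming the app picks the thread count at launch (`pick_thread_count` is a hypothetical helper, not part of alpaca-electron):

```python
import os

def pick_thread_count(reserve: int = 0) -> int:
    """Pick a llama.cpp-style thread count that leaves CPU headroom.

    Heuristic: use roughly the physical core count rather than all
    logical cores, since hyperthread siblings fighting over the same
    math units tends to slow token generation down. Assumes 2 logical
    cores per physical core, which holds on most hyperthreaded CPUs
    like the reporter's i5.
    """
    logical = os.cpu_count() or 1
    physical = max(1, logical // 2)
    # Optionally reserve a core for the UI process.
    return max(1, physical - reserve)
```

On the reporter's machine (a 4-core/8-thread i5, say), this would pick 4 worker threads instead of 8, leaving the interface responsive.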

Nocturna22 commented 1 year ago

It would be nice if you could provide an option where the user can set the thread count and other options :) (like the temperature, for example, since different models need different temperatures)

ItsPi3141 commented 1 year ago

> It would be nice if you could provide an option where the user can set the thread count and other options :) (like the temperature, for example, since different models need different temperatures)

Yes I'm working on a settings page
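A settings page like this mostly boils down to translating user choices into flags for the underlying llama.cpp binary. A minimal sketch, assuming settings are kept as a plain dict; `settings_to_args` is a hypothetical helper, while `-t` and `--temp` are flags accepted by the llama.cpp command-line tool that alpaca-electron wraps:

```python
def settings_to_args(settings: dict) -> list[str]:
    """Map a user-settings dict to llama.cpp-style CLI arguments.

    Only emits flags for keys the user actually set, so the binary's
    own defaults apply otherwise.
    """
    args = []
    if "threads" in settings:
        args += ["-t", str(settings["threads"])]       # worker thread count
    if "temperature" in settings:
        args += ["--temp", str(settings["temperature"])]  # sampling temperature
    return args
```

Per-model temperatures, as Nocturna22 suggests, could then be handled by storing one such dict per model file.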