ItsPi3141 / alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
MIT License

GPU Support #49

Closed — MushMello closed this issue 1 year ago

MushMello commented 1 year ago

There are plenty of good models out there with GPU support, so people like me who have higher-end PCs could use the resources we have to the fullest.

ItsPi3141 commented 1 year ago

Duplicate of #43