ItsPi3141 / alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
MIT License
1.29k stars 144 forks

can there be a webui #18

Closed greagain closed 1 year ago

greagain commented 1 year ago

I hope this can become a webui. I tried other Alpaca frontend webuis, but none of them seem to work.

ItsPi3141 commented 1 year ago

This is a webui though. It's just running in a separate window.

greagain commented 1 year ago

I mean running the application on one machine but accessing it from another machine. Is there a URL that can be used to access the UI?

ItsPi3141 commented 1 year ago

Oh, then just use my dalai fork.
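For reference, upstream dalai serves its UI over plain HTTP, so you can reach it from another machine by IP. Roughly (the commands below are upstream dalai's documented CLI; the fork is assumed to keep it):

```sh
# install a model, then start the dalai web server
npx dalai alpaca install 7B
npx dalai serve          # prints: Server running on http://localhost:3000/

# from another machine on the same network, open:
#   http://<server-ip>:3000/
```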

greagain commented 1 year ago

I tried. The web page loads, but it doesn't respond.

ItsPi3141 commented 1 year ago

Did you check the console log?

greagain commented 1 year ago

Looks like that is a different dalai: https://github.com/cocktailpeanut/dalai. I haven't tried yours.

greagain commented 1 year ago

Not working. The log just stops at: Server running on http://localhost:3000/

query: { method: 'installed' }
modelsPath /Users/greagain/dalai/alpaca/models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath /Users/greagain/dalai/llama/models
{ modelFolders: [] }
query: {
  seed: -1,
  threads: 16,
  n_predict: 6942069,
  model: 'alpaca.7B',
  top_k: 420,
  top_p: 0.9,
  temp: 0.9,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'alpaca.7B' ],
  prompt: 'Can you explain quantum computing in simple terms?'
}
{ Core: 'alpaca', Model: '7B' }
exec: /Users/greagain/dalai/alpaca/main --seed -1 --threads 16 --n_predict 694206 --model models/7B/ggml-model-q4_0.bin --top_k 420 --top_p 0.9 --temp 0.9 --repeat_last_n 64 --repeat_penalty 1.3 -p "Can you explain quantum computing in simple terms?" in /Users/greagain/dalai/alpaca

ItsPi3141 commented 1 year ago

It's probably still loading the model. Just give it some time.

greagain commented 1 year ago

It worked, but it's extremely slow, basically unusable. I installed it on a Mac M2 with 8GB of RAM. The original command-line alpaca.cpp works well.
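For comparison, the exec line from the log above can be reproduced directly on the command line, bypassing the web layer. Paths and flags are copied from the log; the smaller --threads and --n_predict values are my own substitutions:

```sh
cd /Users/greagain/dalai/alpaca
./main --seed -1 --threads 4 --n_predict 128 \
  --model models/7B/ggml-model-q4_0.bin \
  --top_k 420 --top_p 0.9 --temp 0.9 \
  --repeat_last_n 64 --repeat_penalty 1.3 \
  -p "Can you explain quantum computing in simple terms?"
```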

ItsPi3141 commented 1 year ago

> It worked, but it's extremely slow, basically unusable. I installed it on a Mac M2 with 8GB of RAM. The original command-line alpaca.cpp works well.

Oh, I forgot that it might still be using the old alpaca.cpp. I might update it to llama.cpp if I have time.

greagain commented 1 year ago

Thanks. Will it need to use the new ggml file format?

furroy commented 1 year ago

I'm not sure I follow the discussion above. I just want to build my own UI, make a local HTTP request to ask questions, and get a response back.

ItsPi3141 commented 1 year ago

You should take a look at https://github.com/cocktailpeanut/dalai/ then. I'm closing this issue since it's not related to the Electron app.
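For what it's worth, upstream dalai is also published as an npm package with a streaming request API that a custom UI could wrap. A minimal sketch following upstream's documented usage (the prompt and parameters are placeholders):

```js
// Minimal sketch of upstream dalai's Node API.
// Note: the stock web UI talks to `npx dalai serve` over socket.io rather
// than plain HTTP, so a browser frontend would go through that server.
const Dalai = require("dalai");

new Dalai().request(
  {
    model: "alpaca.7B", // same model id format as the logged query above
    prompt: "Can you explain quantum computing in simple terms?",
    n_predict: 128, // keep this small; very large values look like a hang
  },
  (token) => {
    // tokens stream back one at a time
    process.stdout.write(token);
  }
);
```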