ViperX7 / Alpaca-Turbo

Web UI to run alpaca model locally
GNU Affero General Public License v3.0

Copied the release Alpaca-Turbo-beta_v0.6 #64

Open chongy076 opened 1 year ago

chongy076 commented 1 year ago

Copied all files from Alpaca-Turbo-beta_v0.6 to Alpaca-Turbo. Downloaded ggml-vicuna-13b-1.1 into the Alpaca-Turbo/models folder (inside it has blobs, refs, and snapshots files), and also tried downloading ggml-alpaca-7b-q4.bin into the models folder. Then ran python api.py.

[On the web UI at localhost:7887] Ask the bot: aaa, click Submit. Bot response: (empty)

History:

Console:

127.0.0.1 - - [15/Apr/2023 16:13:49] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [15/Apr/2023 16:13:58] "POST /completions HTTP/1.1" 405 -

The 405 error appears as soon as I click Submit, and I also don't get the select-model page. I tried running from the gradio_impl folder, and also tried running alpaca_turbo.py; I found the downloadH problem and fixed it, but I still get the 405 error above. I noticed it is much harder to get this running on Windows than on Ubuntu (there it only took a couple of tries and fixes to get it running). Thanks for the guide.
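For reference, one quick way to confirm what that 405 means is to hit the backend directly from Python. This is only a sketch against the endpoints visible in the log above; the JSON payload keys are my guess, not the release's documented API.

```python
# Probe the backend that the log above shows answering on port 7887.
# Assumes `requests` is installed; the payload shape is an assumption.
import requests

base = "http://127.0.0.1:7887"

# GET / returned 200 in the log, so the server itself is reachable.
print("GET /             ->", requests.get(base + "/").status_code)

# A 405 on POST /completions means the route exists but was not registered
# for POST (or the UI is talking to a different backend than the one running).
# Flask adds an Allow header to 405 responses listing the accepted methods.
resp = requests.post(base + "/completions", json={"prompt": "aaa"})
print("POST /completions ->", resp.status_code, "Allow:", resp.headers.get("Allow"))
```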

espressoelf commented 1 year ago

> copied all files from Alpaca-Turbo-beta_v0.6 to Alpaca-Turbo
> tried run in gradio_impl folder

Sounds like you downloaded the source code. You need to download Alpaca-Turbo_0.6.zip instead; the release doesn't contain a gradio_impl folder.
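A quick way to tell the two apart on disk (purely a sketch of my own; the path is just an example taken from the commands later in this thread):

```python
# Tell a source checkout apart from the packaged release: per the comment
# above, only the source tree ships a gradio_impl/ directory.
# The path below is an example; point it at your own folder.
from pathlib import Path

root = Path(r"C:\Users\Administrator\Alpaca-Turbo")
if (root / "gradio_impl").is_dir():
    print("Looks like the source code checkout (has gradio_impl/).")
else:
    print("Looks like the packaged release (no gradio_impl/).")
```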

chongy076 commented 1 year ago

I had tried running it from gradio_impl; it said to set the model.

I had downloaded Alpaca-Turbo_0.6.zip, put it into a clean Alpaca-Turbo folder, and inside it there is a gradio_impl folder.

(alpaca_turbo) C:\Users\Administrator\Alpaca-Turbo\gradio_impl>python api.py
Set the model path in settings

When I inspect that api.py, it doesn't use Flask, so it is not even the same API as the one in the Alpaca-Turbo folder.
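One way to see at a glance which api.py you are looking at is to check its imports. This is a rough sketch of my own, not project code:

```python
# Rough sketch: report which frameworks a given api.py imports, to tell the
# release's Flask server apart from the gradio_impl script.
import re
from pathlib import Path

src = Path("api.py").read_text(encoding="utf-8", errors="ignore")
for name in ("flask", "gradio", "fastapi"):
    found = re.search(rf"^\s*(?:import|from)\s+{name}\b", src, re.MULTILINE | re.IGNORECASE)
    print(f"{name}: {'imported' if found else 'not imported'}")
```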

chongy076 commented 1 year ago

After setting up again:

  1. Download Alpaca-Turbo_0.6.zip.

  2. Download ggml-alpaca-7b-q4.bin and also ggml-vicuna-13b-1.1 into a folder.

  3. conda init and close the window.

  4. Run the following:
     conda create -n alpaca_turbo python=3.8 -y
     conda activate alpaca_turbo
     pip install -r requirements.txt
     python api.py

  5. The web UI shows up and I can click Load Model on ggml-alpaca-7b-q4.bin with no error.

  6. The console shows no error, but nothing happens when I click Send after typing hello. In the web debug output I get a load-model error when I click New Chat (see the model-file check after this list).
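Regarding step 6: one thing worth ruling out before touching the UI is the model file itself. Below is a rough sanity check of my own (not part of Alpaca-Turbo); the path and file name are examples, and the magic numbers are the ones used by llama.cpp-era ggml files.

```python
# Rough sanity check (not Alpaca-Turbo code): old llama.cpp-style ggml model
# files begin with a known 4-byte magic, so a truncated or wrong-format
# download is easy to spot before blaming the load-model code.
import struct
from pathlib import Path

model = Path("models/ggml-alpaca-7b-q4.bin")  # example path/name, adjust as needed
with model.open("rb") as f:
    (magic,) = struct.unpack("<I", f.read(4))

size_gib = model.stat().st_size / 2**30
print(f"{model.name}: {size_gib:.2f} GiB, magic=0x{magic:08x}")

# Magics seen in llama.cpp-era files: 0x67676d6c ('ggml'), 0x67676d66 ('ggmf'),
# 0x67676a74 ('ggjt'). Anything else usually means a bad or incompatible file.
if magic not in (0x67676D6C, 0x67676D66, 0x67676A74):
    print("Warning: not a recognizable ggml/ggmf/ggjt file.")
```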

chongy076 commented 1 year ago

(screenshot attached)

AnouarAndichi commented 1 year ago

Hi @chongy076, from the last screenshot you provided I can see that the model is not loaded, and the chat won't send if the model isn't loaded. When the model loads, you will get a notification that model xx has been loaded, and a button to reload the model will appear under Load Model in the sidebar.

Did you try the Vicuna model?

chongy076 commented 1 year ago

@AnouarAndichi I tried ggml-vicuna-13b-4bit-rev1.bin, ggml-vicuna-13b-4bit-rev1.1.bin, and ggml-vicuna-7b-4bit.bin. But thanks for the tips; I think I may need to modify the load-model code. It did mention something about the load model.
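If it helps, here is a small inventory sketch (my own, not project code) for the models folder; a 4-bit 7B file should be roughly 4 GiB and a 4-bit 13B roughly 7-8 GiB, so a much smaller file usually points to a truncated download rather than a problem in the load-model code. The folder name is an example.

```python
# My own sketch, not Alpaca-Turbo code: list every .bin in the models folder
# with its size, to catch truncated/partial downloads before editing code.
from pathlib import Path

models_dir = Path("models")  # example path; point at Alpaca-Turbo's models folder
for p in sorted(models_dir.glob("*.bin")):
    print(f"{p.name:45s} {p.stat().st_size / 2**30:7.2f} GiB")
```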