wafflecomposite / langchain-ask-pdf-local

An AI-app that allows you to upload a PDF and ask questions about it. It uses StableVicuna 13B and runs locally.

No file named stable-vicuna-13B.ggml.q4_2.bin in linked Huggingface repo #1

Open uwts opened 1 year ago

uwts commented 1 year ago

The readme instructs the user to download stable-vicuna-13B.ggml.q4_2.bin from a linked repo. That file does not appear in the repo.

wafflecomposite commented 1 year ago

It turns out that llama.cpp was updated and the StableVicuna models were re-quantised. I can't check right now whether it still works, but I have pinned fixed versions in this repo's requirements.txt, and the previous model files are available here: https://huggingface.co/TheBloke/stable-vicuna-13B-GGML/tree/previous_llama

Alternatively, you can try downloading one of the new q4 or q5 model files, installing the latest llama-cpp-python, and changing the model file name in app.py accordingly.
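For anyone trying that route, here is a minimal sketch of loading one of the newer quantisations with llama-cpp-python directly. The filename matches the ggmlv3 q8_0 file mentioned below; the `n_ctx` value and the prompt are illustrative, not taken from app.py:

```python
import os

# Assumed path: the newer ggmlv3 file from TheBloke's HF repo,
# placed next to the script. Swap in a q4_0/q5_0 variant if you
# downloaded a smaller one.
MODEL_PATH = "./stable-vicuna-13B.ggmlv3.q8_0.bin"

if os.path.exists(MODEL_PATH):
    # Requires the updated bindings: pip install -U llama-cpp-python
    from llama_cpp import Llama

    llm = Llama(model_path=MODEL_PATH, n_ctx=2048)
    out = llm("### Human: Say hello.\n### Assistant:", max_tokens=32)
    print(out["choices"][0]["text"])
else:
    print(f"Model file not found: {MODEL_PATH}")
```

If loading fails with a format error, the installed llama-cpp-python is likely too old for the ggmlv3 files, which is exactly the mismatch this issue is about.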

Whichever route you go with, please let me know whether it works so I can update the repository.

LebToki commented 1 year ago

This is their latest: stable-vicuna-13B.ggmlv3.q8_0.bin (12.9 GB)