Anil-matcha / ChatPDF

Chat with any PDF. Easily upload the PDF documents you'd like to chat with. Instant answers. Ask questions, extract information, and summarize documents with AI. Sources included.
https://www.thesamur.ai/?utm_source=github&utm_medium=link&utm_campaign=github_chatpdf
MIT License

use LLM from huggingface #13

Open stl2015 opened 1 year ago

stl2015 commented 1 year ago

Hi, I'm trying to use an LLM from Hugging Face, for example "lmsys/vicuna-13b-v1.3". The model can be loaded with AutoModelForCausalLM.from_pretrained. However, what's the best way to wrap the model for integration with load_qa_chain?
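
For reference, a minimal sketch of one common way to do this, assuming a LangChain version that ships the HuggingFacePipeline wrapper and the load_qa_chain helper; the model id, generation settings, and device_map are illustrative, not taken from this repo:

```python
# Sketch: wrap a Hugging Face causal LM so it can be passed to load_qa_chain.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain.llms import HuggingFacePipeline
from langchain.chains.question_answering import load_qa_chain

model_id = "lmsys/vicuna-13b-v1.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; adjust to your hardware.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Wrap the model in a transformers text-generation pipeline, then in
# LangChain's HuggingFacePipeline so it satisfies the LLM interface.
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=256,
)
llm = HuggingFacePipeline(pipeline=pipe)

# The wrapped model can then be used in place of an OpenAI LLM.
chain = load_qa_chain(llm, chain_type="stuff")
# answer = chain.run(input_documents=docs, question="What is this document about?")
```

The idea is simply that load_qa_chain accepts any LangChain LLM object, so wrapping the transformers pipeline in HuggingFacePipeline is usually enough; whether it fits this project's existing chain setup would need confirming against the repo's code.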