jlonge4 / local_llama

This repo is to showcase how you can run a model locally and offline, free of OpenAI dependencies.
Apache License 2.0

FileNotFoundError and model LLM location #4

Open brinrbc opened 1 year ago

brinrbc commented 1 year ago

Hi! Thank you very much! When I try to load a PDF, I get an error. Can you please tell me what can be done here?

FileNotFoundError: [Errno 2] No such file or directory: 'WHERE YOUR PDFS ARE (SINGLE DIRECTORY)/Bruce_Bruce_2018_Practical Statistics for Data Scientists.pdf'
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)
File "/Users/user/GitHub/local_llama/local_llama.py", line 152, in <module>
    save_pdf(file.name)
File "/Users/user/GitHub/local_llama/local_llama.py", line 103, in save_pdf
    pdf_to_index(pdf_path=f'{PATH_TO_PDFS}/{file}', save_path=f'{PATH_TO_INDEXES}/{file}')
File "/Users/user/GitHub/local_llama/local_llama.py", line 69, in pdf_to_index
    documents = loader.load_data(file=Path(pdf_path))
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/llama_index/readers/llamahub_modules/file/pdf/base.py", line 19, in load_data
    with open(file, "rb") as fp:
         ^^^^^^^^^^^^^^^^

Also, where do I specify the path to the LLM model?

jlonge4 commented 1 year ago

So you'll want to replace "WHERE YOUR PDFS ARE (SINGLE DIRECTORY)" with the absolute path to the directory that holds your documents, e.g. "C:/Users/You/Documents/pdfs". The script appends the filename itself, producing something like "C:/Users/You/Documents/pdfs/Bruce_Bruce_2018_Practical Statistics for Data Scientists.pdf". @brinrbc
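
As a minimal sketch (not code from this repo), here is how those path constants could be set and validated up front, so a missing directory fails with a clear message instead of a deep traceback. The helper name `check_dirs` and the example paths are assumptions; `PATH_TO_PDFS` and `PATH_TO_INDEXES` are the names that appear in the traceback above.

```python
from pathlib import Path

# Hypothetical helper: validate the path constants before the app uses them.
def check_dirs(*paths):
    """Return True if every path is an existing directory, else raise."""
    missing = [p for p in paths if not Path(p).is_dir()]
    if missing:
        raise FileNotFoundError(f"Missing directories: {missing}")
    return True

# Replace the placeholders near the top of local_llama.py with real
# absolute paths on your machine (these values are examples only):
PATH_TO_PDFS = "/Users/you/Documents/pdfs"        # directory holding your PDFs
PATH_TO_INDEXES = "/Users/you/Documents/indexes"  # directory for saved indexes
```

Calling `check_dirs(PATH_TO_PDFS, PATH_TO_INDEXES)` once at startup would surface a bad path immediately, before any upload is attempted.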