jlonge4 / local_llama

This repo showcases how to run a model locally and offline, free of OpenAI dependencies.
Apache License 2.0

chunk_overlap_ratio must be a float between 0. and 1 #7

Closed Zildj1an closed 1 year ago

Zildj1an commented 1 year ago

Using macOS Monterey (v12.6), I run:

$ python -m streamlit run local_llama.py

and get the error:

ValueError: chunk_overlap_ratio must be a float between 0. and 1.
Traceback:

File "/path/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
File "/path/local_llama.py", line 30, in <module>
    prompt_helper = PromptHelper(max_input_size, num_output, max_chunk_overlap)
File "/path/prompt_helper.py", line 72, in __init__
    raise ValueError("chunk_overlap_ratio must be a float between 0. and 1.")


I have a fix and plan to open a pull request.
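
For anyone hitting this in the meantime: the error appears to come from a llama_index API change in which the third positional argument of PromptHelper became chunk_overlap_ratio (a float between 0 and 1) instead of the old max_chunk_overlap token count, so the existing positional call trips the validation check. Below is a minimal sketch of that kind of change, with illustrative values; this is my reading of the traceback, not necessarily the exact patch going into the PR:

# local_llama.py, around line 30 in the traceback
from llama_index import PromptHelper

max_input_size = 4096       # illustrative values
num_output = 256
chunk_overlap_ratio = 0.2   # a ratio in [0, 1), replacing e.g. max_chunk_overlap = 20 (tokens)

prompt_helper = PromptHelper(max_input_size, num_output, chunk_overlap_ratio)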