h2oai / h2ogpt

Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
http://h2o.ai
Apache License 2.0

IndexError: list index out of range #513

Closed erwinrnasution closed 1 year ago

erwinrnasution commented 1 year ago

Dear Rob,

After successfully reaching "Loading checkpoint shards: 100%", my computer returned this:

Traceback (most recent call last):
  File "C:\Users\erwinnella\Desktop\h2ogpt\generate.py", line 16, in <module>
    entrypoint_main()
  File "C:\Users\erwinnella\Desktop\h2ogpt\generate.py", line 12, in entrypoint_main
    fire.Fire(main)
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
                                ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\Desktop\h2ogpt\src\gen.py", line 674, in main
    return run_cli(**get_kwargs(run_cli, exclude_names=['model_state0'], **locals()))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\Desktop\h2ogpt\src\cli.py", line 64, in run_cli
    model, tokenizer, device = get_model(reward_type=False,
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\Desktop\h2ogpt\src\gen.py", line 1054, in get_model
    return get_hf_model(load_8bit=load_8bit,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\Desktop\h2ogpt\src\gen.py", line 1187, in get_hf_model
    model = model_loader(
            ^^^^^^^^^^^^^
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\transformers\models\auto\auto_factory.py", line 479, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\transformers\modeling_utils.py", line 2937, in from_pretrained
    dispatch_model(model, **kwargs)
  File "C:\Users\erwinnella\miniconda3\envs\h2ogpt\Lib\site-packages\accelerate\big_modeling.py", line 336, in dispatch_model
    main_device = [d for d in device_map.values() if d not in ["cpu", "disk"]][0]

IndexError: list index out of range

(h2ogpt) C:\Users\erwinnella\Desktop\h2ogpt>
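The last frame of the traceback shows what goes wrong: `dispatch_model` picks the first entry in the `device_map` that is not `"cpu"` or `"disk"`. If every module was mapped to CPU or disk (for example, because no usable GPU was found), that list is empty and indexing `[0]` raises the `IndexError`. A minimal sketch of the same pattern (the `device_map` contents below are hypothetical, for illustration only):

```python
# Hypothetical device_map where every module was offloaded to CPU or disk,
# e.g. because no usable GPU was detected.
device_map = {"transformer.h.0": "cpu", "transformer.h.1": "disk"}

# The filter quoted in the traceback: keep only "real" accelerator devices.
candidates = [d for d in device_map.values() if d not in ["cpu", "disk"]]
print(candidates)  # -> []

# Indexing [0] on the empty list reproduces the reported failure.
try:
    main_device = candidates[0]
except IndexError as err:
    print(f"IndexError: {err}")  # -> IndexError: list index out of range
```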

Please help me, since it's almost done (it downloaded about 15 GB of data), but why didn't it show "Enter an instruction" like in your YouTube video? This is quite frustrating, because it was almost finished...
erwinrnasution commented 1 year ago

After I run >python generate.py --base_model=h2oai/h2ogpt-gm-oasst1-en-2048-falcon-7b-v3 --score_model=None --prompt_type=human_bot --cli=True --load_4bit=True

And then when I click http://0.0.0.0:7860, the page returns:

Hmmm… can't reach this page It looks like the webpage at http://0.0.0.0:7860/ might be having issues, or it may have moved permanently to a new web address. ERR_ADDRESS_INVALID

How can I solve this problem? Thanks.

pseudotensor commented 1 year ago

Hi @erwinrnasution, I'm unsure what problem is occurring inside torch. It's probably related to your system, since this isn't normal. Can you give some details? I see it's Windows, but do you only have a CPU (no GPU)? If you only have a CPU, I recommend the --base_model='llama' option, not the full Hugging Face models.
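One quick way to answer the CPU-vs-GPU question (a sketch, assuming PyTorch is installed in the active h2ogpt env) is to ask torch directly; if this prints False, the machine is CPU-only, which is the case where the --base_model='llama' route is recommended over full Hugging Face models:

```python
# Prints False on a CPU-only machine (no usable CUDA GPU), True otherwise.
import torch

print(torch.cuda.is_available())
```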

erwinrnasution commented 1 year ago

Dear @pseudotensor, I don't have any technical background :-), so what do you mean by CPU? I'm running this on a Windows 11 laptop. When I tried the --base_model='llama' option, the Anaconda PowerShell Prompt returned "ModuleNotFoundError: No module named 'fire'".

pseudotensor commented 1 year ago

The above sounds like you didn't activate your conda environment (e.g. conda activate h2ogpt, the env name shown in your prompt) before running the command.

For the address issue, you should use http://127.0.0.1:7860 as described in the readme for Windows. The address gradio itself prints is wrong for Windows/Mac.
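The reason is that 0.0.0.0 is a wildcard *bind* address ("listen on all interfaces") that the server uses, not a destination a Windows browser can visit; 127.0.0.1 is the loopback address you actually browse to. A small stdlib check illustrating the distinction:

```python
import ipaddress

# 0.0.0.0 is the "unspecified" wildcard address servers bind to.
print(ipaddress.ip_address("0.0.0.0").is_unspecified)  # -> True

# 127.0.0.1 is the loopback address a local browser should use.
print(ipaddress.ip_address("127.0.0.1").is_loopback)   # -> True
```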