h2oai / h2ogpt

Private chat with local GPT with documents, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
http://h2o.ai
Apache License 2.0

Latest commits break the code #1076

Closed: hpxiong closed this issue 11 months ago

hpxiong commented 11 months ago

Just pulled the latest code base and now I'm getting the following error. I reverted back to commit #3fc95069 and it works as expected.


 python generate.py --base_model='llama' --model_path_llama="***\\Documents\model\llama-2-7b-chat.Q6_K.gguf" --prompt_type=llama2 --score_model=None --langchain_mode='UserData' --user_path=user_path --max_seq_len=4096 --share=False --visible_side_bar=False --visible_chat_tab=True --visible_doc_selection_tab=True --visible_doc_view_tab=False --visible_chat_history_tab=False --visible_expert_tab=False --visible_models_tab=False --visible_system_tab=True --visible_tos_tab=False --visible_hosts_tab=False --visible_h2ogpt_header=True --visible_login_tab=True --visible_submit_buttons=False --gradio_offline_level=1
Traceback (most recent call last):
  File "***\\repo\h2ogpt\generate.py", line 16, in <module>
    entrypoint_main()
  File "***\\repo\h2ogpt\generate.py", line 12, in entrypoint_main
    H2O_Fire(main)
  File "***\\repo\h2ogpt\src\utils.py", line 64, in H2O_Fire
    fire.Fire(component=component, command=args)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "***\\repo\h2ogpt\src\gen.py", line 1196, in main
    url_loaders_options0, url_loaders_options = lg_to_gr(**locals())
  File "***\\repo\h2ogpt\src\utils.py", line 1392, in lg_to_gr
    assert set(image_audio_loaders_options0).issubset(image_audio_loaders_options)
AssertionError
pseudotensor commented 11 months ago

Please run on latest and see what you see. I don't hit any issue with your command on main.

python generate.py --base_model='llama' --model_path_llama=llama-2-7b-chat.Q6_K.gguf --prompt_type=llama2 --score_model=None --langchain_mode='UserData' --user_path=user_path --max_seq_len=4096 --share=False --visible_side_bar=False --visible_chat_tab=True --visible_doc_selection_tab=True --visible_doc_view_tab=False --visible_chat_history_tab=False --visible_expert_tab=False --visible_models_tab=False --visible_system_tab=True --visible_tos_tab=False --visible_hosts_tab=False --visible_h2ogpt_header=True --visible_login_tab=True --visible_submit_buttons=False --gradio_offline_level=1
pseudotensor commented 11 months ago

I think the above fixed it, for those who hadn't updated the packages for audio transcription.
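
Roughly, that startup assertion compares the loader options enabled by default against the options the installed optional packages can actually provide. A minimal sketch of the pattern (the option names below are illustrative only, not h2oGPT's actual lists):

# Illustration only; option names are hypothetical, not h2oGPT's real lists.
image_audio_loaders_options = ['Caption']          # built from whichever optional imports succeed
image_audio_loaders_options0 = ['Caption', 'ASR']  # defaults requested at startup
# Fails like the traceback above when a default (here 'ASR') is unavailable:
assert set(image_audio_loaders_options0).issubset(image_audio_loaders_options)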

hpxiong commented 11 months ago

I think the above fixed it, for those who hadn't updated the packages for audio transcription.

I pulled your commit and am now getting another error, shown below. I tried to Google it but couldn't find much info on the error.

Using Model llama
Traceback (most recent call last):
  File "***\\repo\h2ogpt\generate.py", line 16, in <module>
    entrypoint_main()
  File "***\\repo\h2ogpt\generate.py", line 12, in entrypoint_main
    H2O_Fire(main)
  File "***\\repo\h2ogpt\src\utils.py", line 64, in H2O_Fire
    fire.Fire(component=component, command=args)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "***\\repo\h2ogpt\src\gen.py", line 1264, in main
    from src.gpt_langchain import get_embedding
  File "***\\repo\h2ogpt\src\gpt_langchain.py", line 50, in <module>
    from src.pandas_agent_langchain import create_csv_agent, create_pandas_dataframe_agent
  File "***\\repo\h2ogpt\src\pandas_agent_langchain.py", line 5, in <module>
    from langchain._api import warn_deprecated
ImportError: cannot import name 'warn_deprecated' from 'langchain._api' (***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\langchain\_api\__init__.py)

hpxiong commented 11 months ago

The above error was introduced in this commit #0058a0ce in pandas_agent_langchain.py.

My langchain libs are below.

langchain                 0.0.300
langchain-experimental    0.0.33

pseudotensor commented 11 months ago

For h2oGPT one has to use the same versions as we set, e.g.:

langchain==0.0.321
langchain-experimental==0.0.33

There are constant breaking changes in all these AI/LLM-related packages, so you have to ensure your env is up to date with the pinned versions.

I don't have any issue with the above versions.
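
If it helps, here is a quick check of what is actually installed in the active env (a small standalone sketch, not part of h2oGPT):

# Standalone sketch: verify the active env uses the pinned versions.
from importlib.metadata import version

expected = {"langchain": "0.0.321", "langchain-experimental": "0.0.33"}
for pkg, want in expected.items():
    have = version(pkg)
    status = "OK" if have == want else f"MISMATCH (want {want})"
    print(f"{pkg}=={have}  {status}")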

hpxiong commented 11 months ago

Well, interestingly, updating the libraries to either the versions above or the latest versions still did not work. 😓


The latest: 
langchain                 0.0.333
langchain-experimental    0.0.39

Traceback (most recent call last):
  File "***\\repo\h2ogpt\generate.py", line 16, in <module>
    entrypoint_main()
  File "***\\repo\h2ogpt\generate.py", line 12, in entrypoint_main
    H2O_Fire(main)
  File "***\\repo\h2ogpt\src\utils.py", line 64, in H2O_Fire
    fire.Fire(component=component, command=args)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "***\\repo\h2ogpt\src\gen.py", line 1264, in main
    from src.gpt_langchain import get_embedding
  File "***\\repo\h2ogpt\src\gpt_langchain.py", line 43, in <module>
    from langchain.tools import PythonREPLTool
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\langchain\tools\__init__.py", line 819, in __getattr__
    return _import_python_tool_PythonREPLTool()
  File "***\\AppData\Local\miniconda3\envs\h2ogpt\lib\site-packages\langchain\tools\__init__.py", line 465, in _import_python_tool_PythonREPLTool
    raise ImportError(
ImportError: This tool has been moved to langchain experiment. This tool has access to a python REPL. For best practices make sure to sandbox this tool. Read https://github.com/langchain-ai/langchain/blob/master/SECURITY.md To keep using this code as is, install langchain experimental and update relevant imports replacing 'langchain' with 'langchain_experimental'

If I changed the import based on the error's suggestion above, it didn't work either:


    from langchain_experimental.tools.json.tool import JsonSpec
ModuleNotFoundError: No module named 'langchain_experimental.tools.json'
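
For reference, only some tools moved to langchain-experimental, so a blanket rename of 'langchain' to 'langchain_experimental' breaks imports of tools that stayed put. A rough sketch of the split (the module paths are my assumption for these 0.0.3x releases, not taken from h2oGPT):

# Sketch only; module paths assumed for langchain 0.0.3x / langchain-experimental 0.0.3x.
from langchain.tools.json.tool import JsonSpec                        # JSON tools stayed in langchain
from langchain_experimental.tools.python.tool import PythonREPLTool   # the Python REPL tool moved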
pseudotensor commented 11 months ago

Ya latest won't work. But what did you get when you used

langchain==0.0.321
langchain-experimental==0.0.33
hpxiong commented 11 months ago

langchain==0.0.321 langchain-experimental==0.0.33

👍 This is working now. But strangely, when I updated to this combination earlier, it failed; now it works. Maybe a caching issue?

pseudotensor commented 11 months ago

Unsure, but glad it works, thanks for checking!

hpxiong commented 11 months ago

It looks like the latest commits break the code again. The above langchain-experimental 0.0.33 does not work anymore and needs to be updated to 0.0.42.

Also, commit 2da43c256 causes the following error. I played with different pydantic versions with no success.

@pseudotensor any suggestions?

 Traceback (most recent call last):
  File "\h2ogpt\generate.py", line 16, in <module>
    entrypoint_main()
  File "\h2ogpt\generate.py", line 12, in entrypoint_main
    H2O_Fire(main)
  File "\h2ogpt\src\utils.py", line 65, in H2O_Fire
    fire.Fire(component=component, command=args)
  File "\miniconda3\envs\h2oGPT\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "\miniconda3\envs\h2oGPT\lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "\miniconda3\envs\h2oGPT\lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "\h2ogpt\src\gen.py", line 1392, in main
    from src.gpt_langchain import get_embedding
  File "\h2ogpt\src\gpt_langchain.py", line 498, in <module>
    class GradioInference(H2Oagenerate, LLM):
  File "\h2ogpt\src\gpt_langchain.py", line 546, in GradioInference
    @root_validator()
  File "\miniconda3\envs\h2oGPT\lib\site-packages\pydantic\deprecated\class_validators.py", line 228, in root_validator
    raise PydanticUserError(
pydantic.errors.PydanticUserError: If you use `@root_validator` with pre=False (the default) you MUST specify `skip_on_failure=True`. Note that `@root_validator` is deprecated and should be replaced with `@model_validator`.
pseudotensor commented 11 months ago

I'm sure it's the same issue you originally had. You are using versions of pydantic, langchain, etc. that are not compatible with h2oGPT. That particular commit has nothing to do with the error you see.
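
A minimal standalone reproduction (not h2oGPT code) showing this is purely a pydantic-version issue: under pydantic 2.x the deprecated @root_validator shim refuses pre=False without skip_on_failure=True, while the same class definition runs fine on pydantic 1.10.x.

# Standalone sketch: fails at class-definition time on pydantic>=2 with the
# same PydanticUserError as above; runs without error on pydantic 1.10.x.
from pydantic import BaseModel, root_validator

class Example(BaseModel):
    value: int = 0

    @root_validator()  # pydantic 2.x: PydanticUserError unless skip_on_failure=True
    def check_values(cls, values):
        return values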

hpxiong commented 11 months ago

I'm sure it's the same issue you originally had. You are using versions of pydantic, langchain, etc. that are not compatible with h2oGPT. That particular commit has nothing to do with the error you see.

I agree this is because of the libraries. I tried running the setup steps again with no success. I will do a fresh start and try one more time.

pseudotensor commented 11 months ago

We don't constrain pydantic; we depend upon langchain to do that. Your pip install probably complains with ERRORS, but it does nothing about them.

You can try doing pip uninstall -y pydantic ; pip install pydantic==1.10.13 and it may fix that particular problem.
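
After the reinstall, a quick standalone check confirms which pydantic the env actually imports:

# Sketch only: confirm the env now resolves to the pinned 1.x pydantic.
import pydantic
print(pydantic.VERSION)  # expect '1.10.13' after the reinstall above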