snexus / llm-search

Querying local documents, powered by LLM
MIT License
519 stars · 60 forks

ValidationError: 1 validation error for Config llm -> params -> model_name field required (type=value_error.missing) #76

Closed MyraBaba closed 10 months ago

MyraBaba commented 11 months ago

Hi,

Sorry for the possibly newbie question. I successfully indexed the PDFs and tried to run the system. I downloaded the suggested airoboros-l2-13b-gpt4-1.4.1.Q4_K_M.gguf model and attached my config.

I get the errors below.

config_template.yaml.txt

```
ValidationError: 1 validation error for Config
llm -> params -> model_name
  field required (type=value_error.missing)

Traceback:
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 530, in _run_script
    self._session_state.on_script_will_rerun(rerun_data.widget_states)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/state/safe_session_state.py", line 61, in on_script_will_rerun
    self._state.on_script_will_rerun(latest_widget_states)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 500, in on_script_will_rerun
    self._call_callbacks()
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 513, in _call_callbacks
    self._new_widget_state.call_callback(wid)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/state/session_state.py", line 260, in call_callback
    callback(*args, **kwargs)
  File "/home/bc/Projects/OpenSource/llm-search/src/llmsearch/webapp.py", line 167, in reload_model
    config = load_config(config_file)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 212, in wrapper
    return cached_func(*args, **kwargs)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 241, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 267, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "/home/bc/Projects/OpenSource/llm-search/venvLLMSearch/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 321, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "/home/bc/Projects/OpenSource/llm-search/src/llmsearch/webapp.py", line 112, in load_config
    return Config(**config_dict)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
```

snexus commented 11 months ago

Hi,

For local models in .gguf format, you need to use llamacpp as the model type, like here - https://github.com/snexus/llm-search/blob/d0f756df9fae8ec8786550b0fdcd94c8306f5589/sample_templates/generic/config_template.yaml#L83
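For reference, the relevant part of the config would look roughly like this (a sketch only; the path is illustrative and the exact parameter names should be taken from the linked sample template):

```yaml
llm:
  type: llamacpp  # required for local .gguf models
  params:
    model_path: /path/to/airoboros-l2-13b-gpt4-1.4.1.Q4_K_M.gguf  # illustrative path
```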

Please let me know if it works.

snexus commented 10 months ago

If the problem still persists, please reopen.