是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
- [X] 我已经搜索过已有的issues和讨论 | I have searched the existing issues / discussions
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
- [X] 我已经搜索过FAQ | I have searched FAQ
当前行为 | Current Behavior
```
load_model_and_tokenizer from D:\LLM\models\LLM\OpenBMB\minicpm-v2.5
2024-05-28 23:14:15.301 Uncaught app exception
Traceback (most recent call last):
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "D:\LLM\MiniCPM-V\web_demo_streamlit-2_5.py", line 32, in <module>
    st.session_state.model, st.session_state.tokenizer = load_model_and_tokenizer()
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 165, in wrapper
    return cached_func(*args, **kwargs)
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 194, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 221, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 277, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "D:\LLM\MiniCPM-V\web_demo_streamlit-2_5.py", line 25, in load_model_and_tokenizer
    model = AutoModel.from_pretrained(model_path, trust_remote_code=True, torch_dtype=torch.float16).to(device="cuda")
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\transformers\models\auto\auto_factory.py", line 550, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\transformers\dynamic_module_utils.py", line 501, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module)
  File "D:\LLM\MiniCPM-V\runtime\lib\site-packages\transformers\dynamic_module_utils.py", line 201, in get_class_in_module
    module = importlib.machinery.SourceFileLoader(name, module_path).load_module()
  File "", line 548, in _check_name_wrapper
  File "", line 1063, in load_module
  File "", line 888, in load_module
  File "", line 290, in _load_module_shim
  File "", line 719, in _load
  File "", line 688, in _load_unlocked
  File "", line 883, in exec_module
  File "", line 241, in _call_with_frames_removed
  File "C:\Users\31940\.cache\huggingface\modules\transformers_modules\minicpm-v2.5\modeling_minicpmv.py", line 14, in <module>
    from .resampler import Resampler
ModuleNotFoundError: No module named 'transformers_modules.minicpm-v2'
```
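For context (not part of the original report, a sketch of the suspected cause): with `trust_remote_code=True`, `transformers` derives the dynamic module name from the local folder name, so a directory named `minicpm-v2.5` appears to yield the module path `transformers_modules.minicpm-v2.5.modeling_minicpmv`. Python splits module paths on `.`, so the `.5` is treated as a submodule boundary:

```python
# Suspected root cause (assumption: folder name taken from the traceback above,
# not verified against the transformers source).
# Python splits module paths on ".", so a folder name containing a dot
# produces a bogus intermediate package:
module_name = "transformers_modules.minicpm-v2.5.modeling_minicpmv"
print(module_name.split("."))
# The import machinery therefore looks for a package named
# "transformers_modules.minicpm-v2", which does not exist --
# matching the ModuleNotFoundError in the traceback.
```

If that is indeed the cause, renaming the local model directory to a dot-free name (e.g. `minicpm-v2_5`) and updating `model_path` accordingly would be the usual workaround.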
期望行为 | Expected Behavior
No response
复现方法 | Steps To Reproduce
No response
运行环境 | Environment
备注 | Anything else?
No response