datawhalechina / self-llm

"The Open-Source LLM Cookbook": quickly deploy open-source large language models in a Linux environment; a deployment tutorial tailored for users in China.
Apache License 2.0
9.32k stars 1.08k forks

LLaMA3_1-8B-Instruct WebDemo deployment: error when opening the web page #235

Open vistar-terry opened 3 months ago

vistar-terry commented 3 months ago

(screenshot: 2024-08-09_00-28-40) How can I fix this?

Full traceback:

```
Traceback (most recent call last):
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 589, in _run_script
    exec(code, module.__dict__)
  File "C:\Users\Vistar\Desktop\llma3.1\chatBot.py", line 30, in <module>
    tokenizer, model = get_model()
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 168, in wrapper
    return cached_func(*args, **kwargs)
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 197, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 224, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 280, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "C:\Users\Vistar\Desktop\llma3.1\chatBot.py", line 25, in get_model
    model = AutoModelForCausalLM.from_pretrained(mode_name_or_path, rope_scaling=rope_scaling, torch_dtype=torch.bfloat16).cuda()
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\transformers\models\auto\auto_factory.py", line 524, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 989, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\transformers\configuration_utils.py", line 772, in from_dict
    config = cls(**config_dict)
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 161, in __init__
    self._rope_scaling_validation()
  File "C:\Users\Vistar\.conda\envs\py312cu121\Lib\site-packages\transformers\models\llama\configuration_llama.py", line 182, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
```
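The error comes from the `rope_scaling` validation in older transformers releases, which only accept a two-field dictionary, while the Llama 3.1 config ships the newer `llama3` format. The check can be sketched roughly like this (a minimal reconstruction for illustration, not the actual transformers source):

```python
# Sketch (not the real transformers code) of the pre-4.43 LlamaConfig
# `rope_scaling` validation that raises the ValueError above.
def old_rope_scaling_is_valid(rope_scaling) -> bool:
    # Older releases require exactly two keys: `type` and `factor`.
    if not isinstance(rope_scaling, dict) or len(rope_scaling) != 2:
        return False
    return "type" in rope_scaling and "factor" in rope_scaling

# The Llama 3.1 config uses the newer `llama3` format, which the old check rejects.
llama31_rope_scaling = {
    "factor": 8.0,
    "low_freq_factor": 1.0,
    "high_freq_factor": 4.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",
}

print(old_rope_scaling_is_valid(llama31_rope_scaling))               # → False
print(old_rope_scaling_is_valid({"type": "linear", "factor": 2.0}))  # → True
```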

KMnO4-zx commented 3 months ago

Your transformers version is wrong; please keep it consistent with the version used in the tutorial.

vistar-terry commented 3 months ago

It is the same as the tutorial's. (screenshot: 2024-08-09_21-13-54)

KMnO4-zx commented 3 months ago

https://www.codewithgpu.com/i/datawhalechina/self-llm/self-llm-llama3.1

Could you try the image above? The required transformers version may also be 4.43.1.

zhanghhyyaa commented 2 months ago

Upgrading transformers fixes it: `pip install --upgrade transformers`
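A quick way to tell whether an installed version is new enough is a simple version comparison. This is a sketch; it assumes 4.43.0 as the first release that accepts the `llama3` rope_scaling format, based on the 4.43.1 version mentioned in this thread:

```python
# Assumed minimum transformers release for the `llama3` rope_scaling format,
# inferred from this thread (4.43.1 is suggested above).
MINIMUM = "4.43.0"

def needs_upgrade(installed: str, minimum: str = MINIMUM) -> bool:
    # Compare dotted version strings numerically (major, minor, patch),
    # ignoring any pre-release suffix.
    def to_tuple(v: str):
        return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())
    return to_tuple(installed) < to_tuple(minimum)

print(needs_upgrade("4.42.4"))  # → True: run `pip install --upgrade transformers`
print(needs_upgrade("4.43.1"))  # → False: version is fine
```

You can get the installed version with `python -c "import transformers; print(transformers.__version__)"`.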