SmartFlowAI / Llama3-Tutorial

Llama3-Tutorial(XTuner、LMDeploy、OpenCompass)

safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge #15

Closed ucsdzehualiu closed 4 months ago

ucsdzehualiu commented 4 months ago

load model begin.
Loading checkpoint shards:   0%|          | 0/4 [00:00<?, ?it/s]
2024-04-26 09:50:49.558 Uncaught app exception
Traceback (most recent call last):
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
  File "/root/Llama3-XTuner-CN/tools/internstudio_web_demo.py", line 274, in <module>
    main(arg1)
  File "/root/Llama3-XTuner-CN/tools/internstudio_web_demo.py", line 222, in main
    model, tokenizer = load_model(arg1)
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 168, in wrapper
    return cached_func(*args, **kwargs)
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 197, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 224, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 280, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "/root/Llama3-XTuner-CN/tools/internstudio_web_demo.py", line 174, in load_model
    model = AutoModelForCausalLM.from_pretrained(arg1, torch_dtype=torch.float16).cuda()
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3677, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4084, in _load_pretrained_model
    state_dict = load_state_dict(shard_file, is_quantized=is_quantized)
  File "/root/anaconda3/envs/llama3/lib/python3.10/site-packages/transformers/modeling_utils.py", line 507, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
^C Stopping...
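A HeaderTooLarge error from safetensors usually means the .safetensors shard on disk is not a valid safetensors file at all, most often a truncated download or a Git LFS pointer file left in place of the real weights. Below is a minimal diagnostic sketch (not from the original thread; the model directory path is illustrative) that reads the 8-byte header-length prefix of each shard, which is how safetensors decides whether the header is plausible:

```python
import glob
import struct

# Hypothetical model directory; substitute the path passed to from_pretrained().
model_dir = "/root/model/Meta-Llama-3-8B-Instruct"

for shard in sorted(glob.glob(f"{model_dir}/*.safetensors")):
    with open(shard, "rb") as f:
        prefix = f.read(8)
    if len(prefix) < 8:
        print(shard, "-> too small to be a safetensors file")
        continue
    # The first 8 bytes are a little-endian u64 giving the JSON header length.
    header_len = struct.unpack("<Q", prefix)[0]
    print(shard, "declared header size:", header_len, "bytes")
    # A healthy shard declares a header of a few KB to a few MB; a Git LFS
    # pointer begins with ASCII text, which decodes to an absurd number and
    # is what safetensors reports as HeaderTooLarge.
    if header_len > 100 * 1024 * 1024:
        print("  -> looks corrupted or is an LFS pointer; re-download this shard")
```

If the printed size is implausibly large, or a shard is only a few hundred bytes on disk, that file needs to be fetched again.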

ucsdzehualiu commented 4 months ago

I got this issue by following the deploy demo step by step.

ucsdzehualiu commented 4 months ago

I figured it out by re-downloading the model. Fixed.
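For reference, here is a minimal re-download sketch, assuming the weights come from the Hugging Face Hub; the repo_id and local_dir below are illustrative and not taken from this thread, so substitute whatever the tutorial step actually used (e.g. a ModelScope or shared InternStudio path):

```python
from huggingface_hub import snapshot_download

# Hypothetical repo_id and target directory for illustration only.
snapshot_download(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    local_dir="/root/model/Meta-Llama-3-8B-Instruct",
)
```

After the download finishes, re-run the web demo and confirm that all four checkpoint shards load past 0%.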