Open chenchenchen77 opened 4 weeks ago
Hi! Thank you for the great work. I want to run the LLaMA model, but when I run v2 I hit a bug (Mistral runs successfully, so I suspect it is related to the difference between `.pt` and `.safetensors` checkpoints):

```
Traceback (most recent call last):
  File "/data/test_baseline/baseline/icae/icae/code/icae_v2/fine_tuned_inference.py", line 33, in <module>
    state_dict = load_file(training_args.output_dir)
  File "/data/miniconda3/envs/ultragist/lib/python3.10/site-packages/safetensors/torch.py", line 313, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
```

Can you tell me how to solve this problem? Thank you!
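For context on the suspected `.pt` vs `.safetensors` mismatch: `HeaderTooLarge` typically means `safetensors.torch.load_file` was pointed at something that is not a safetensors file, e.g. a pickled `.pt`/`.bin` checkpoint or, as in the traceback above, a directory (`training_args.output_dir`) rather than a file. A minimal sketch of a loader that dispatches on the checkpoint format (the candidate filenames inside the output directory are assumptions, not something confirmed in this thread):

```python
import os
import torch

def load_state_dict(path):
    """Load a checkpoint, dispatching on file format.

    HeaderTooLarge usually means safetensors was handed a path that is
    not actually a .safetensors file (e.g. a pickle checkpoint or a
    directory).
    """
    if os.path.isdir(path):
        # load_file expects a file, not an output directory; look for a
        # checkpoint file inside it (these names are assumptions).
        for name in ("model.safetensors", "pytorch_model.bin"):
            candidate = os.path.join(path, name)
            if os.path.exists(candidate):
                path = candidate
                break
    if path.endswith(".safetensors"):
        from safetensors.torch import load_file
        return load_file(path)
    # Fall back to torch.load for .pt / .bin pickle checkpoints.
    return torch.load(path, map_location="cpu", weights_only=True)
```

This is only a sketch of the workaround the reporter hints at; the actual fix may simply be passing the full checkpoint file path instead of `training_args.output_dir` to `load_file`.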
Have you solved the problem?