xagentllama-34b-preview fails to load with transformers: OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found #397
Issue Description
Please provide a detailed description of the error or issue you encountered.

xagentllama-34b-preview: the model package does not contain pytorch_model.bin.index.json, so loading it with transformers fails:

```
model = AutoModelForCausalLM.from_pretrained(
  File "/data/envs_qwen_xgentllama_service/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "/envs_qwen_xgentllama_service/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3206, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in
```
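For a sharded checkpoint, transformers first looks for pytorch_model.bin.index.json, which maps every parameter name to the shard file that stores it. A minimal sketch of that file's structure (the parameter names and total_size below are illustrative, not taken from this model):

```json
{
  "metadata": {
    "total_size": 165000000000
  },
  "weight_map": {
    "model.embed_tokens.weight": "pytorch_model-00001-of-00017.bin",
    "model.layers.0.self_attn.q_proj.weight": "pytorch_model-00001-of-00017.bin",
    "lm_head.weight": "pytorch_model-00017-of-00017.bin"
  }
}
```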
Model package (154 GB in total):

```
config.json
generation_config.json
pytorch_model-00001-of-00017.bin
pytorch_model-00002-of-00017.bin
pytorch_model-00003-of-00017.bin
pytorch_model-00004-of-00017.bin
pytorch_model-00005-of-00017.bin
pytorch_model-00006-of-00017.bin
pytorch_model-00007-of-00017.bin
pytorch_model-00008-of-00017.bin
pytorch_model-00009-of-00017.bin
pytorch_model-00010-of-00017.bin
pytorch_model-00011-of-00017.bin
pytorch_model-00012-of-00017.bin
pytorch_model-00013-of-00017.bin
pytorch_model-00014-of-00017.bin
pytorch_model-00015-of-00017.bin
pytorch_model-00016-of-00017.bin
pytorch_model-00017-of-00017.bin
README.md
special_tokens_map.json
tokenizer.json
tokenizer.model
tokenizer_config.json
```
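As a possible workaround, assuming the shard files themselves are complete, the missing index can be reconstructed locally by reading the parameter names out of each shard and writing the mapping back out. A hedged sketch (the helper name `build_weight_index` is mine, not part of transformers; each shard is fully loaded into RAM while it is scanned):

```python
import json
import os

import torch


def build_weight_index(model_dir: str) -> str:
    """Reconstruct pytorch_model.bin.index.json from sharded .bin files."""
    shards = sorted(
        f for f in os.listdir(model_dir)
        if f.startswith("pytorch_model-") and f.endswith(".bin")
    )
    weight_map = {}
    total_size = 0
    for shard in shards:
        # map_location="cpu" keeps the scan off the GPU
        state_dict = torch.load(os.path.join(model_dir, shard), map_location="cpu")
        for name, tensor in state_dict.items():
            weight_map[name] = shard
            total_size += tensor.numel() * tensor.element_size()
        del state_dict  # free the shard before loading the next one
    index = {"metadata": {"total_size": total_size}, "weight_map": weight_map}
    index_path = os.path.join(model_dir, "pytorch_model.bin.index.json")
    with open(index_path, "w") as f:
        json.dump(index, f, indent=2)
    return index_path
```

With the index written next to the shards, `from_pretrained` should then discover the sharded checkpoint, though an index supplied by the model authors is still preferable.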
Steps to Reproduce

Please provide the specific steps to reproduce the error.
Code:

```python
import os

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

model_name_or_path = "/data1/xagentllama-34b-preview/"
tokenizer_name_or_path = model_name_or_path

model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    local_files_only=True,
    # trust_remote_code=True,
    quantization_config=BitsAndBytesConfig(
        # quantization dtype settings
    ),
)

output = "/data1/lilei107/xagentllama-34b-preview-4bit"
if not os.path.exists(output):
    os.mkdir(output)

model.save_pretrained(output)
print("done")
exit(0)
```
Running this fails with:

```
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in
```
Neither the model package nor the Hugging Face repo appears to contain a weight-index file such as pytorch_model.bin.index.json. Could you provide this file?

Thanks!
Expected Behavior

Describe the behavior you expected to see.

Environment

Error Screenshots or Logs

If possible, please provide relevant screenshots or logs of the error.

Additional Notes

If you have any additional information or notes, please add them here.