lightyear-turing / TuringMM-34B-Chat


'TuringMMConfig' object has no attribute 'mlp_bias'. Did you mean: 'no_bias'? #2

Open Wmp0720 opened 3 months ago

Wmp0720 commented 3 months ago

```python
self.tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False, trust_remote_code=True)
self.model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
self.model.generation_config = GenerationConfig.from_pretrained(model_path)
```

The line `self.model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)` fails at runtime with:

```
……
    self.model = AutoModelForCausalLM.from_pretrained("/home/4T/xumeijia/wangmeiping/TuringMM-34B-Chat/TuringMM-34B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
……
  File "……/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 193, in __init__
    self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size, bias=config.mlp_bias)
  File "……/lib/python3.12/site-packages/transformers/configuration_utils.py", line 264, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'TuringMMConfig' object has no attribute 'mlp_bias'. Did you mean: 'no_bias'?
```

How should I fix this? Could you take a look? (The model weights were downloaded to the server's local disk; after only changing the model path, the same code runs successfully with other models.)
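The traceback suggests the checkpoint's custom `TuringMMConfig` predates the `mlp_bias` field that newer `transformers` releases read in the LLaMA MLP layers. A minimal workaround sketch, assuming the model's remote code builds LLaMA-style MLPs and that `False` (the LLaMA default) is the intended value; `model_path` is a placeholder for the local checkpoint directory:

```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_path = "/path/to/TuringMM-34B-Chat"  # hypothetical local path

# Load the custom config first, then add the attribute that newer
# transformers code expects; False matches the LLaMA default.
config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
if not hasattr(config, "mlp_bias"):
    config.mlp_bias = False

tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    config=config,  # pass the patched config explicitly
    device_map="auto",
    torch_dtype=torch.float16,
    trust_remote_code=True,
)
```

Alternatively, pinning `transformers` to a release from before `mlp_bias` was added to the LLaMA config (it appeared around version 4.41) may also avoid the error, though that is untested here.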

Wmp0720 commented 3 months ago

Running run_chat_web.py as documented, with `python run_chat_web.py --checkpoint_path '/your-model-path'`, raises the same error.
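Since run_chat_web.py constructs the model itself, patching the config object in your own loading code won't help there. A hedged alternative is to add the missing key to the checkpoint's `config.json` once on disk, so every entry point picks it up; this sketch assumes `/your-model-path` is the same directory passed to `--checkpoint_path` and that `false` is the correct value:

```python
import json
import pathlib

# One-time patch: write "mlp_bias": false into the checkpoint's config.json.
cfg_path = pathlib.Path("/your-model-path") / "config.json"  # placeholder path
cfg = json.loads(cfg_path.read_text())
cfg.setdefault("mlp_bias", False)  # only add it if it's missing
cfg_path.write_text(json.dumps(cfg, indent=2, ensure_ascii=False))
```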