baichuan-inc / Baichuan2

A series of large language models developed by Baichuan Intelligent Technology
https://huggingface.co/baichuan-inc
Apache License 2.0

Baichuan2-13B-Chat-4bits won't run #400

Open you567 opened 5 months ago

you567 commented 5 months ago

Hello, I'm on Windows with torch 2.2.2+cu121 and Python 3.10, and I installed the dependencies from requirements.txt. Here is the code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

path = "D:/pythonworkspace/MyPET/model/Baichuan2-13B-Chat-4bits/"
tokenizer = AutoTokenizer.from_pretrained(path, revision="v2.0", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(path, revision="v2.0", device_map="auto", torch_dtype=torch.bfloat16, trust_remote_code=True)

model.generation_config = GenerationConfig.from_pretrained(path, revision="v2.0")
messages = []
messages.append({"role": "user", "content": "解释一下“温故而知新”"})
response = model.chat(tokenizer, messages)
print(response)
```

The error:

```
D:\program\anaconda\envs\lmdeploy\lib\site-packages\transformers\utils\generic.py:311: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
Traceback (most recent call last):
  File "d:\pythonworkspace\MyPET\baichuan.py", line 13, in <module>
    model = AutoModelForCausalLM.from_pretrained(path,
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\transformers\models\auto\auto_factory.py", line 558, in from_pretrained
    return model_class.from_pretrained(
  File "C:\Users\ASUS/.cache\huggingface\modules\transformers_modules\modeling_baichuan.py", line 664, in from_pretrained
    dispatch_model(model, device_map=device_map)
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\accelerate\big_modeling.py", line 351, in dispatch_model
    check_device_map(model, device_map)
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\accelerate\utils\modeling.py", line 1380, in check_device_map
    all_model_tensors = [name for name, _ in model.state_dict().items()]
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\torch\nn\modules\module.py", line 1895, in state_dict
    module.state_dict(destination=destination, prefix=prefix + name + '.', keep_vars=keep_vars)
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\torch\nn\modules\module.py", line 1895, in state_dict
    module.state_dict(destination=destination, prefix=prefix + name + '.', keep_vars=keep_vars)
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\torch\nn\modules\module.py", line 1895, in state_dict
    module.state_dict(destination=destination, prefix=prefix + name + '.', keep_vars=keep_vars)
  [Previous line repeated 2 more times]
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\torch\nn\modules\module.py", line 1892, in state_dict
    self._save_to_state_dict(destination, prefix, keep_vars)
  File "D:\program\anaconda\envs\lmdeploy\lib\site-packages\bitsandbytes\nn\modules.py", line 402, in _save_to_state_dict
    for k, v in self.weight.quant_state.as_dict(packed=True).items():
AttributeError: 'list' object has no attribute 'as_dict'
```

sunjinguo commented 5 months ago

This is a package version incompatibility; I ran into the same problem. The fix: accelerate-0.25.0 and bitsandbytes-0.41.1.
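The AttributeError in the traceback comes from bitsandbytes' `_save_to_state_dict` calling `quant_state.as_dict(...)`, which suggests the installed bitsandbytes expects a newer quant_state object while the pre-quantized 4-bit checkpoint provides it as a plain list; that is presumably why downgrading helps. The pins can be applied with `pip install accelerate==0.25.0 bitsandbytes==0.41.1` and double-checked from Python before loading the model. A minimal sketch (the expected versions are just the ones suggested above, not an official requirement):

```python
# Suggested pins from the comment above (an assumption, not an official requirement):
#   pip install accelerate==0.25.0 bitsandbytes==0.41.1
from importlib.metadata import version, PackageNotFoundError

EXPECTED = {"accelerate": "0.25.0", "bitsandbytes": "0.41.1"}

for pkg, wanted in EXPECTED.items():
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        installed = "not installed"
    status = "OK" if installed == wanted else "MISMATCH"
    print(f"{pkg}: installed {installed}, expected {wanted} -> {status}")
```

If either line prints MISMATCH, reinstall the pinned versions in the same conda environment (`lmdeploy` in the traceback) and retry the model load.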

you567 commented 5 months ago

> This is a package version incompatibility; I ran into the same problem. The fix: accelerate-0.25.0 and bitsandbytes-0.41.1.

Oh, got it. Thank you very much!