LlamaFamily / Llama-Chinese

Llama Chinese community. The Llama3 online demo and fine-tuned models are now available, the latest Llama3 learning resources are collected in real time, and all code has been updated for Llama3. Building the best Chinese Llama LLM, fully open source and commercially usable.
https://llama.family

Could not load model {model} with any of the following classes: {class_tuple} #166

Closed linrl3 closed 1 year ago

linrl3 commented 1 year ago

The error is as follows:

    p = pipeline(task="text-generation", model="FlagAlpha/Llama2-Chinese-13b-Chat"
  File "/data01/nlp/.env/lib/python3.7/site-packages/transformers/pipelines/__init__.py", line 795, in pipeline
    **model_kwargs,
  File "/data01/nlp/.env/lib/python3.7/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model FlagAlpha/Llama2-Chinese-13b-Chat with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForCausalLM'>, <class 'transformers.models.llama.modeling_llama.LlamaForCausalLM'>).

The code I ran:

from transformers import pipeline
p = pipeline(task="text-generation", model="FlagAlpha/Llama2-Chinese-13b-Chat")

Is there any way to resolve this?

Rayrtfr commented 1 year ago


You can't point `model` directly at `"FlagAlpha/Llama2-Chinese-13b-Chat"`. Download the model to a local directory first, then pass that local directory as `model`.
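A minimal sketch of that workflow, assuming the repository has been cloned to a local folder (the path `./Llama2-Chinese-13b-Chat` and the helper `load_pipeline` are illustrative, not part of the project):

```python
# Sketch (untested against the real 13B weights): download the model first,
# e.g. with git-lfs:
#
#   git lfs install
#   git clone https://huggingface.co/FlagAlpha/Llama2-Chinese-13b-Chat
#
# then point the pipeline at the local directory instead of the Hub id.

import os

# Assumed local checkout path -- adjust to wherever you cloned the model.
LOCAL_DIR = "./Llama2-Chinese-13b-Chat"


def load_pipeline(model_dir: str):
    """Build a text-generation pipeline from a local model directory."""
    # Fail early with a clear message if the directory is missing,
    # instead of the opaque "Could not load model ..." error.
    if not os.path.isdir(model_dir):
        raise FileNotFoundError(f"model directory not found: {model_dir}")
    # Imported lazily so this module can be loaded without transformers.
    from transformers import pipeline
    return pipeline(task="text-generation", model=model_dir)


if __name__ == "__main__":
    p = load_pipeline(LOCAL_DIR)
    print(p("介绍一下北京", max_new_tokens=64)[0]["generated_text"])
```

Downloading first also sidesteps partial-download corruption, which is another common cause of the `Could not load model ... with any of the following classes` error.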