Closed linrl3 closed 1 year ago
The error is as follows:

```
p = pipeline(task="text-generation", model="FlagAlpha/Llama2-Chinese-13b-Chat"
  File "/data01/nlp/.env/lib/python3.7/site-packages/transformers/pipelines/__init__.py", line 795, in pipeline
    **model_kwargs,
  File "/data01/nlp/.env/lib/python3.7/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model FlagAlpha/Llama2-Chinese-13b-Chat with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForCausalLM'>, <class 'transformers.models.llama.modeling_llama.LlamaForCausalLM'>).
```
The code being run:

```python
from transformers import pipeline

p = pipeline(task="text-generation", model="FlagAlpha/Llama2-Chinese-13b-Chat")
```
Is there any way to solve this?
You can't pass `model="FlagAlpha/Llama2-Chinese-13b-Chat"` directly. You need to download the model to a local directory first, then point `model` at that local path.
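A minimal sketch of that workaround, assuming `huggingface_hub` is installed and the local directory name `./Llama2-Chinese-13b-Chat` is just an example choice (the imports are done lazily so the helpers can be defined without the heavy libraries loaded):

```python
def download_model(model_id: str = "FlagAlpha/Llama2-Chinese-13b-Chat",
                   local_dir: str = "./Llama2-Chinese-13b-Chat") -> str:
    """Download every file of the model repo to a local directory
    and return the local path."""
    from huggingface_hub import snapshot_download  # lazy import
    return snapshot_download(repo_id=model_id, local_dir=local_dir)

def load_local_chat_pipeline(local_dir: str = "./Llama2-Chinese-13b-Chat"):
    """Build the text-generation pipeline from the *local* directory
    instead of the hub id, which is what avoids the load error."""
    from transformers import pipeline  # lazy import
    return pipeline(task="text-generation", model=local_dir)
```

Usage would then be `path = download_model()` followed by `p = load_local_chat_pipeline(path)`; note the 13B checkpoint is tens of GB, so the download step takes a while.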