Open cgq0816 opened 1 year ago
How I load the model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

config = GenerationConfig.from_pretrained("/data/model/Baichuan-13B-Base")
model = AutoModelForCausalLM.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,
)
model.generation_config = config
tokenizer = AutoTokenizer.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    use_fast=False,
    trust_remote_code=True,
)
```

I also applied the settings suggested in other issues, but I still get the error below. How can I use Baichuan-13B-Base for question answering?
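For reference: Baichuan-13B-Base is a pretrained base model, not the chat-tuned variant, so it has no `chat()` method; the usual approach is to phrase the question as a text completion and call `generate()` directly. Below is a minimal sketch, assuming the `model` and `tokenizer` loaded above; the prompt wording and sampling parameters are illustrative, not an official recipe.

```python
# A minimal sketch of base-model Q&A via plain text completion,
# assuming `model` and `tokenizer` from the snippet above loaded successfully.
# A base model continues text, so the question is framed as a completion prompt.
prompt = "Question: What is the capital of France?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=64,       # cap the length of the completion
    do_sample=True,          # sample instead of greedy decoding
    temperature=0.7,         # illustrative sampling parameters
    top_p=0.9,
    repetition_penalty=1.1,  # base models tend to repeat without this
)

# Decode only the newly generated tokens, dropping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that because the base model is not instruction-tuned, it may ramble past the answer; trimming the output at the first newline, or few-shot prompting with a couple of example Q&A pairs, tends to help.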
I hit the same error. Is there a fix for this?
Same question here: how do you do Q&A with the base model?