baichuan-inc / Baichuan-7B

A large-scale 7B pretraining language model developed by BaiChuan-Inc.
https://huggingface.co/baichuan-inc/baichuan-7B
Apache License 2.0

[Question] Inference call raises: Unrecognized configuration class <class 'transformers_modules.baichuan-inc.baichuan-7B.39916f64eb892ccdc1982b0eef845b3b8fd43f6b.configuration_baichuan.BaiChuanConfig'> to build an AutoTokenizer #85

Open BigBen7 opened 1 year ago

BigBen7 commented 1 year ago

Required prerequisites

Questions

ValueError: Unrecognized configuration class <class 'transformers_modules.baichuan-inc.baichuan-7B.39916f64eb892ccdc1982b0eef845b3b8fd43f6b.configuration_baichuan.BaiChuanConfig'> to build an AutoTokenizer.

Model type should be one of AlbertConfig, AlignConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BlipConfig, Blip2Config, BloomConfig, BridgeTowerConfig, CamembertConfig, CanineConfig, ChineseCLIPConfig, ClapConfig, CLIPConfig, CLIPSegConfig, CodeGenConfig, ConvBertConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DebertaConfig, DebertaV2Config, DistilBertConfig, DPRConfig, ElectraConfig, ErnieConfig, ErnieMConfig, EsmConfig, FlaubertConfig, FNetConfig, FSMTConfig, FunnelConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GPTSanJapaneseConfig, GroupViTConfig, HubertConfig, IBertConfig, JukeboxConfig, LayoutLMConfig, LayoutLMv2Config, LayoutLMv3Config, LEDConfig, LiltConfig, LlamaConfig, LongformerConfig, LongT5Config, LukeConfig, LxmertConfig, M2M100Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MgpstrConfig, MobileBertConfig, MPNetConfig, MT5Config, MvpConfig, NezhaConfig, NllbMoeConfig, NystromformerConfig, OneFormerConfig, OpenAIGPTConfig, OPTConfig, OwlViTConfig, PegasusConfig, PegasusXConfig, PerceiverConfig, Pix2StructConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, RagConfig, RealmConfig, ReformerConfig, RemBertConfig, RetriBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2TextConfig, Speech2Text2Config, SpeechT5Config, SplinterConfig, SqueezeBertConfig, SwitchTransformersConfig, T5Config, TapasConfig, TransfoXLConfig, ViltConfig, VisualBertConfig, Wav2Vec2Config, Wav2Vec2ConformerConfig, WhisperConfig, XCLIPConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig, YosoConfig.

Code (note: `model.generate` needs the unpacked tensors, i.e. `**inputs`, not the raw `BatchEncoding`):

```python
cache_dir = '**'
tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/baichuan-7B", trust_remote_code=True, cache_dir=cache_dir)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/baichuan-7B", device_map="auto", trust_remote_code=True, cache_dir=cache_dir)
inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```

Checklist

Cat-L commented 1 year ago

+1

evi-Genius commented 1 year ago

same error

yanzihan1 commented 1 year ago

+1

ggsddu7 commented 1 year ago

Check that tokenizer_config.json exists in ~/.cache/huggingface/hub/models--baichuan-inc--baichuan-7B
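Building on the check above, here is a minimal sketch (assuming the default cache location; adjust the path if you passed a custom `cache_dir`) that lists every cached file so you can confirm the remote-code files came down with the weights:

```python
import os

# Default Hugging Face cache location for this model; adjust if you used a
# custom cache_dir when calling from_pretrained.
cache_root = os.path.expanduser(
    "~/.cache/huggingface/hub/models--baichuan-inc--baichuan-7B"
)

# Walk the cache and print every file; tokenizer_config.json,
# configuration_baichuan.py, and tokenization_baichuan.py should all appear.
for dirpath, _dirnames, filenames in os.walk(cache_root):
    for name in filenames:
        print(os.path.join(dirpath, name))
```

If tokenizer_config.json is missing from the snapshot, AutoTokenizer cannot resolve the custom tokenizer class and falls back to the config-based lookup that raises the ValueError above.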

Zcc commented 1 year ago

The model was not fully downloaded; remember to use a proxy.
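If the download keeps failing behind a firewall, one option is to set the standard proxy environment variables before retrying `from_pretrained`; the HTTP downloads honor them. The proxy address below is a placeholder, not a recommendation:

```python
import os

# Placeholder proxy endpoint; replace with your actual proxy address.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:7890"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:7890"

# Then retry the download in the same process, e.g.:
# tokenizer = AutoTokenizer.from_pretrained(
#     "baichuan-inc/baichuan-7B", trust_remote_code=True)
```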

zdaoguang commented 1 year ago

+1

dragonbra commented 1 year ago

+1

dragonbra commented 1 year ago

Solved it: I reinstalled the sentencepiece package.
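Before force-reinstalling, a quick sanity check is to verify that sentencepiece can even be found; if it cannot, `pip install --force-reinstall sentencepiece` (the fix that worked above) is worth trying:

```python
import importlib.util

# The Baichuan tokenizer's remote code depends on sentencepiece; a missing
# or broken install can prevent the tokenizer class from loading at all.
spec = importlib.util.find_spec("sentencepiece")
print("sentencepiece installed:", spec is not None)
```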

rzou15 commented 1 year ago

+1

aleimu commented 1 year ago

I hit this same problem with Baichuan2. I have tried everything I could think of; the model files do not appear to be missing, and the pip dependencies are all at their latest versions.

zhangxiann commented 12 months ago

+1