THUDM / VisualGLM-6B

Chinese and English multimodal conversational language model
Apache License 2.0

The following error occurs when running web_demo.py #144

Closed wangzijian1010 closed 1 year ago

wangzijian1010 commented 1 year ago

Environment:

Running python web_demo.py from the command line produces an error; the full output is below:

[2023-06-27 15:16:09,645] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2023-06-27 15:16:09,959] [INFO] building VisualGLMModel model ...
[2023-06-27 15:16:09,969] [INFO] [RANK 0] > initializing model parallel with size 1
[2023-06-27 15:16:09,970] [INFO] [RANK 0] You are using model-only mode. For torch.distributed users or loading model parallel models, set environment variables RANK, WORLD_SIZE and LOCAL_RANK.
/home/ai-tets/miniconda3/envs/vgpt/lib/python3.10/site-packages/torch/nn/init.py:405: UserWarning: Initializing zero-element tensors is a no-op
  warnings.warn("Initializing zero-element tensors is a no-op")
[2023-06-27 15:16:15,605] [INFO] [RANK 0] > number of parameters on model parallel rank 0: 7810582016
[2023-06-27 15:16:18,510] [INFO] [RANK 0] global rank 0 is loading checkpoint /home/ai-tets/.sat_models/visualglm-6b/1/mp_rank_00_model_states.pt
[2023-06-27 15:16:24,839] [INFO] [RANK 0] > successfully loaded /home/ai-tets/.sat_models/visualglm-6b/1/mp_rank_00_model_states.pt
Traceback (most recent call last):
  File "/home/ai-tets/wangzijian/VisualGLM-6B/web_demo.py", line 129, in <module>
    main(args)
  File "/home/ai-tets/wangzijian/VisualGLM-6B/web_demo.py", line 83, in main
    model, tokenizer = get_infer_setting(gpu_device=0, quant=args.quant)
  File "/home/ai-tets/wangzijian/VisualGLM-6B/model/infer_util.py", line 28, in get_infer_setting
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  File "/home/ai-tets/miniconda3/envs/vgpt/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 719, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.THUDM.chatglm-6b.1d240ba371910e9282298d4592532d7f0f3e9f3e.configuration_chatglm.ChatGLMConfig'> to build an AutoTokenizer.
Model type should be one of AlbertConfig, AlignConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BlipConfig, Blip2Config, BloomConfig, BridgeTowerConfig, CamembertConfig, CanineConfig, ChineseCLIPConfig, ClapConfig, CLIPConfig, CLIPSegConfig, CodeGenConfig, ConvBertConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DebertaConfig, DebertaV2Config, DistilBertConfig, DPRConfig, ElectraConfig, ErnieConfig, ErnieMConfig, EsmConfig, FlaubertConfig, FNetConfig, FSMTConfig, FunnelConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GPTSanJapaneseConfig, GroupViTConfig, HubertConfig, IBertConfig, JukeboxConfig, LayoutLMConfig, LayoutLMv2Config, LayoutLMv3Config, LEDConfig, LiltConfig, LlamaConfig, LongformerConfig, LongT5Config, LukeConfig, LxmertConfig, M2M100Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MgpstrConfig, MobileBertConfig, MPNetConfig, MT5Config, MvpConfig, NezhaConfig, NllbMoeConfig, NystromformerConfig, OneFormerConfig, OpenAIGPTConfig, OPTConfig, OwlViTConfig, PegasusConfig, PegasusXConfig, PerceiverConfig, Pix2StructConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, RagConfig, RealmConfig, ReformerConfig, RemBertConfig, RetriBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2TextConfig, Speech2Text2Config, SpeechT5Config, SplinterConfig, SqueezeBertConfig, SwitchTransformersConfig, T5Config, TapasConfig, TransfoXLConfig, ViltConfig, VisualBertConfig, Wav2Vec2Config, Wav2Vec2ConformerConfig, WhisperConfig, XCLIPConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig, YosoConfig.

Hoping someone can help explain this.

Sleepychord commented 1 year ago

Could it be a transformers version issue?
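For anyone comparing setups, a minimal check of the installed transformers version (the exact version that works is an assumption here; the project's requirements.txt is the authoritative source):

    # Print the installed transformers version to compare against the
    # version pinned in VisualGLM-6B's requirements.txt.
    # (Assumption: ChatGLM-6B's custom tokenizer code is known to work
    #  with transformers around 4.27.x; treat the exact pin as unverified.)
    import transformers

    print(transformers.__version__)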

buzhihuoyefeng commented 1 year ago

Is there anything in particular to watch for with the transformers version? I'm on Python 3.8 and installed the dependencies per the official docs, and it still doesn't work.

wangzijian1010 commented 1 year ago

> Could it be a transformers version issue?

Thanks for the reply. It's solved now. It turned out some config files were missing; downloading config.json, ice_text.model, and tokenizer_config.json from the HF repo fixed it.
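For anyone hitting the same thing, a minimal sketch of fetching those three files with huggingface_hub (assuming the repo id THUDM/chatglm-6b and the default Hugging Face cache location):

    # Sketch: download the tokenizer-related files from the THUDM/chatglm-6b
    # HF repo into the local Hugging Face cache, where AutoTokenizer will
    # look for them. The filenames are the three mentioned above.
    from huggingface_hub import hf_hub_download

    for fname in ("config.json", "ice_text.model", "tokenizer_config.json"):
        path = hf_hub_download(repo_id="THUDM/chatglm-6b", filename=fname)
        print(fname, "->", path)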

wangzijian1010 commented 1 year ago

> I'm on Python 3.8 and installed the dependencies per the official docs, and it still doesn't work.

It's probably missing some config files. You can download the config files from the HF repo into your project's root directory.
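A sketch of loading the tokenizer from such a local directory (the path ./chatglm-6b is hypothetical; with trust_remote_code=True the directory also needs the repo's tokenization_chatglm.py so the custom tokenizer class can be resolved):

    # Sketch: load the ChatGLM tokenizer from a local folder containing
    # config.json, ice_text.model, tokenizer_config.json and the repo's
    # tokenization_chatglm.py. "./chatglm-6b" is a hypothetical path.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("./chatglm-6b", trust_remote_code=True)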

mMrBun commented 1 year ago

I'm hitting the same error. Where should the downloaded files be placed so they get loaded?

wangzijian1010 commented 1 year ago

> I'm hitting the same error. Where should the downloaded files be placed so they get loaded?

They are usually under /home/xxx/.cache/huggingface/hub/models--THUDM--chatglm-6b/snapshots/......
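If in doubt, the exact snapshot directory can be resolved programmatically (a sketch; snapshot_download also pulls any files missing from the cache):

    # Sketch: resolve (and, if necessary, download) the local snapshot
    # directory of THUDM/chatglm-6b inside the Hugging Face cache.
    from huggingface_hub import snapshot_download

    local_dir = snapshot_download(repo_id="THUDM/chatglm-6b")
    print(local_dir)  # e.g. ~/.cache/huggingface/hub/models--THUDM--chatglm-6b/snapshots/<hash>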