Open shiroko98 opened 3 months ago
Is the problem solved now?
Prerequisite
Type
I'm evaluating with the officially supported tasks/models/datasets.
Environment
None
Reproduces the problem - code/configuration sample
Config, modeled on configs/eval_internlm_flames_chat.py:

```python
judge_models = [
    dict(
        type=HuggingFaceCausalLM,
        abbr='flames-scorer',
        path='opencompass/models/flames-scorer',
        tokenizer_path='opencompass/models/flames-scorer',
        model_kwargs=dict(
            trust_remote_code=True,
            device_map='auto',
        ),
        tokenizer_kwargs=dict(
            padding_side='left',
            truncation_side='left',
            use_fast=False,
            trust_remote_code=True,
        ),
        max_out_len=2048,
        max_seq_len=2048,
        batch_size=1,
        meta_template=_meta_template,
        run_cfg=dict(num_gpus=1, num_procs=1),
        end_str='<|im_end|>',
        generation_kwargs={'eos_token_id': [2, 92542], 'do_sample': True},
        batch_padding=True,
    )
]
```
Reproduces the problem - command or script
None
Reproduces the problem - error message
```
Traceback (most recent call last):
  File "/mnt/data/Codes/opencompass/opencompass/tasks/subjective_eval.py", line 441, in <module>
    inferencer.run()
  File "/mnt/data/Codes/opencompass/opencompass/tasks/subjective_eval.py", line 94, in run
    self._score(model_cfg, dataset_cfg, eval_cfg, output_column,
  File "/mnt/data/Codes/opencompass/opencompass/tasks/subjective_eval.py", line 370, in _score
    icl_evaluator = ICL_EVALUATORS.build(eval_cfg['evaluator'])
  File "/mnt/data/anaconda3/envs/opencompass/lib/python3.10/site-packages/mmengine/registry/registry.py", line 570, in build
    return self.build_func(cfg, *args, **kwargs, registry=self)
  File "/mnt/data/anaconda3/envs/opencompass/lib/python3.10/site-packages/mmengine/registry/build_functions.py", line 121, in build_from_cfg
    obj = obj_cls(**args)  # type: ignore
  File "/mnt/data/Codes/opencompass/opencompass/openicl/icl_evaluator/lm_evaluator.py", line 107, in __init__
    model = build_model_from_cfg(model_cfg=judge_cfg)
  File "/mnt/data/Codes/opencompass/opencompass/utils/build.py", line 25, in build_model_from_cfg
    return MODELS.build(model_cfg)
  File "/mnt/data/anaconda3/envs/opencompass/lib/python3.10/site-packages/mmengine/registry/registry.py", line 570, in build
    return self.build_func(cfg, *args, **kwargs, registry=self)
  File "/mnt/data/anaconda3/envs/opencompass/lib/python3.10/site-packages/mmengine/registry/build_functions.py", line 121, in build_from_cfg
    obj = obj_cls(**args)  # type: ignore
  File "/mnt/data/Codes/opencompass/opencompass/models/huggingface.py", line 118, in __init__
    self._load_tokenizer(path=path,
  File "/mnt/data/Codes/opencompass/opencompass/models/huggingface.py", line 134, in _load_tokenizer
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/mnt/data/anaconda3/envs/opencompass/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 842, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.flames-scorer.configuration_internlm.InternLMConfig'> to build an AutoTokenizer.
Model type should be one of AlbertConfig, AlignConfig, BarkConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BlipConfig, Blip2Config, BloomConfig, BridgeTowerConfig, BrosConfig, CamembertConfig, CanineConfig, ChineseCLIPConfig, ClapConfig, CLIPConfig, CLIPSegConfig, ClvpConfig, LlamaConfig, CodeGenConfig, ConvBertConfig, CpmAntConfig, CTRLConfig, Data2VecAudioConfig, Data2VecTextConfig, DebertaConfig, DebertaV2Config, DistilBertConfig, DPRConfig, ElectraConfig, ErnieConfig, ErnieMConfig, EsmConfig, FalconConfig, FastSpeech2ConformerConfig, FlaubertConfig, FNetConfig, FSMTConfig, FunnelConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GPTSanJapaneseConfig, GroupViTConfig, HubertConfig, IBertConfig, IdeficsConfig, InstructBlipConfig, JukeboxConfig, Kosmos2Config, LayoutLMConfig, LayoutLMv2Config, LayoutLMv3Config, LEDConfig, LiltConfig, LlamaConfig, LlavaConfig, LongformerConfig, LongT5Config, LukeConfig, LxmertConfig, M2M100Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MgpstrConfig, MistralConfig, MixtralConfig, MobileBertConfig, MPNetConfig, MptConfig, MraConfig, MT5Config, MusicgenConfig, MvpConfig, NezhaConfig, NllbMoeConfig, NystromformerConfig, OneFormerConfig, OpenAIGPTConfig, OPTConfig, Owlv2Config, OwlViTConfig, PegasusConfig, PegasusXConfig, PerceiverConfig, PersimmonConfig, PhiConfig, Pix2StructConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, RagConfig, RealmConfig, ReformerConfig, RemBertConfig, RetriBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, SeamlessM4TConfig, SeamlessM4Tv2Config, SiglipConfig, Speech2TextConfig, Speech2Text2Config, SpeechT5Config, SplinterConfig, SqueezeBertConfig, SwitchTransformersConfig, T5Config, TapasConfig, TransfoXLConfig, TvpConfig, UMT5Config, ViltConfig, VipLlavaConfig, VisualBertConfig, VitsConfig, Wav2Vec2Config, Wav2Vec2BertConfig, Wav2Vec2ConformerConfig, WhisperConfig, XCLIPConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig, YosoConfig.
[2024-06-25 17:10:32,911] torch.distributed.elastic.multiprocessing.api: [ERROR] failed (exitcode: 1) local_rank: 0 (pid: 717540) of binary: /mnt/data/anaconda3/envs/opencompass/bin/python
```
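For context on why this raises: `AutoTokenizer.from_pretrained` dispatches on the *class* of the loaded config object, and the remote-code `InternLMConfig` is not in transformers' built-in config-to-tokenizer mapping, even though `trust_remote_code=True` let the config itself load. A minimal conceptual sketch of that dispatch (not the real transformers internals; the class and mapping names here are illustrative):

```python
# Illustrative stand-ins for config classes; InternLMConfig plays the role
# of a custom remote-code config that transformers knows nothing about.
class LlamaConfig: ...
class InternLMConfig: ...

# Simplified stand-in for transformers' internal TOKENIZER_MAPPING.
TOKENIZER_MAPPING = {LlamaConfig: "LlamaTokenizer"}

def auto_tokenizer_for(config):
    """Pick a tokenizer class by the config object's type, like AutoTokenizer."""
    tokenizer_cls = TOKENIZER_MAPPING.get(type(config))
    if tokenizer_cls is None:
        # Unregistered config classes fall through to the ValueError seen above.
        raise ValueError(
            f"Unrecognized configuration class {type(config).__name__} "
            "to build an AutoTokenizer.")
    return tokenizer_cls
```

Under this model, `auto_tokenizer_for(LlamaConfig())` succeeds while `auto_tokenizer_for(InternLMConfig())` raises, which matches the traceback.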
Other information
According to the model card on the official repo (https://huggingface.co/CaasiHUANG/flames-scorer), it seems the model has to be loaded through its own remote-code classes:

```python
from tokenization_internlm import InternLMTokenizer
from modeling_internlm import InternLMForSequenceClassification

tokenizer = InternLMTokenizer.from_pretrained("CaasiHUANG/flames-scorer", trust_remote_code=True)
model = InternLMForSequenceClassification.from_pretrained("CaasiHUANG/flames-scorer", trust_remote_code=True)
```
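If the model indeed only loads through its own classes, one possible direction (an untested sketch, not a confirmed fix) is to subclass OpenCompass's `HuggingFaceCausalLM` and override `_load_tokenizer` (the method that fails in the traceback) so it instantiates `InternLMTokenizer` directly instead of going through `AutoTokenizer`. The stub base class below stands in for the real OpenCompass class so the pattern is self-contained; `FlamesScorer` is a hypothetical name:

```python
class HuggingFaceCausalLM:
    """Stand-in for opencompass.models.HuggingFaceCausalLM (sketch only)."""

    def __init__(self, path, tokenizer_path=None, tokenizer_kwargs=None):
        self._load_tokenizer(path, tokenizer_path, tokenizer_kwargs or {})

    def _load_tokenizer(self, path, tokenizer_path, tokenizer_kwargs):
        # The real implementation calls AutoTokenizer.from_pretrained here,
        # which is exactly what raises the ValueError for InternLMConfig.
        raise ValueError("Unrecognized configuration class ...")


class FlamesScorer(HuggingFaceCausalLM):
    """Hypothetical wrapper that bypasses AutoTokenizer for flames-scorer."""

    def _load_tokenizer(self, path, tokenizer_path, tokenizer_kwargs):
        # In a real subclass of the actual OpenCompass class, this would do:
        #   from tokenization_internlm import InternLMTokenizer
        #   self.tokenizer = InternLMTokenizer.from_pretrained(
        #       tokenizer_path or path, trust_remote_code=True,
        #       **tokenizer_kwargs)
        # Recorded as a string here so the override pattern stays runnable.
        self.tokenizer = f"InternLMTokenizer({tokenizer_path or path})"
```

The config would then reference `type=FlamesScorer` in place of `type=HuggingFaceCausalLM`, leaving the rest of the judge model entry unchanged.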