Now that the new version of xtuner has added dispatch, is fine-tuning chatglm3-6b no longer supported?
File "/mnt/afs/xtuner/xtuner/model/sft.py", line 93, in __init__
    dispatch_modules(self.llm, use_varlen_attn=use_varlen_attn)
File "/mnt/afs/xtuner/xtuner/model/modules/dispatch/__init__.py", line 266, in dispatch_modules
    check(type(model).__name__)
File "/mnt/afs/xtuner/xtuner/model/modules/dispatch/__init__.py", line 261, in check
    assert TRANSFORMERS_VERSION >= LOWEST_TRANSFORMERS_VERSION[
KeyError: 'ChatGLMForConditionalGeneration'
I took a look, and ChatGLM is indeed missing from the table:

LOWEST_TRANSFORMERS_VERSION = dict(
    InternLM2ForCausalLM=digit_version('4.36'),
    InternLMForCausalLM=digit_version('4.36'),
    LlamaForCausalLM=digit_version('4.36'),
    Phi3ForCausalLM=digit_version('4.39'),
    MistralForCausalLM=digit_version('4.36'),
    # Training mixtral with lower version may lead to nccl timeout
)
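Note that the failure is not a version mismatch: because `ChatGLMForConditionalGeneration` has no entry in `LOWEST_TRANSFORMERS_VERSION`, the dict subscript raises `KeyError` before the `assert` comparison ever runs. A minimal sketch of the failure mode (the version tuples and `check` body here are illustrative, not xtuner's actual code):

```python
# Illustrative stand-in for xtuner's dispatch table; versions are
# simplified to tuples instead of digit_version objects.
LOWEST_TRANSFORMERS_VERSION = {
    "InternLM2ForCausalLM": (4, 36),
    "LlamaForCausalLM": (4, 36),
}

TRANSFORMERS_VERSION = (4, 40)  # assumed installed version


def check(model_name):
    # If model_name is not a listed architecture, the lookup itself
    # raises KeyError -- the >= comparison is never reached.
    assert TRANSFORMERS_VERSION >= LOWEST_TRANSFORMERS_VERSION[model_name]


check("LlamaForCausalLM")  # supported architecture: passes

try:
    check("ChatGLMForConditionalGeneration")
except KeyError as exc:
    print("KeyError:", exc)
```

A tolerant variant (my assumption about a possible fix, not something xtuner does) would be to look the model up with `dict.get` and skip the dispatch check for unlisted architectures instead of crashing.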