OpenMOSS / CoLLiE

Collaborative Training of Large Language Models in an Efficient Way
https://openlmlab-collie.readthedocs.io
Apache License 2.0

Saving peft_config in trainer.py raises an error #159

Closed Mr-nnng closed 6 months ago

Mr-nnng commented 7 months ago

File ".../python3.10/site-packages/collie/controller/trainer.py", line 618, in save_peft json.dumps(peft_config.dict), File ".../python3.10/json/init.py", line 231, in dumps return _default_encoder.encode(obj) File ".../python3.10/json/encoder.py", line 199, in encode chunks = self.iterencode(o, _one_shot=True) File ".../python3.10/json/encoder.py", line 257, in iterencode return _iterencode(o, 0) File ".../python3.10/json/encoder.py", line 179, in default raise TypeError(f'Object of type {o.class.name} ' TypeError: Object of type set is not JSON serializable

Mr-nnng commented 7 months ago

config.peft_config: LoraConfig(peft_type=<PeftType.LORA: 'LORA'>, auto_mapping=None, base_model_name_or_path=None, revision=None, task_type=<TaskType.CAUSAL_LM: 'CAUSAL_LM'>, inference_mode=False, r=4, target_modules={'wqkv'}, lora_alpha=8, lora_dropout=0.05, fan_in_fan_out=False, bias='none', modules_to_save=None, init_lora_weights=True, layers_to_transform=None, layers_pattern=None, rank_pattern={}, alpha_pattern={}, megatron_config=None, megatron_core='megatron.core', loftq_config={})

Here target_modules is a set, so json.dumps fails on it (a minimal reproduction is sketched below).
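For reference, a minimal sketch of the failure. It assumes the peft version used here, which (as the repr above shows) stores target_modules as a set; the module name 'wqkv' is just the one from this report.

```python
import json
from peft import LoraConfig, TaskType

# This peft version normalizes target_modules to a set internally,
# and Python's json module cannot serialize sets.
peft_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=4,
    lora_alpha=8,
    lora_dropout=0.05,
    target_modules=["wqkv"],  # stored internally as {'wqkv'}
)

json.dumps(peft_config.__dict__)
# TypeError: Object of type set is not JSON serializable
```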

KaiLv69 commented 6 months ago

Sorry for the late reply. This bug has already been fixed in PR https://github.com/OpenMOSS/CoLLiE/pull/166.
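For anyone hitting this before picking up the fix, a generic workaround (not necessarily how PR #166 resolves it) is to give json.dumps a default handler that converts sets to lists. The helper and the sample dict below are illustrative only:

```python
import json

def _jsonable(obj):
    # Fallback for json.dumps: serialize sets as sorted lists.
    if isinstance(obj, set):
        return sorted(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

# Illustrative stand-in for peft_config.__dict__, which contains a set.
config_dict = {"target_modules": {"wqkv"}, "r": 4, "lora_alpha": 8}
print(json.dumps(config_dict, default=_jsonable))
# {"target_modules": ["wqkv"], "r": 4, "lora_alpha": 8}
```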