feifeiobama / OrthogonalDet

[CVPR 2024] Exploring Orthogonality in Open World Object Detection
28 stars · 3 forks

ValueError: loaded state dict has a different number of parameter groups #8

Open Melleaa opened 1 month ago

Melleaa commented 1 month ago

Dear author: After finishing task t1, I got this error: ValueError: loaded state dict has a different number of parameter groups. I tried your solution in #2, changing model_final.pth to model_0019999.pth, but I still got the error. I checked my log and found a warning at the beginning of task t1: fvcore.common.checkpoint WARNING: The checkpoint state_dict contains keys that are not used by the model: stem.fc.{bias, weight}. Could this be the cause? There were also warnings when running task t2. Here is the full output:

[08/08 19:05:53 fvcore.common.checkpoint]: [Checkpointer] Loading from output/M-OWODB/model_0019999.pth ...
WARNING [08/08 19:05:54 fvcore.common.checkpoint]: Some model parameters or buffers are not found in the checkpoint:
head.head_series.0.calibration.0.{bias, weight}
head.head_series.1.calibration.0.{bias, weight}
head.head_series.2.calibration.0.{bias, weight}
head.head_series.3.calibration.0.{bias, weight}
head.head_series.4.calibration.0.{bias, weight}
head.head_series.5.calibration.0.{bias, weight}
[08/08 19:05:54 fvcore.common.checkpoint]: Loading trainer from output/M-OWODB/model_0019999.pth ...
[08/08 19:05:54 d2.engine.hooks]: Loading scheduler from state_dict ...
Traceback (most recent call last):
  File "train_net.py", line 281, in <module>
    launch(
  File "/raid/cjp/detectron2/detectron2/engine/launch.py", line 84, in launch
    main_func(*args)
  File "train_net.py", line 272, in main
    trainer.resume_or_load(resume=args.resume)
  File "/raid/cjp/detectron2/detectron2/engine/defaults.py", line 414, in resume_or_load
    self.checkpointer.resume_or_load(self.cfg.MODEL.WEIGHTS, resume=resume)
  File "/home/dl/anaconda3/envs/cjp/lib/python3.8/site-packages/fvcore/common/checkpoint.py", line 225, in resume_or_load
    return self.load(path)
  File "/raid/cjp/detectron2/detectron2/checkpoint/detection_checkpoint.py", line 62, in load
    ret = super().load(path, *args, **kwargs)
  File "/home/dl/anaconda3/envs/cjp/lib/python3.8/site-packages/fvcore/common/checkpoint.py", line 166, in load
    obj.load_state_dict(checkpoint.pop(key))
  File "/raid/cjp/detectron2/detectron2/engine/defaults.py", line 507, in load_state_dict
    self._trainer.load_state_dict(state_dict["_trainer"])
  File "/raid/cjp/detectron2/detectron2/engine/train_loop.py", line 430, in load_state_dict
    self.optimizer.load_state_dict(state_dict["optimizer"])
  File "/home/dl/anaconda3/envs/cjp/lib/python3.8/site-packages/torch/optim/optimizer.py", line 196, in load_state_dict
    raise ValueError("loaded state dict has a different number of "
ValueError: loaded state dict has a different number of parameter groups
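
For context, PyTorch raises this error whenever the optimizer being restored was constructed with a different number of parameter groups than the optimizer that saved the state. A minimal standalone reproduction (unrelated to this repo's code) is:

```python
import torch

# Optimizer A: all parameters in a single group.
model_a = torch.nn.Linear(4, 4)
opt_a = torch.optim.SGD(model_a.parameters(), lr=0.1)

# Optimizer B: the same parameters split into two groups.
model_b = torch.nn.Linear(4, 4)
opt_b = torch.optim.SGD(
    [{"params": [model_b.weight]}, {"params": [model_b.bias]}], lr=0.1
)

# Restoring A's state into B fails: 1 group != 2 groups.
msg = ""
try:
    opt_b.load_state_dict(opt_a.state_dict())
except ValueError as err:
    msg = str(err)
print(msg)
```

This suggests (though I cannot confirm from the log alone) that the Task 2 model, which adds the calibration layers reported as missing above, builds its optimizer with a different group layout than the one saved during Task 1.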


Melleaa commented 1 month ago

Here is the full warning output when I run t1; I wonder what the problem is:

WARNING [08/08 20:59:59 fvcore.common.checkpoint]: Some model parameters or buffers are not found in the checkpoint:
alphas_cumprod
alphas_cumprod_prev
backbone.fpn_lateral2.{bias, weight}
backbone.fpn_lateral3.{bias, weight}
backbone.fpn_lateral4.{bias, weight}
backbone.fpn_lateral5.{bias, weight}
backbone.fpn_output2.{bias, weight}
backbone.fpn_output3.{bias, weight}
backbone.fpn_output4.{bias, weight}
backbone.fpn_output5.{bias, weight}
betas
head.head_series.0.bboxes_delta.{bias, weight}
head.head_series.0.class_logits.{bias, weight}
head.head_series.0.cls_module.0.weight
head.head_series.0.cls_module.1.{bias, weight}
head.head_series.0.inst_interact.dynamic_layer.{bias, weight}
head.head_series.0.inst_interact.norm1.{bias, weight}
head.head_series.0.inst_interact.norm2.{bias, weight}
head.head_series.0.inst_interact.norm3.{bias, weight}
head.head_series.0.inst_interact.out_layer.{bias, weight}
head.head_series.0.linear1.{bias, weight}
head.head_series.0.linear2.{bias, weight}
head.head_series.0.norm1.{bias, weight}
head.head_series.0.norm2.{bias, weight}
head.head_series.0.norm3.{bias, weight}
head.head_series.0.norm4.{bias, weight}
head.head_series.0.object_logit.{bias, running_mean, running_var, weight}
head.head_series.0.reg_module.0.weight
head.head_series.0.reg_module.1.{bias, weight}
head.head_series.0.reg_module.3.weight
head.head_series.0.reg_module.4.{bias, weight}
head.head_series.0.reg_module.6.weight
head.head_series.0.reg_module.7.{bias, weight}
head.head_series.0.self_attn.out_proj.{bias, weight}
head.head_series.0.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.1.bboxes_delta.{bias, weight}
head.head_series.1.class_logits.{bias, weight}
head.head_series.1.cls_module.0.weight
head.head_series.1.cls_module.1.{bias, weight}
head.head_series.1.inst_interact.dynamic_layer.{bias, weight}
head.head_series.1.inst_interact.norm1.{bias, weight}
head.head_series.1.inst_interact.norm2.{bias, weight}
head.head_series.1.inst_interact.norm3.{bias, weight}
head.head_series.1.inst_interact.out_layer.{bias, weight}
head.head_series.1.linear1.{bias, weight}
head.head_series.1.linear2.{bias, weight}
head.head_series.1.norm1.{bias, weight}
head.head_series.1.norm2.{bias, weight}
head.head_series.1.norm3.{bias, weight}
head.head_series.1.norm4.{bias, weight}
head.head_series.1.object_logit.{bias, running_mean, running_var, weight}
head.head_series.1.reg_module.0.weight
head.head_series.1.reg_module.1.{bias, weight}
head.head_series.1.reg_module.3.weight
head.head_series.1.reg_module.4.{bias, weight}
head.head_series.1.reg_module.6.weight
head.head_series.1.reg_module.7.{bias, weight}
head.head_series.1.self_attn.out_proj.{bias, weight}
head.head_series.1.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.2.bboxes_delta.{bias, weight}
head.head_series.2.class_logits.{bias, weight}
head.head_series.2.cls_module.0.weight
head.head_series.2.cls_module.1.{bias, weight}
head.head_series.2.inst_interact.dynamic_layer.{bias, weight}
head.head_series.2.inst_interact.norm1.{bias, weight}
head.head_series.2.inst_interact.norm2.{bias, weight}
head.head_series.2.inst_interact.norm3.{bias, weight}
head.head_series.2.inst_interact.out_layer.{bias, weight}
head.head_series.2.linear1.{bias, weight}
head.head_series.2.linear2.{bias, weight}
head.head_series.2.norm1.{bias, weight}
head.head_series.2.norm2.{bias, weight}
head.head_series.2.norm3.{bias, weight}
head.head_series.2.norm4.{bias, weight}
head.head_series.2.object_logit.{bias, running_mean, running_var, weight}
head.head_series.2.reg_module.0.weight
head.head_series.2.reg_module.1.{bias, weight}
head.head_series.2.reg_module.3.weight
head.head_series.2.reg_module.4.{bias, weight}
head.head_series.2.reg_module.6.weight
head.head_series.2.reg_module.7.{bias, weight}
head.head_series.2.self_attn.out_proj.{bias, weight}
head.head_series.2.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.3.bboxes_delta.{bias, weight}
head.head_series.3.class_logits.{bias, weight}
head.head_series.3.cls_module.0.weight
head.head_series.3.cls_module.1.{bias, weight}
head.head_series.3.inst_interact.dynamic_layer.{bias, weight}
head.head_series.3.inst_interact.norm1.{bias, weight}
head.head_series.3.inst_interact.norm2.{bias, weight}
head.head_series.3.inst_interact.norm3.{bias, weight}
head.head_series.3.inst_interact.out_layer.{bias, weight}
head.head_series.3.linear1.{bias, weight}
head.head_series.3.linear2.{bias, weight}
head.head_series.3.norm1.{bias, weight}
head.head_series.3.norm2.{bias, weight}
head.head_series.3.norm3.{bias, weight}
head.head_series.3.norm4.{bias, weight}
head.head_series.3.object_logit.{bias, running_mean, running_var, weight}
head.head_series.3.reg_module.0.weight
head.head_series.3.reg_module.1.{bias, weight}
head.head_series.3.reg_module.3.weight
head.head_series.3.reg_module.4.{bias, weight}
head.head_series.3.reg_module.6.weight
head.head_series.3.reg_module.7.{bias, weight}
head.head_series.3.self_attn.out_proj.{bias, weight}
head.head_series.3.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.4.bboxes_delta.{bias, weight}
head.head_series.4.class_logits.{bias, weight}
head.head_series.4.cls_module.0.weight
head.head_series.4.cls_module.1.{bias, weight}
head.head_series.4.inst_interact.dynamic_layer.{bias, weight}
head.head_series.4.inst_interact.norm1.{bias, weight}
head.head_series.4.inst_interact.norm2.{bias, weight}
head.head_series.4.inst_interact.norm3.{bias, weight}
head.head_series.4.inst_interact.out_layer.{bias, weight}
head.head_series.4.linear1.{bias, weight}
head.head_series.4.linear2.{bias, weight}
head.head_series.4.norm1.{bias, weight}
head.head_series.4.norm2.{bias, weight}
head.head_series.4.norm3.{bias, weight}
head.head_series.4.norm4.{bias, weight}
head.head_series.4.object_logit.{bias, running_mean, running_var, weight}
head.head_series.4.reg_module.0.weight
head.head_series.4.reg_module.1.{bias, weight}
head.head_series.4.reg_module.3.weight
head.head_series.4.reg_module.4.{bias, weight}
head.head_series.4.reg_module.6.weight
head.head_series.4.reg_module.7.{bias, weight}
head.head_series.4.self_attn.out_proj.{bias, weight}
head.head_series.4.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.5.bboxes_delta.{bias, weight}
head.head_series.5.class_logits.{bias, weight}
head.head_series.5.cls_module.0.weight
head.head_series.5.cls_module.1.{bias, weight}
head.head_series.5.inst_interact.dynamic_layer.{bias, weight}
head.head_series.5.inst_interact.norm1.{bias, weight}
head.head_series.5.inst_interact.norm2.{bias, weight}
head.head_series.5.inst_interact.norm3.{bias, weight}
head.head_series.5.inst_interact.out_layer.{bias, weight}
head.head_series.5.linear1.{bias, weight}
head.head_series.5.linear2.{bias, weight}
head.head_series.5.norm1.{bias, weight}
head.head_series.5.norm2.{bias, weight}
head.head_series.5.norm3.{bias, weight}
head.head_series.5.norm4.{bias, weight}
head.head_series.5.object_logit.{bias, running_mean, running_var, weight}
head.head_series.5.reg_module.0.weight
head.head_series.5.reg_module.1.{bias, weight}
head.head_series.5.reg_module.3.weight
head.head_series.5.reg_module.4.{bias, weight}
head.head_series.5.reg_module.6.weight
head.head_series.5.reg_module.7.{bias, weight}
head.head_series.5.self_attn.out_proj.{bias, weight}
head.head_series.5.self_attn.{in_proj_bias, in_proj_weight}
log_one_minus_alphas_cumprod
posterior_log_variance_clipped
posterior_mean_coef1
posterior_mean_coef2
posterior_variance
sqrt_alphas_cumprod
sqrt_one_minus_alphas_cumprod
sqrt_recip_alphas_cumprod
sqrt_recipm1_alphas_cumprod
WARNING [08/08 20:59:59 fvcore.common.checkpoint]: The checkpoint state_dict contains keys that are not used by the model:
  stem.fc.{bias, weight}
feifeiobama commented 1 month ago

I've checked my log and found the same warning, so I think the error is likely caused by something else. Meanwhile, please try deleting the current checkpoint folder and re-running the experiments for Task 1 and Task 2.

Melleaa commented 1 month ago

Thank you for your reply! To clarify: do you mean deleting the output folder and then retrying? I have already tried that, and it didn't work. Or do you mean deleting the last_checkpoint file, or editing its contents?

feifeiobama commented 1 month ago

I meant deleting the output folder and then retrying. If that doesn't work, maybe you can download a checkpoint from the provided link and modify the last_checkpoint file to specify that checkpoint for resuming.
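
If editing last_checkpoint is inconvenient, another possible workaround (a sketch, not part of this repo) is to strip the trainer/optimizer state from the checkpoint so that only the model weights are restored. The key names below ("trainer", "optimizer", "scheduler", "iteration") are assumptions based on the traceback above and may vary across detectron2 versions:

```python
import torch

def strip_trainer_state(src_path: str, dst_path: str) -> None:
    """Remove optimizer/scheduler state so only model weights are restored."""
    ckpt = torch.load(src_path, map_location="cpu")
    # Key names are assumptions; inspect ckpt.keys() for your checkpoint.
    for key in ("trainer", "optimizer", "scheduler", "iteration"):
        ckpt.pop(key, None)
    torch.save(ckpt, dst_path)

# Hypothetical usage with the path from the log above:
# strip_trainer_state("output/M-OWODB/model_0019999.pth",
#                     "output/M-OWODB/model_0019999_weights_only.pth")
```

Note that this discards the optimizer and learning-rate-schedule state, so training would effectively restart those from the configured initial values rather than truly resuming.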

Melleaa commented 1 month ago

OK! Thank you! I'll try.