Open · wathoresanket opened 11 months ago
Hi,
Can you explain more about the pretrained weight files and what changes we need to make accordingly? I could only find the weights for the Swin Transformer and can't find them for the others. Even so, after putting the files in a checkpoints folder, I get errors while running the train_diffdet.py file. Please help.
Thanks
@xyzlancehe, please help.
Can you elaborate on the errors? Without a proper error message it is impossible to help.
You can download the other weights from here: https://drive.google.com/drive/folders/1qD5m1NmK0kjE5hh-G17XUX751WsEG-h_
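Once downloaded, a quick sanity check that a .pth file loads at all before you start training (the path below is just an example; substitute whichever checkpoint you downloaded):

import torch

# Example path; substitute the file you actually downloaded
ckpt = torch.load("checkpoints/dino_pretrained_checkpoint0033_4scale.pth",
                  map_location="cpu")

# Checkpoints are typically dicts; print the top-level keys to see the layout
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))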
Sorry for not updating the status. I've downloaded all the pretrained weights (DiffusionDet, DINO, YOLO) and I'm currently training the models. It's taking time; I'll post here if any issue arises.
I trained the model successfully, but while running predict.py I'm getting the following error:
UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:3190.)
return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
Some model parameters or buffers are not found in the checkpoint:
alphas_cumprod
alphas_cumprod_prev
backbone.bottom_up.norm0.{bias, weight}
backbone.bottom_up.norm1.{bias, weight}
backbone.bottom_up.norm2.{bias, weight}
backbone.bottom_up.norm3.{bias, weight}
backbone.fpn_lateral2.{bias, weight}
backbone.fpn_lateral3.{bias, weight}
backbone.fpn_lateral4.{bias, weight}
backbone.fpn_lateral5.{bias, weight}
backbone.fpn_output2.{bias, weight}
backbone.fpn_output3.{bias, weight}
backbone.fpn_output4.{bias, weight}
backbone.fpn_output5.{bias, weight}
betas
head.head_series.0.bboxes_delta.{bias, weight}
head.head_series.0.block_time_mlp.1.{bias, weight}
head.head_series.0.class_logits.{bias, weight}
head.head_series.0.cls_module.0.weight
head.head_series.0.cls_module.1.{bias, weight}
head.head_series.0.inst_interact.dynamic_layer.{bias, weight}
head.head_series.0.inst_interact.norm1.{bias, weight}
head.head_series.0.inst_interact.norm2.{bias, weight}
head.head_series.0.inst_interact.norm3.{bias, weight}
head.head_series.0.inst_interact.out_layer.{bias, weight}
head.head_series.0.linear1.{bias, weight}
head.head_series.0.linear2.{bias, weight}
head.head_series.0.norm1.{bias, weight}
head.head_series.0.norm2.{bias, weight}
head.head_series.0.norm3.{bias, weight}
head.head_series.0.reg_module.0.weight
head.head_series.0.reg_module.1.{bias, weight}
head.head_series.0.reg_module.3.weight
head.head_series.0.reg_module.4.{bias, weight}
head.head_series.0.reg_module.6.weight
head.head_series.0.reg_module.7.{bias, weight}
head.head_series.0.self_attn.out_proj.{bias, weight}
head.head_series.0.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.1.bboxes_delta.{bias, weight}
head.head_series.1.block_time_mlp.1.{bias, weight}
head.head_series.1.class_logits.{bias, weight}
head.head_series.1.cls_module.0.weight
head.head_series.1.cls_module.1.{bias, weight}
head.head_series.1.inst_interact.dynamic_layer.{bias, weight}
head.head_series.1.inst_interact.norm1.{bias, weight}
head.head_series.1.inst_interact.norm2.{bias, weight}
head.head_series.1.inst_interact.norm3.{bias, weight}
head.head_series.1.inst_interact.out_layer.{bias, weight}
head.head_series.1.linear1.{bias, weight}
head.head_series.1.linear2.{bias, weight}
head.head_series.1.norm1.{bias, weight}
head.head_series.1.norm2.{bias, weight}
head.head_series.1.norm3.{bias, weight}
head.head_series.1.reg_module.0.weight
head.head_series.1.reg_module.1.{bias, weight}
head.head_series.1.reg_module.3.weight
head.head_series.1.reg_module.4.{bias, weight}
head.head_series.1.reg_module.6.weight
head.head_series.1.reg_module.7.{bias, weight}
head.head_series.1.self_attn.out_proj.{bias, weight}
head.head_series.1.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.2.bboxes_delta.{bias, weight}
head.head_series.2.block_time_mlp.1.{bias, weight}
head.head_series.2.class_logits.{bias, weight}
head.head_series.2.cls_module.0.weight
head.head_series.2.cls_module.1.{bias, weight}
head.head_series.2.inst_interact.dynamic_layer.{bias, weight}
head.head_series.2.inst_interact.norm1.{bias, weight}
head.head_series.2.inst_interact.norm2.{bias, weight}
head.head_series.2.inst_interact.norm3.{bias, weight}
head.head_series.2.inst_interact.out_layer.{bias, weight}
head.head_series.2.linear1.{bias, weight}
head.head_series.2.linear2.{bias, weight}
head.head_series.2.norm1.{bias, weight}
head.head_series.2.norm2.{bias, weight}
head.head_series.2.norm3.{bias, weight}
head.head_series.2.reg_module.0.weight
head.head_series.2.reg_module.1.{bias, weight}
head.head_series.2.reg_module.3.weight
head.head_series.2.reg_module.4.{bias, weight}
head.head_series.2.reg_module.6.weight
head.head_series.2.reg_module.7.{bias, weight}
head.head_series.2.self_attn.out_proj.{bias, weight}
head.head_series.2.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.3.bboxes_delta.{bias, weight}
head.head_series.3.block_time_mlp.1.{bias, weight}
head.head_series.3.class_logits.{bias, weight}
head.head_series.3.cls_module.0.weight
head.head_series.3.cls_module.1.{bias, weight}
head.head_series.3.inst_interact.dynamic_layer.{bias, weight}
head.head_series.3.inst_interact.norm1.{bias, weight}
head.head_series.3.inst_interact.norm2.{bias, weight}
head.head_series.3.inst_interact.norm3.{bias, weight}
head.head_series.3.inst_interact.out_layer.{bias, weight}
head.head_series.3.linear1.{bias, weight}
head.head_series.3.linear2.{bias, weight}
head.head_series.3.norm1.{bias, weight}
head.head_series.3.norm2.{bias, weight}
head.head_series.3.norm3.{bias, weight}
head.head_series.3.reg_module.0.weight
head.head_series.3.reg_module.1.{bias, weight}
head.head_series.3.reg_module.3.weight
head.head_series.3.reg_module.4.{bias, weight}
head.head_series.3.reg_module.6.weight
head.head_series.3.reg_module.7.{bias, weight}
head.head_series.3.self_attn.out_proj.{bias, weight}
head.head_series.3.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.4.bboxes_delta.{bias, weight}
head.head_series.4.block_time_mlp.1.{bias, weight}
head.head_series.4.class_logits.{bias, weight}
head.head_series.4.cls_module.0.weight
head.head_series.4.cls_module.1.{bias, weight}
head.head_series.4.inst_interact.dynamic_layer.{bias, weight}
head.head_series.4.inst_interact.norm1.{bias, weight}
head.head_series.4.inst_interact.norm2.{bias, weight}
head.head_series.4.inst_interact.norm3.{bias, weight}
head.head_series.4.inst_interact.out_layer.{bias, weight}
head.head_series.4.linear1.{bias, weight}
head.head_series.4.linear2.{bias, weight}
head.head_series.4.norm1.{bias, weight}
head.head_series.4.norm2.{bias, weight}
head.head_series.4.norm3.{bias, weight}
head.head_series.4.reg_module.0.weight
head.head_series.4.reg_module.1.{bias, weight}
head.head_series.4.reg_module.3.weight
head.head_series.4.reg_module.4.{bias, weight}
head.head_series.4.reg_module.6.weight
head.head_series.4.reg_module.7.{bias, weight}
head.head_series.4.self_attn.out_proj.{bias, weight}
head.head_series.4.self_attn.{in_proj_bias, in_proj_weight}
head.head_series.5.bboxes_delta.{bias, weight}
head.head_series.5.block_time_mlp.1.{bias, weight}
head.head_series.5.class_logits.{bias, weight}
head.head_series.5.cls_module.0.weight
head.head_series.5.cls_module.1.{bias, weight}
head.head_series.5.inst_interact.dynamic_layer.{bias, weight}
head.head_series.5.inst_interact.norm1.{bias, weight}
head.head_series.5.inst_interact.norm2.{bias, weight}
head.head_series.5.inst_interact.norm3.{bias, weight}
head.head_series.5.inst_interact.out_layer.{bias, weight}
head.head_series.5.linear1.{bias, weight}
head.head_series.5.linear2.{bias, weight}
head.head_series.5.norm1.{bias, weight}
head.head_series.5.norm2.{bias, weight}
head.head_series.5.norm3.{bias, weight}
head.head_series.5.reg_module.0.weight
head.head_series.5.reg_module.1.{bias, weight}
head.head_series.5.reg_module.3.weight
head.head_series.5.reg_module.4.{bias, weight}
head.head_series.5.reg_module.6.weight
head.head_series.5.reg_module.7.{bias, weight}
head.head_series.5.self_attn.out_proj.{bias, weight}
head.head_series.5.self_attn.{in_proj_bias, in_proj_weight}
head.time_mlp.1.{bias, weight}
head.time_mlp.3.{bias, weight}
log_one_minus_alphas_cumprod
posterior_log_variance_clipped
posterior_mean_coef1
posterior_mean_coef2
posterior_variance
sqrt_alphas_cumprod
sqrt_one_minus_alphas_cumprod
sqrt_recip_alphas_cumprod
sqrt_recipm1_alphas_cumprod
The checkpoint state_dict contains keys that are not used by the model:
layers.0.blocks.1.attn_mask
layers.1.blocks.1.attn_mask
layers.2.blocks.1.attn_mask
layers.2.blocks.11.attn_mask
layers.2.blocks.13.attn_mask
layers.2.blocks.15.attn_mask
layers.2.blocks.17.attn_mask
layers.2.blocks.3.attn_mask
layers.2.blocks.5.attn_mask
layers.2.blocks.7.attn_mask
layers.2.blocks.9.attn_mask
norm.{bias, weight}
head.{bias, weight}
use_checkpoint!!!!!!!!!!!!!!!!!!!!!!!!
Traceback (most recent call last):
File "/mnt/DATA/EE20B041/Desktop/Dentex_SegAndDet/predict.py", line 552, in
Hello,
Thank you for your work on this project. Would it be possible to restore access to these pretrained model weights or provide updated links? Having access to pretrained weights would really help me hit the ground running with the project.
checkpoints/dino_pretrained_checkpoint0033_4scale.pth
checkpoints/dino_pretrained_checkpoint0029_4scale_swin.pth
@xyzlancehe @vgthengane
The problem is this: the code uses a random train/validation split. If someone shares their model weights but you use your own split, the evaluation results get distorted. You can run the listed commands and you'll get your own trained models.
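One way to avoid that distortion is to pin the split with a fixed seed so everyone gets the same partition. A minimal sketch in plain Python, assuming you control the split code (the function name, seed, and ID range below are illustrative, not the repo's actual code):

import random

def split_ids(ids, val_fraction=0.2, seed=42):
    # Deterministic shuffle: the same seed always yields the same partition,
    # so results computed with shared weights stay comparable
    rng = random.Random(seed)
    ids = sorted(ids)  # start from a canonical order
    rng.shuffle(ids)
    n_val = int(len(ids) * val_fraction)
    return ids[n_val:], ids[:n_val]  # train, val

train_ids, val_ids = split_ids(range(100))

With the split fixed this way, evaluating shared weights against your local validation set becomes meaningful again.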