w1oves / Rein

[CVPR 2024] Official implementation of "Stronger, Fewer, & Superior: Harnessing Vision Foundation Models for Domain Generalized Semantic Segmentation"
https://zxwei.site/rein
GNU General Public License v3.0

Checkpoints of various "head.pth" trained on frozen backbone of VFMs (Table 1) #13

Closed DevinCheung closed 7 months ago

DevinCheung commented 7 months ago

Hi~ Thanks for your great work! I am also curious about the performance on "Frozen backbone of VFMs" in Table 1. Could you please release these checkpoints? Thanks!

w1oves commented 7 months ago

Thank you for your interest! You can access the configuration, log, and head weights for the frozen DINOv2 backbone in this GitHub release. Additional weights will be made available as my schedule permits. Note that these experiments were conducted on an earlier version of the project, so the configuration may be less relevant; an updated implementation will be shared in due time. However, since the Mask2Former head is powered by the MMSegmentation library and DINOv2 is kept frozen, the provided weights should remain applicable.

DevinCheung commented 7 months ago

Thanks so much for your quick response!

DevinCheung commented 7 months ago

How do I correctly load the "iter_40000_published.pth" checkpoint? When I tried to load it directly using demo.ipynb, it reported the error below:

```
The model and loaded state dict do not match exactly

unexpected key in source state_dict: meta, decode_head.query_embed.weight, decode_head.query_feat.weight

missing keys in source state_dict: backbone.cls_token, backbone.pos_embed, backbone.mask_token, backbone.patch_embed.proj.weight, backbone.patch_embed.proj.bias, backbone.blocks.0.norm1.weight, backbone.blocks.0.norm1.bias, backbone.blocks.0.attn.qkv.weight, backbone.blocks.0.attn.qkv.bias, backbone.blocks.0.attn.proj.weight, backbone.blocks.0.attn.proj.bias, backbone.blocks.0.ls1.gamma, backbone.blocks.0.norm2.weight, backbone.blocks.0.norm2.bias, backbone.blocks.0.mlp.fc1.weight, backbone.blocks.0.mlp.fc1.bias, backbone.blocks.0.mlp.fc2.weight, backbone.blocks.0.mlp.fc2.bias, backbone.blocks.0.ls2.gamma, backbone.blocks.1.norm1.weight, backbone.blocks.1.norm1.bias, backbone.blocks.1.attn.qkv.weight, backbone.blocks.1.attn.qkv.bias, backbone.blocks.1.attn.proj.weight, ...
```

w1oves commented 7 months ago

This is because the checkpoint contains only the head weights and omits the pretrained backbone weights, which do not need to be stored. A solution will be updated tomorrow.
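For readers hitting the same message: the missing/unexpected keys above are the expected outcome of feeding a head-only state dict to a full model. A minimal sketch of the workaround, using plain dicts to stand in for PyTorch state dicts (all key names here are illustrative, not the project's actual ones): merge the separately stored backbone weights with the head weights before loading.

```python
# Sketch: the full model expects both backbone.* and decode_head.* keys,
# but the released checkpoint stores only the head. Merging the two
# partial state dicts yields one loadable dict.

def merge_state_dicts(backbone_sd, head_sd):
    """Combine backbone and head weights; refuse overlapping keys."""
    overlap = backbone_sd.keys() & head_sd.keys()
    if overlap:
        raise ValueError(f"conflicting keys: {sorted(overlap)}")
    return {**backbone_sd, **head_sd}

# Illustrative partial checkpoints (values would be tensors in practice).
backbone_sd = {"backbone.cls_token": 0, "backbone.pos_embed": 1}
head_sd = {"decode_head.query_embed.weight": 2}

full_sd = merge_state_dicts(backbone_sd, head_sd)
assert set(full_sd) == {"backbone.cls_token", "backbone.pos_embed",
                        "decode_head.query_embed.weight"}
```

In PyTorch terms this corresponds to loading both files with `torch.load` and passing the merged dict to `model.load_state_dict`; alternatively, `load_state_dict(..., strict=False)` tolerates the mismatch when the backbone is initialized separately, which is effectively what the project's `--backbone` option automates.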

w1oves commented 7 months ago

I've updated the code for easier evaluation. Now you can use this checkpoint with the following command:

```
python tools/test.py configs/dinov2/dinov2_mask2former_512x512_bs1x4.py /path/to/checkpoint --backbone checkpoints/dinov2_converted.pth
```

To generate the file "checkpoints/dinov2_converted.pth", please follow the instructions provided in the readme file. If you encounter any issues or need further assistance, feel free to ask!
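For context on what such a conversion step usually does: it re-keys (and sometimes reshapes) the officially released DINOv2 weights so they match the namespace the segmentor expects; the exact script is the one described in the README. A hypothetical sketch of the re-keying part only, where the prefix and key names are assumptions for illustration, not the project's actual ones:

```python
# Hypothetical sketch: rename raw ViT-style keys into the namespace a
# wrapped segmentation model expects. Real conversion scripts may also
# resize position embeddings or drop unused heads.

def rekey(state_dict, prefix="backbone."):
    """Prepend a namespace prefix so keys match the wrapped model."""
    return {prefix + k: v for k, v in state_dict.items()}

# Illustrative raw DINOv2-style keys (values would be tensors).
raw = {"cls_token": 0, "blocks.0.norm1.weight": 1}
converted = rekey(raw)
assert set(converted) == {"backbone.cls_token",
                          "backbone.blocks.0.norm1.weight"}
```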

DevinCheung commented 7 months ago

Thanks!

DevinCheung commented 7 months ago

Hi~ How can I use the full checkpoint after training to run inference on an unlabeled dataset, e.g., the ACDC test set? Thanks!

In my understanding, I need to use demo.ipynb?

w1oves commented 7 months ago

You can use test.py directly.


w1oves commented 7 months ago

Like this:

```
python tools/test.py configs/dinov2_citys2acdc/rein_dinov2_mask2former_1024x1024_bs4x2.py /path/to/checkpoint --backbone checkpoints/dinov2_converted_1024x1024.pth
```