clemsgrs / hipt

Re-implementation of HIPT

About vit4k_xs_dino.pth #13

Open AlexNmSED opened 11 months ago

AlexNmSED commented 11 months ago

Is the weight file vit4k_xs_dino.pth available? When I use the original HIPT weights for heatmap visualization, I get the error "77 weight(s) loaded succesfully ; 1 weight(s) not loaded because of mismatching shapes". Thank you for your help!
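For context, the "N weight(s) loaded ; M weight(s) not loaded" message comes from a tolerant loading routine that keeps any checkpoint tensor whose shape matches the model and skips the rest. A minimal sketch of that pattern (function name and details are illustrative, not the repo's exact code); printing the skipped keys is the quickest way to see which tensor is affected:

```python
import torch

def load_matching_weights(model, ckpt_path, ckpt_key="teacher"):
    # Load the DINO checkpoint and select the requested sub-dict (e.g. "teacher").
    state_dict = torch.load(ckpt_path, map_location="cpu")
    if ckpt_key in state_dict:
        state_dict = state_dict[ckpt_key]
    # Strip prefixes left over from DDP / backbone wrappers.
    state_dict = {k.replace("module.", "").replace("backbone.", ""): v
                  for k, v in state_dict.items()}
    model_dict = model.state_dict()
    loaded, skipped = [], []
    for name, tensor in state_dict.items():
        if name in model_dict and model_dict[name].shape == tensor.shape:
            model_dict[name] = tensor
            loaded.append(name)
        elif name in model_dict:
            skipped.append(name)  # shape mismatch: model keeps its initialized value
    model.load_state_dict(model_dict)
    print(f"{len(loaded)} weight(s) loaded successfully ; "
          f"{len(skipped)} weight(s) not loaded because of mismatching shapes")
    return skipped
```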

clemsgrs commented 10 months ago

Hi, it seems you're loading all the weights fine except one. Which region size / patch size are you using?

AlexNmSED commented 10 months ago

Hi, thank you for your support. I use 4096 as the region size. The settings are the same as for HIPT.
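With a 4096 px region, the geometry is easy to sanity-check: the region-level ViT sees one token per 256 px patch, so the positional-embedding length is fixed by region_size / patch_size. A quick back-of-the-envelope check (the 192-dim embedding assumes the ViT4K-XS variant; treat it as an assumption):

```python
# region-level token grid and the pos_embed shape it implies
region_size = 4096
token_size = 256                    # each 256 x 256 patch becomes one token
grid = region_size // token_size    # 16 tokens per side
num_tokens = grid * grid + 1        # 256 patch tokens + 1 [CLS] token
embed_dim = 192                     # assumed ViT4K-XS embedding dimension
print("expected pos_embed shape:", (1, num_tokens, embed_dim))  # (1, 257, 192)
```

If the config requests a different region or patch size than the checkpoint was trained with, pos_embed is exactly the tensor that would mismatch.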

clemsgrs commented 9 months ago

could you print the full error trace?

AlexNmSED commented 9 months ago

Here is the full error trace:

Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.


[2023-12-22 09:36:09,276][torch.distributed.distributed_c10d][INFO] - Added key: store_based_barrier_key:1 to store for rank: 1
[2023-12-22 09:36:09,277][torch.distributed.distributed_c10d][INFO] - Added key: store_based_barrier_key:1 to store for rank: 0
[2023-12-22 09:36:09,278][torch.distributed.distributed_c10d][INFO] - Rank 0: Completed store-based barrier for key:store_based_barrier_key:1 with 2 nodes.
Distributed session successfully initialized
torch.cuda.device_count(): 8
[2023-12-22 09:36:09,287][torch.distributed.distributed_c10d][INFO] - Rank 1: Completed store-based barrier for key:store_based_barrier_key:1 with 2 nodes.
Loading pretrained weights for patch-level Transformer...
Loading pretrained weights for patch-level Transformer...
Take key teacher in provided checkpoint dict
Take key teacher in provided checkpoint dict
Pretrained weights found at /n/archive00/labs/IT/GDC/xihao/hipt-master/checkpoints/vit_256_small_dino.pth
150 weight(s) loaded succesfully ; 0 weight(s) not loaded because of mismatching shapes
Pretrained weights found at /n/archive00/labs/IT/GDC/xihao/hipt-master/checkpoints/vit_256_small_dino.pth
150 weight(s) loaded succesfully ; 0 weight(s) not loaded because of mismatching shapes
Loading pretrained weights for region-level Transformer...
Loading pretrained weights for region-level Transformer...
Take key teacher in provided checkpoint dict
Take key teacher in provided checkpoint dict
Pretrained weights found at /n/archive00/labs/IT/GDC/xihao/hipt-master/checkpoints/vit4k_xs_dino.pth
77 weight(s) loaded succesfully ; 1 weight(s) not loaded because of mismatching shapes
Pretrained weights found at /n/archive00/labs/IT/GDC/xihao/hipt-master/checkpoints/vit4k_xs_dino.pth
77 weight(s) loaded succesfully ; 1 weight(s) not loaded because of mismatching shapes
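The trace shows the patch-level ViT loads cleanly (150/150) and only the region-level ViT drops a single tensor. One way to pinpoint it is to diff the checkpoint's tensor shapes against a freshly built model; a hedged sketch (build_vit4k is a placeholder for however the region-level Transformer is constructed in this repo):

```python
import torch

ckpt = torch.load("checkpoints/vit4k_xs_dino.pth", map_location="cpu")["teacher"]
ckpt = {k.replace("module.", "").replace("backbone.", ""): v for k, v in ckpt.items()}

model = build_vit4k()  # hypothetical constructor for the region-level ViT
for name, param in model.state_dict().items():
    if name in ckpt and ckpt[name].shape != param.shape:
        print(f"{name}: checkpoint {tuple(ckpt[name].shape)} "
              f"vs model {tuple(param.shape)}")
```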

clemsgrs commented 9 months ago

My guess is that the positional embedding layer is not properly loaded. It's hard for me to know why; could you paste the yaml config file you're using?
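If the culprit is indeed pos_embed, the standard ViT/DINO remedy is to bicubically interpolate the checkpoint's patch-token grid to the model's token count before loading, rather than dropping the tensor. A minimal sketch of that resize (not the repo's own code):

```python
import torch
import torch.nn.functional as F

def resize_pos_embed(pos_embed, new_grid):
    """Interpolate a ViT pos embedding [1, 1 + g*g, dim] to a new grid size."""
    cls_tok, patch_tok = pos_embed[:, :1], pos_embed[:, 1:]
    dim = pos_embed.shape[-1]
    g = int(patch_tok.shape[1] ** 0.5)          # original grid side
    patch_tok = patch_tok.reshape(1, g, g, dim).permute(0, 3, 1, 2)
    patch_tok = F.interpolate(patch_tok, size=(new_grid, new_grid),
                              mode="bicubic", align_corners=False)
    patch_tok = patch_tok.permute(0, 2, 3, 1).reshape(1, new_grid * new_grid, dim)
    return torch.cat([cls_tok, patch_tok], dim=1)
```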

AlexNmSED commented 9 months ago

Thank you for your help.

I use the default config file, and the weight file comes from the original HIPT repository.
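Since the yaml config is the remaining unknown, a quick way to rule it out is to dump the resolved config and confirm the region / patch sizes match the checkpoint's training geometry. A hypothetical check (the config path and field names are guesses, not this repo's exact schema):

```python
from omegaconf import OmegaConf

cfg = OmegaConf.load("config/default.yaml")   # path is a guess
print(OmegaConf.to_yaml(cfg))                 # inspect region_size / patch_size
```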