Project-MONAI / research-contributions

Implementations of recent research prototypes/demonstrations using MONAI.
https://monai.io/
Apache License 2.0

Swin UNETR pretrained model weights have keys not compatible with load_from in fine-tuning task #76

Open upupming opened 2 years ago

upupming commented 2 years ago

Describe the bug

Swin UNETR pre-trained model weights have keys that are not compatible with load_from in the fine-tuning task.

To Reproduce

Steps to reproduce the behavior:

  1. Pre-train the Swin UNETR model.
  2. Call load_from using the pre-trained model weights.
  3. An error is raised because the model weight keys are not compatible.

These are the saved pre-trained keys; they start with swin_vit:

[screenshot: checkpoint keys prefixed with swin_vit]

But the load_from method requires weight keys to start with module: https://github.com/Project-MONAI/MONAI/blob/edf3b742a4ae85d1f30462ed0c7511c520fae888/monai/networks/nets/swin_unetr.py#L232

We can write an extra conversion script to rename the pre-trained model's keys into the desired format, but it would be more convenient if the pre-trained model were saved in that format during pre-training.
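A conversion script along these lines could bridge the gap. This is only a sketch, not a utility shipped with MONAI: the function name `convert_pretrained_keys` and the assumption that only the `swin_vit.` prefix needs renaming to `module.` are mine, based on the key mismatch described above.

```python
def convert_pretrained_keys(state_dict):
    """Rename 'swin_vit.'-prefixed keys to the 'module.' prefix
    that SwinUNETR.load_from expects (hypothetical helper;
    verify the prefixes against your own checkpoint)."""
    converted = {}
    for key, value in state_dict.items():
        if key.startswith("swin_vit."):
            key = "module." + key[len("swin_vit."):]
        converted[key] = value
    return converted

# Typical usage with a PyTorch checkpoint (paths are placeholders):
# import torch
# ckpt = torch.load("pretrained.pt", map_location="cpu")
# sd = ckpt.get("state_dict", ckpt)
# torch.save({"state_dict": convert_pretrained_keys(sd)}, "converted.pt")
```

Keys without the `swin_vit.` prefix are passed through unchanged, which may or may not match what load_from expects for your checkpoint.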

ahatamiz commented 2 years ago

Please make sure to use the exact version of monai as described in the dependencies. Pre-trained weights are compatible if the right monai version is used.

ahatamiz commented 2 years ago

The pre-training utilizes DDP, hence module is added to the weight names. If you use a single GPU for pre-training, which is not recommended, then the weight names need to be adjusted accordingly by the user.
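For a single-GPU checkpoint, the adjustment described above amounts to prepending the "module." prefix that DDP would have added. A minimal sketch, assuming the checkpoint's state dict has plain (unprefixed) key names; the helper name is hypothetical:

```python
def add_module_prefix(state_dict):
    """Prepend the 'module.' prefix that DDP wrapping adds,
    so a single-GPU checkpoint matches what load_from expects.
    Already-prefixed keys are left untouched."""
    return {
        (key if key.startswith("module.") else "module." + key): value
        for key, value in state_dict.items()
    }
```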