MIC-DKFZ / nnUNet

Apache License 2.0
5.86k stars · 1.75k forks

Ensembling across different trainers for a dataset. #2493

Open aymuos15 opened 1 month ago

aymuos15 commented 1 month ago

Hello!

If I have two trainers (say, with two different losses), how would I go about ensembling them for inference?

Basically: ensemble all the folds for each trainer (which already happens by default), and then ensemble across the different trainers?

(Not sure if this is already covered elsewhere)

mrokuss commented 1 month ago

Hi @aymuos15

Tricky question, you could do several things. Usually I would also go with the most straightforward way: ensemble the folds first and then the trainers. If the underlying model is exactly the same (so just different weights), you could alternatively treat the checkpoints as additional folds of a single model during inference. This may be faster than running two separate predictors, in case you are time-constrained. Either way, you would have to verify what works best, e.g. on an external validation set.

Best,

Max
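For reference, the fold-first, then cross-trainer workflow described above can be sketched with nnU-Net v2's command-line tools. This is a sketch, not an official recipe: the input/output folder names, dataset ID, and trainer class names below are placeholders you would replace with your own.

```shell
# Sketch: ensemble folds per trainer, then ensemble across trainers.
# INPUT_FOLDER, DATASET_ID, and the trainer names are placeholders.

# 1) Predict with each trainer. Omitting -f uses all trained folds, so each
#    call already ensembles across folds. --save_probabilities keeps the
#    softmax outputs needed for the cross-trainer ensembling step.
nnUNetv2_predict -i INPUT_FOLDER -o preds_trainerA \
    -d DATASET_ID -c 3d_fullres -tr nnUNetTrainerA --save_probabilities
nnUNetv2_predict -i INPUT_FOLDER -o preds_trainerB \
    -d DATASET_ID -c 3d_fullres -tr nnUNetTrainerB --save_probabilities

# 2) Average the saved probabilities of the two prediction folders.
nnUNetv2_ensemble -i preds_trainerA preds_trainerB -o preds_ensemble
```

The key detail is `--save_probabilities`: `nnUNetv2_ensemble` averages the stored softmax maps, so it cannot work from the exported segmentations alone.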

aymuos15 commented 1 month ago

Hi!

Thank you very much for the suggestions!

I was more hoping there was a specific command to do this easily within the framework?

How would I go about doing it otherwise? Would I have to write a separate inference script that loads the models?

fitzjalen commented 1 month ago


Would it also work if I have models trained on the same dataset but with different trainers and plans?