MIC-DKFZ / nnUNet

How to change num_epoch? #1384

Closed: pongopal closed this issue 1 year ago

pongopal commented 1 year ago

I tried to change self.num_epoch = 100 inside nnUNetTrainer.py. However, training still exceeds 100 epochs and will probably run to the default of 1000. How do I change the number of epochs? Thanks

rongzhao-zhang commented 1 year ago

The trainer you are using may not be nnUNetTrainer itself. The default is nnUNetTrainerV2, which inherits from nnUNetTrainer, and the epoch attribute is overwritten by the child class; check this. You should change the epoch count in the trainer class you actually use, rather than in its superclass.
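For nnU-Net v1, a minimal sketch of that advice could look like this (assumptions: the v1 module path nnunet.training.network_training.nnUNetTrainerV2 and the v1 attribute name max_num_epochs; verify both against your installed version):

```python
# Sketch for nnU-Net v1 only (later comments in this thread address v2).
from nnunet.training.network_training.nnUNetTrainerV2 import nnUNetTrainerV2

class nnUNetTrainerV2_100epochs(nnUNetTrainerV2):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Override AFTER super().__init__, otherwise the parent resets it to 1000.
        self.max_num_epochs = 100
```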

pongopal commented 1 year ago

However, nnunetv2 does not have network_training/nnUNetTrainerV2.

pongopal commented 1 year ago

Can you show a sample of where I have to change the epochs?

LouisDo2108 commented 1 year ago

I am using the latest nnunetv2 release and modified self.num_epochs = 300 inside nnUNet/nnunetv2/training/nnUNetTrainer/nnUNetTrainer.py. The debug.json in the nnUNet_results folder changes to "num_epochs": "300" accordingly.

pongopal commented 1 year ago

Hi Louis, unfortunately it didn't work for me. I did the same as you described. Did you change anything else in the program?

wjcheon commented 1 year ago

In installation_path\nnunetv2\training\nnUNetTrainer\nnUNetTrainer.py, at line 143:

self.num_epochs = 1000  ->  self.num_epochs = 300

FabianIsensee commented 1 year ago

DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py

tzebre commented 1 year ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py

Hello, thanks for the help. I'm not sure where to call this class. Could you give more detail on how to change the number of epochs, please?

LouisDo2108 commented 1 year ago

Hello @tzebre,

You can simply look around in that file to see if any trainer meets your requirements, or you can create your own trainer class that inherits from nnUNetTrainer. When you train nnUNet, just pass your trainer name as the argument.

In your case, I believe this trainer is what you are looking for. You just need to pass in the trainer name "nnUNetTrainer_5epochs" to train nnUNet with 5 epochs.

```python
class nnUNetTrainer_5epochs(nnUNetTrainer):
    def __init__(self, plans: dict, configuration: str, fold: int, dataset_json: dict, unpack_dataset: bool = True,
                 device: torch.device = torch.device('cuda')):
        """used for debugging plans etc"""
        super().__init__(plans, configuration, fold, dataset_json, unpack_dataset, device)
        self.num_epochs = 5
```
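To use it, pass the trainer name via -tr when launching training; DATASET_ID, CONFIGURATION and FOLD below are placeholders for your own values:

```
nnUNetv2_train DATASET_ID CONFIGURATION FOLD -tr nnUNetTrainer_5epochs
```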
chenslcool commented 12 months ago

Maybe configuring epochs/lr through command-line arguments would be better...

ancestor-mithril commented 12 months ago

It would be a good idea to create a dynamic trainer, with learning rate and epochs passed as parameters. That trainer would save its training results in ${nnUNet_results}/${DATASET}/${trainer_name}_${train_epochs}_${initial_lr}_.... This would maintain the old behavior of nnUNet and also allow more flexible experiments, without needing to change code for each new learning rate.
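A minimal sketch of what such a dynamic trainer could look like (hypothetical, not part of nnU-Net; it reuses the nnUNetTrainer constructor signature from the example above, and make_trainer_class is an invented helper name):

```python
# Hypothetical sketch of the proposed dynamic trainer; not part of nnU-Net.
import torch
from nnunetv2.training.nnUNetTrainer.nnUNetTrainer import nnUNetTrainer

def make_trainer_class(num_epochs: int, initial_lr: float):
    class _DynamicTrainer(nnUNetTrainer):
        def __init__(self, plans: dict, configuration: str, fold: int, dataset_json: dict,
                     unpack_dataset: bool = True, device: torch.device = torch.device('cuda')):
            super().__init__(plans, configuration, fold, dataset_json, unpack_dataset, device)
            self.num_epochs = num_epochs
            self.initial_lr = initial_lr

    # nnU-Net names the results folder after the trainer class, so encoding the
    # hyperparameters in the class name keeps each experiment in its own folder.
    _DynamicTrainer.__name__ = f"nnUNetTrainer_{num_epochs}ep_lr{initial_lr}"
    return _DynamicTrainer
```

Note that nnUNetv2_train looks trainers up by class name, so the generated class would still have to be importable under that name; this is a sketch of the idea, not a drop-in feature.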

toufiqmusah commented 5 months ago

Simply use one of the bundled trainer variants, e.g. (with CONFIGURATION being 2d, 3d_fullres, etc.):

```
nnUNetv2_train DATASET_ID CONFIGURATION all -p nnUNetPlannerResEnc -tr nnUNetTrainer_100epochs
```

The epoch count can be 1, 5, 10, 25, 50, 100 (from here).

warmbasket commented 5 months ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py

Poor implementation

itachi1232gg commented 4 weeks ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py
>
> Poor implementation

This is a stupid implementation; why not add an epoch argument to the command?

toufiqmusah commented 4 weeks ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py
>
> Poor implementation
>
> This is a stupid implementation; why not add an epoch argument to the command?

That's not a fair comment to make. You can be critical without being abrasive.

Also, the epoch counts are fixed so as not to disrupt the annealed learning-rate schedule. You can always create your own trainer class to work around this if that doesn't fit your requirements, but there is no need to be abrasive.
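For context: nnU-Net v2 decays the learning rate polynomially over the planned number of epochs (a near-linear curve), so the epoch count is part of the schedule itself. A sketch of that decay, assuming the standard exponent of 0.9:

```python
def poly_lr(initial_lr: float, epoch: int, max_epochs: int, exponent: float = 0.9) -> float:
    # Polynomial decay: changing max_epochs reshapes the whole schedule,
    # which is why each trainer variant fixes its epoch count up front.
    return initial_lr * (1 - epoch / max_epochs) ** exponent
```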

warmbasket commented 4 weeks ago

It's a completely fair comment to make. You can accept criticism and reality without curling into a little wuss and calling criticism abrasive; grow up. The set epoch numbers do not disrupt the annealing learning rates; if you have an annealing learning rate tied to a fixed number of epochs, that's dumb. The training likely and evidently concludes prior to the set number of epochs; you're just wasting cycles with how this is set up. There's zero common sense to it. Additionally, nnU-Net is designed to be command-promptable (for no reason; it takes away from any education any user actually receives), so for you to then require manual intervention from people who haven't been told how to do anything but use the command prompt is silly. Pick one or the other. Your comment only seeks to defend poor practices for the sake of your emotions, which no one cares about, and that is not the purpose of this project.

warmbasket commented 4 weeks ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py
>
> Poor implementation
>
> This is a stupid implementation; why not add an epoch argument to the command?

100%. It's complete nonsense to call this an automated pipeline for training and to attempt to reap the benefits of automated configuration when really it's misconfigured and hardly automated. The only new thing they did was create a strong list of rule-based parameters, and then shove these into a ridiculous pre- and misconfigured training pipeline rather than just give users a method for choosing parameters for their model and training as a normal person would.

toufiqmusah commented 4 weeks ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py
>
> Poor implementation
>
> This is a stupid implementation; why not add an epoch argument to the command?
>
> 100%. It's complete nonsense to call this an automated pipeline for training and to attempt to reap the benefits of automated configuration when really it's misconfigured and hardly automated. The only new thing they did was create a strong list of rule-based parameters, and then shove these into a ridiculous pre- and misconfigured training pipeline rather than just give users a method for choosing parameters for their model and training as a normal person would.

A lot of entitlement coming from someone with barely 6 public 'contributions' in the last year.

If you have that much of a problem with the framework (which was primarily made as an out-of-the-box segmentation tool, mind you), please go ahead and come up with your very own.

Thanks.

warmbasket commented 4 weeks ago

A lot of cluelessness coming from someone who is clearly an emotional child. Expected response.

Thank you, I've got one.

Peace, whiner.

warmbasket commented 4 weeks ago

If you're foolish enough to think 'contributions' on GitHub are life, and that this is my main account... sheesh

warmbasket commented 4 weeks ago

> DO NOT overwrite the standard trainer! Please create a new class and set the epochs there. See this: https://github.com/MIC-DKFZ/nnUNet/blob/master/nnunetv2/training/nnUNetTrainer/variants/training_length/nnUNetTrainer_Xepochs.py
>
> Poor implementation
>
> This is a stupid implementation; why not add an epoch argument to the command?
>
> The only new thing they did was create a strong list of rule-based parameters, and then shove these into a ridiculous pre- and misconfigured training pipeline rather than just give users a method for choosing parameters for their model and training as a normal person would.
>
> If you have that much of a problem with the framework (which was primarily made as an out-of-the-box segmentation tool, mind you), please go ahead and come up with your very own.

I just said what it was; why are you repeating it and responding with more nonsense?

FabianIsensee commented 3 weeks ago

Listen man. If you don't like it then don't use it. And please temper your expectations. nnU-Net is not a product. You are not paying any money for it. There are no people dedicated to making it more easily usable (other than me, and I have other things to do). This repo is 100x better maintained than other research code out there. We do our best to help people use it. Show some gratitude, and voice criticism in a constructive manner. Thanks!

warmbasket commented 3 weeks ago

Oh, hey, you're here. We don't use it. The only people that use it are people that don't know what they're doing. That's the point.

nnU-Net is a product; it is a tool. Literally, what is it if not?

Yes, we know there are no people dedicated to making it more easily usable; clearly, this thread is a fantastic example. It's a shame it wasn't made properly usable in the first place. If you're so butthurt about what people are saying about what you made, shut up and get to work. How can you say this repo is 100x better maintained than other research code out there when you just said you don't maintain it, and it clearly isn't?

It's knowingly faulty, and other repositories are not, lmfao; their base code is functionally useful, and you don't get people straining just to adjust the number of epochs. No real researcher uses or can use nnU-Net for anything new, nor do you teach anyone anything with this product.

Temper what expectations? Expectations of not getting bullshit after you published in Nature? Expectations of non-disingenuous production that doesn't abstract away learning and restrict use? Expectations of actually making your research generally functional, usable, and useful so that people can progress?

Criticism has been voiced in a constructive manner; it is a shame you're so lazy and disingenuous about the reality of your work. You talk like a German, for sure.

toufiqmusah commented 3 weeks ago

> Listen man. If you don't like it then don't use it. And please temper your expectations. nnU-Net is not a product. You are not paying any money for it. There are no people dedicated to making it more easily usable (other than me, and I have other things to do). This repo is 100x better maintained than other research code out there. We do our best to help people use it. Show some gratitude, and voice criticism in a constructive manner. Thanks!

Hello Fabian,

Best to just ignore the instigation. Probably just an attention-seeking troll.

We briefly spoke on Tuesday at MICCAI, by the way (Chez Ali party). Keep up the great work.

warmbasket commented 3 weeks ago

> Listen man. If you don't like it then don't use it. And please temper your expectations. nnU-Net is not a product. You are not paying any money for it. There are no people dedicated to making it more easily usable (other than me, and I have other things to do). This repo is 100x better maintained than other research code out there. We do our best to help people use it. Show some gratitude, and voice criticism in a constructive manner. Thanks!
>
> Best to just ignore the instigation. Probably just an attention-seeking troll.
>
> We briefly spoke on Tuesday at MICCAI, by the way (Chez Ali party). Keep up the great work.

Certainly not instigation or a troll; it's reality, and I'm sorry you're so butthurt. The updates at MICCAI were embarrassing.

warmbasket commented 3 weeks ago

Since you're all such emotional morons who can't stand critique of their work, here is literally your failure in your own words, from the latest updates at MICCAI: "methods introduced in recent years fail to surpass the nnunet benchmark..." Those are your methods, not others'; others have superseded you, yet you choose to deny this by picking and choosing datasets. Yes, because you guys created something that genuinely makes people more stupid and doesn't allow people to conduct actual research.

"This raises the question, how can we steer the field toward genuine progress?"

You fools realize that you've wasted colossal amounts of your own time and others', and you ask yourselves how you can actually do anything progressive.

And yet here you are, being pricks over the fact that you don't even allow researchers to edit the number of epochs in a sensible way. Your project is a complete and entire failure, a setback due to how you presented it. And you realize that NOW. And you're still defensive children.

It is beyond embarrassing that you're foolish enough to cast aside 2+2=4 as "instigation" because you literally don't want to hear it, lmfao.

Leave Germany; people will tell you how good your work actually is. Maybe then you guys won't waste so much time, as you've now realized.

warmbasket commented 3 weeks ago

@FabianIsensee @toufiqmusah

I'd never want to work with either of you based upon your complete inability to handle facts, and your exceptional ability to cast away common sense

warmbasket commented 3 weeks ago

@toufiqmusah Also, answer my question: why respond with nonsense? This will be a good exercise for your mind, lmfao.

FabianIsensee commented 3 weeks ago

Riddle me this: if this repository is not useful and we apparently need to cherry-pick datasets in order to make it seem like it's still relevant, then why is every segmentation challenge at MICCAI still won by people building on top of this repository?

There is no need to get this emotional. I agree that nnU-Net does not adhere to modern design standards. And I wish it would. Remember it was built in 2018 and still uses the same design principles. What you are asking for is a cosmetic rework that would require a lot of time from very skilled people to be done correctly and without messing things up in the process. I don't have funding for this, and even if I did, the question would be whether that is a good investment of our time (versus investing it in more research). After all, nnU-Net works just fine the way it currently is. Maybe not as convenient as it could be, but it works. Other frameworks may have nicer design principles, but they won't give you the same result. You do you.

Please refrain from being impolite. If you have concrete suggestions that can realistically be materialized, then we are all ears. If not, this will be my last response to your comments.

Was great meeting you, @toufiqmusah! Didn't realize that was you :-)