MIC-DKFZ / nnUNet

Apache License 2.0

Random dropout of labels #2028

Closed Merom99 closed 5 months ago

Merom99 commented 8 months ago

Hi, I am using nnunetv1 and I have data with different numbers of labels. I would like to randomly drop out a specific label, e.g. label=2. I tried to do that in nnUNetTrainerV2 by adding label-dropout code to the run_iteration function; however, it returns this error: "RuntimeError: One or more background workers are no longer alive. Exiting. Please check the print statements above for the actual error message Exception in thread Thread-4 (results_loop): Traceback (most recent call last):" Any idea how to solve that?

Here is my addition:

```python
def run_iteration(self, data_generator, do_backprop=True, run_online_evaluation=False):
    """
    gradient clipping improves training stability

    :param data_generator:
    :param do_backprop:
    :param run_online_evaluation:
    :return:
    """
    data_dict = next(data_generator)
    data = data_dict['data']
    target = data_dict['target']

    data = maybe_to_torch(data)
    target = maybe_to_torch(target)

    if torch.cuda.is_available():
        data = to_cuda(data)
        target = to_cuda(target)

    # dropout: randomly remove label 2 (requires numpy imported as np at module level)
    if self.prob > 0.0:
        for label in target:
            if np.random.rand() < self.prob:
                label[label == 2] = 0
                print('label 2 is omitted')
```
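For what it's worth, the dropout logic can be tested on its own outside the trainer. Below is a minimal sketch (the function name `drop_label` and its defaults are mine, not part of nnU-Net). One caveat: in nnUNetTrainerV2 with deep supervision, `target` is a list of segmentation tensors at several resolutions, so drawing a fresh random number per element drops the label inconsistently across scales; drawing once per batch avoids that. The demo uses NumPy arrays, but the same boolean-mask assignment works in place on torch tensors:

```python
import numpy as np

def drop_label(target, label_id=2, prob=0.5, rng=None):
    """Randomly replace `label_id` with background (0), using ONE random
    draw so all deep-supervision scales stay consistent with each other."""
    rng = rng if rng is not None else np.random.default_rng()
    if rng.random() < prob:
        seg_list = target if isinstance(target, (list, tuple)) else [target]
        for seg in seg_list:
            seg[seg == label_id] = 0  # in-place boolean-mask assignment
    return target

# demo with prob=1.0 so the drop always fires
seg = np.array([[0, 1, 2, 2],
                [2, 1, 0, 1]])
drop_label(seg, label_id=2, prob=1.0)
```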
HouXiao01 commented 7 months ago

I met the same problem too! I can't figure it out either. I am trying different environment configurations to solve it!

Merom99 commented 6 months ago

Thanks @HouXiao01, did you get any updates?

mrokuss commented 6 months ago

Hey @Merom99

Thank you for your question. First of all, I would suggest migrating to nnUNetV2, since V1 is no longer supported. The error message you are seeing is a standard error that occurs during multiprocessing when one of the workers crashed (for whatever reason); usually the actual error message can be found above it. Furthermore, if you want to train nnUNet with a different number of labels per dataset, I would suggest taking a look at MultiTalent.
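As a generic way to make the real traceback visible (this is plain Python, not an nnU-Net API), you can wrap the failing training step so the full exception is printed before the background-worker machinery reports its own "workers are no longer alive" error:

```python
import traceback

def debug_step(step_fn, *args, **kwargs):
    """Run one training step and print the full traceback if it raises."""
    try:
        return step_fn(*args, **kwargs)
    except Exception:
        traceback.print_exc()  # the real error, before it propagates
        raise

# usage sketch: debug_step(trainer.run_iteration, trainer.tr_gen, True, False)
```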

Best,

Max

mrokuss commented 5 months ago

Closing. Feel free to reopen if you still have questions!