Open: ohcccc opened 2 months ago
In addition, there is a bug in the function RobustCrossEntropyLoss():

```
Exception occurred: UnboundLocalError
(note: full exception trace is shown but execution is paused at: _run_module_as_main)
cannot access local variable 'temp_after_DisMap' where it is not associated with a value
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/loss_functions/crossentropy.py", line 53, in forward
    return temp_after_DisMap
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/loss_functions/dice_loss.py", line 378, in forward
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/loss_functions/deep_supervision.py", line 29, in forward
    l = weights[0] * self.loss(x[0], y[0], disMap[0], epoch)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/network_training/nnUNetTrainerV2.py", line 252, in run_iteration
    l = self.loss(output, target, disMap, self.epoch)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/network_training/network_trainer.py", line 462, in run_training
    l = self.run_iteration(self.tr_gen, True)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/network_training/nnUNetTrainer.py", line 321, in run_training
    super(nnUNetTrainer, self).run_training()
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/site-packages/nnunet/training/network_training/nnUNetTrainerV2.py", line 446, in run_training
    ret = super().run_training()
  File "/home/icml/code/FracSegNet/code/Training/fracSegNet/run/run_training.py", line 179, in main
    trainer.run_training()
  File "/home/icml/code/FracSegNet/code/Training/fracSegNet/run/run_training.py", line 199, in <module>
    main()
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/runpy.py", line 88, in _run_code
    exec(code, run_globals)
  File "/home/icml/miniconda3/envs/nnunet/lib/python3.12/runpy.py", line 198, in _run_module_as_main (Current frame)
    return _run_code(code, main_globals, None,
UnboundLocalError: cannot access local variable 'temp_after_DisMap' where it is not associated with a value
```
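For anyone reading the trace above: an `UnboundLocalError` raised at a `return` statement usually means the returned variable was only assigned inside a conditional branch that was never taken. A hypothetical sketch of that pattern, with the missing fallback added (this is not the actual FracSegNet code; the class name, arguments, and shapes are all illustrative):

```python
import torch
import torch.nn as nn

class RobustCrossEntropyLossSketch(nn.Module):
    """Hypothetical reconstruction of the failing pattern, with a fallback added."""

    def __init__(self):
        super().__init__()
        # reduction='none' keeps a per-voxel loss so it can be weighted later
        self.ce = nn.CrossEntropyLoss(reduction='none')

    def forward(self, net_output, target, dis_map=None, epoch=None):
        per_voxel = self.ce(net_output, target)
        if dis_map is not None:
            # a variable assigned only in this branch...
            temp_after_DisMap = (per_voxel * dis_map).mean()
        else:
            # ...needs a fallback, otherwise `return temp_after_DisMap`
            # raises UnboundLocalError whenever dis_map is None
            temp_after_DisMap = per_voxel.mean()
        return temp_after_DisMap
```

If the `forward` at crossentropy.py line 53 returns a variable that is only set under an `if`, any call path that skips that branch would reproduce the reported error.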
Hello, thank you for running the FracSegNet code. I didn't encounter the issue you mentioned while running the code. Could you please provide the shape or size of the variable `disMap2onehot` so that I can help identify the problem?
Thank you for bringing up this issue. When modifying the RobustCrossEntropyLoss function, I changed the `reduction` parameter of `nn.CrossEntropyLoss` to `none` for the calculation of the loss. You can find and modify this parameter in `torch.nn.modules.loss.py`. Your issue might be caused by this, and I apologize for any inconvenience.
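As a side note, the `reduction` behaviour can also be chosen when constructing the loss, so editing `torch/nn/modules/loss.py` is not required. A minimal sketch with illustrative shapes (the distance-map weighting shown here is an assumption about how the per-voxel losses are meant to be used):

```python
import torch
import torch.nn as nn

ce = nn.CrossEntropyLoss(reduction='none')   # per-voxel losses instead of one scalar

logits = torch.randn(2, 4, 8, 8, 8)          # (batch, classes, x, y, z)
target = torch.randint(0, 4, (2, 8, 8, 8))   # integer class labels per voxel
dis_map = torch.rand(2, 8, 8, 8)             # hypothetical per-voxel weight map

per_voxel = ce(logits, target)               # shape (2, 8, 8, 8)
loss = (per_voxel * dis_map).mean()          # weight each voxel, then reduce
```

Constructing the loss with `reduction='none'` in the project's own code keeps the change local instead of altering the installed torch package for every project in the environment.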
Thank you for your reply. The `disMap2onehot` shape is (8, 4, 218, 218, 218) and the `tp` shape is (8, 4, 128, 128, 128). I don't know where the mistake lies.
@YzzLiu
Hi, thanks for your great work. I'm having some issues running your code. Can you give me some help? @YzzLiu

In function get_tp_fp_fn_tn():

```python
def get_tp_fp_fn_tn(net_output, gt, disMap=None, axes=None, mask=None, square=False, current_epoch=None):
    smooth_trans = True
    Tauo_st = 0
    st_epoch = 1000
    ...
    tp = torch.mul(tp, disMap2onehot)
```

there is a bug: `The size of tensor a (128) must match the size of tensor b (218) at non-singleton dimension 4`. I added `disMap2onehot = torch.nn.functional.interpolate(disMap2onehot, size=tp.shape[2:])`, but I don't know if it's the right thing to do.
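Interpolating the distance map onto the prediction grid does make the element-wise multiply shape-compatible; whether a (218, 218, 218) map is the intended input in the first place is a separate question for the authors. A scaled-down sketch of that workaround (shapes reduced from the reported ones to keep it cheap):

```python
import torch
import torch.nn.functional as F

# shapes scaled down from the reported (8, 4, 218, 218, 218) vs (8, 4, 128, 128, 128)
tp = torch.rand(2, 4, 16, 16, 16)
disMap2onehot = torch.rand(2, 4, 27, 27, 27)

# resample the map onto the prediction grid before the element-wise multiply;
# mode='nearest' (the default) copies voxel values without blending, while
# mode='trilinear' would produce smoothly interpolated weights instead
disMap2onehot = F.interpolate(disMap2onehot, size=tp.shape[2:])
tp = torch.mul(tp, disMap2onehot)
print(tp.shape)  # torch.Size([2, 4, 16, 16, 16])
```

For a one-hot map, nearest-neighbour resampling has the advantage of keeping the values exactly 0 or 1.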