Closed chenfh21 closed 8 months ago
Hi Chenfh21,
We initialize the IoU evaluator as follows:
`self.metric_val_iou = torchmetrics.JaccardIndex(self.network.num_classes, reduction=None)`
where we set the `reduction` argument to `None`. Thus, the line `iou_per_class = self.metric_val_iou.compute()` returns a Tensor of shape `[n_classes]`, i.e., `[iou_background, iou_crop, iou_weed]`. Consequently, the line `for class_index, iou_class in enumerate(iou_per_class)` iterates over the IoU score of each class.
Please note that torchmetrics renamed the `reduction` argument in the most recent version; it is now called `average`.
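To make the shape of the un-reduced result concrete, here is a minimal sketch (with hypothetical toy labels, not PhenoBench data) of what a per-class IoU computation yields when no averaging is applied: one score per class, in the order [background, crop, weed].

```python
# Minimal sketch of per-class IoU without reduction/averaging.
# The labels below are hypothetical: 0 = background, 1 = crop, 2 = weed.

def iou_per_class(preds, targets, num_classes):
    """Compute intersection-over-union separately for each class."""
    scores = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(preds, targets) if p == c and t == c)
        union = sum(1 for p, t in zip(preds, targets) if p == c or t == c)
        scores.append(inter / union if union else 0.0)
    return scores

# Toy flattened prediction and ground-truth maps (hypothetical values)
preds   = [0, 0, 1, 1, 2, 2, 0, 1]
targets = [0, 0, 1, 2, 2, 2, 0, 1]

scores = iou_per_class(preds, targets, num_classes=3)
for class_index, iou_class in enumerate(scores):
    print(f"class {class_index}: IoU = {iou_class:.3f}")
```

This mirrors what `self.metric_val_iou.compute()` returns with `reduction=None` (or `average=None` in recent torchmetrics): a length-`n_classes` container of scores rather than a single averaged value.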
Thank you very much,
https://github.com/PRBonn/phenobench-baselines/blob/78db625441e54c5b64efabeb0d886d020961665b/semantic_segmentation/modules/module.py#L185 Hi: I am confused by this line. My understanding is that `iou_per_class` should hold a set of computed per-class values, but in the previous step `iou_per_class` is computed as a single tensor.
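The point of confusion above can be illustrated directly: a 1-D array or tensor is itself iterable, so passing it to `enumerate()` yields one `(index, score)` pair per class. A short sketch with a hypothetical NumPy array of per-class scores:

```python
# A 1-D array (or a 1-D torch Tensor) is iterable element by element,
# so enumerate() produces one (class_index, iou_class) pair per class.
import numpy as np

iou_per_class = np.array([0.95, 0.80, 0.60])  # hypothetical per-class IoUs

for class_index, iou_class in enumerate(iou_per_class):
    print(class_index, float(iou_class))
```

So the tensor returned by `compute()` is not a single scalar: with `reduction=None` it is a vector of scores, and the loop visits each class's IoU in turn.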