PRBonn / phenobench-baselines

Baselines of the PhenoBench Dataset
https://www.phenobench.org

iou_per_class ERROR #4

Closed. chenfh21 closed this issue 8 months ago.

chenfh21 commented 8 months ago

https://github.com/PRBonn/phenobench-baselines/blob/78db625441e54c5b64efabeb0d886d020961665b/semantic_segmentation/modules/module.py#L185 Hi, I am confused by this line. My understanding is that iou_per_class should hold a set of computed per-class values that can be iterated over, but in the previous step iou_per_class is computed as a single tensor value.

JaWeyl commented 8 months ago

Hi Chenfh21,

We initialize the IoU evaluator as follows: self.metric_val_iou = torchmetrics.JaccardIndex(self.network.num_classes, reduction=None), i.e., we set the reduction argument to None.

Thus, the line iou_per_class = self.metric_val_iou.compute() returns a Tensor of shape (n_classes,), i.e., [iou_background, iou_crop, iou_weed].

Consequently, the line for class_index, iou_class in enumerate(iou_per_class) iterates over the per-class IoU scores.

Please note that torchmetrics renamed the reduction argument in its most recent versions; it is now called average.
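
For reference, here is a minimal, self-contained sketch of the same pattern outside the training module, assuming torchmetrics >= 0.11, where the argument is called average and a task has to be specified. The class names and tensor shapes below are placeholders for illustration, not taken from the repository code.

```python
import torch
import torchmetrics

# Hypothetical stand-ins for self.network.num_classes and the class order.
num_classes = 3
class_names = ["background", "crop", "weed"]

# average=None (formerly reduction=None) keeps one IoU score per class
# instead of averaging them into a single scalar.
metric_val_iou = torchmetrics.JaccardIndex(
    task="multiclass", num_classes=num_classes, average=None
)

# Dummy integer predictions and labels with shape (batch, height, width).
preds = torch.randint(0, num_classes, (4, 128, 128))
target = torch.randint(0, num_classes, (4, 128, 128))
metric_val_iou.update(preds, target)

# compute() returns a tensor of shape (num_classes,), not a single scalar,
# so enumerating it yields the IoU of each class in turn.
iou_per_class = metric_val_iou.compute()
for class_index, iou_class in enumerate(iou_per_class):
    print(f"{class_names[class_index]}: {iou_class.item():.4f}")
```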

chenfh21 commented 8 months ago

Thank you very much!