sandersson94 opened this issue 3 years ago (status: Open)
Ah - it's irrelevant since the axis is the same length. Ignore this!
I wanted to share a concern about the function used for the evaluation metric, compute_f1_score_at_tolerance.
Could the evaluation metric unjustly reward over-prediction of trees?
Here is an example where my prediction contains 38 tree pixels versus a ground truth of 12 tree pixels. The metric reports 100% precision (12 TP and 0 FP), even though the discrepancy is not a coregistration error but a commission error, and that commission error is not picked up by the modified evaluation metric.
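To illustrate the concern, here is a minimal toy sketch of a tolerance-based F1, assuming true positives are counted per ground-truth pixel matched within the tolerance and false positives only for predicted pixels with no ground-truth pixel within the tolerance. The actual compute_f1_score_at_tolerance in 4-model.ipynb may be implemented differently, and the array sizes and pixel counts below are made up for illustration:

```python
import numpy as np


def f1_at_tolerance(true, pred, tol=1):
    """Toy tolerance-based F1 (illustration only, not the notebook's code).

    A ground-truth tree pixel is a true positive if any predicted tree pixel
    lies within `tol` pixels of it (Chebyshev distance); a predicted tree
    pixel is a false positive only if no ground-truth tree pixel lies within
    `tol` pixels of it.
    """
    true_yx = np.argwhere(true == 1)
    pred_yx = np.argwhere(pred == 1)
    tp = sum(
        1 for ty, tx in true_yx
        if any(max(abs(int(ty) - int(py)), abs(int(tx) - int(px))) <= tol for py, px in pred_yx)
    )
    fp = sum(
        1 for py, px in pred_yx
        if not any(max(abs(int(ty) - int(py)), abs(int(tx) - int(px))) <= tol for ty, tx in true_yx)
    )
    fn = len(true_yx) - tp
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1


# Ground truth: a 4 x 3 block of 12 tree pixels. Prediction: a 6 x 5 block of
# 30 tree pixels that fully surrounds the true block, so every predicted pixel
# is within 1 pixel of a true pixel and every true pixel is matched.
true = np.zeros((14, 14), dtype=int)
true[5:9, 5:8] = 1
pred = np.zeros((14, 14), dtype=int)
pred[4:10, 4:9] = 1

precision, recall, f1 = f1_at_tolerance(true, pred, tol=1)
print(pred.sum(), true.sum(), precision)  # 30 predicted vs 12 true pixels, precision == 1.0
```

Under these assumptions the prediction covers 2.5 times as many pixels as the ground truth, yet precision is still 1.0, because no predicted pixel is farther than the tolerance from a true pixel.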
I hope you consider this useful!
Best regards, Simon
Describe the bug
Hello! Please forgive me if I am wrong!
In the Evaluation metrics box [33] in 4-model.ipynb, max_y should reference true.shape[1] rather than true.shape[0].
https://github.com/wri/sentinel-tree-cover/blob/master/notebooks/4-model.ipynb?short_path=af94e1f#L1269
https://github.com/wri/sentinel-tree-cover/blob/master/notebooks/4-model.ipynb?short_path=af94e1f#L1292
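For context, a minimal sketch of the suspected mix-up (variable names and the array shape are assumed for illustration; the actual cell in 4-model.ipynb may differ):

```python
import numpy as np

true = np.zeros((14, 14))   # placeholder ground-truth array; the real shape may differ

# As reported for the notebook cell (suspected bug): both bounds come from axis 0
max_x = true.shape[0]
max_y = true.shape[0]

# Suggested fix: take max_y from axis 1
max_y = true.shape[1]

# Note: when the array is square, true.shape[0] == true.shape[1], so the mix-up
# has no practical effect, which is why the report was later marked irrelevant.
```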