I am trying to calculate precision and recall as metrics for TuSimple, yet the evaluation results only provide fp, fn, and accuracy, so I am wondering: what is accuracy? Is it the true positive rate?
I think you can refer to the tusimple-benchmark repo on GitHub for the evaluation code. It shows how TP, FP, and FN are counted, so it will be easy for you to compute the precision and recall metrics yourself.
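As a minimal sketch, assuming the evaluation reports `fp` and `fn` as rates (wrongly predicted lanes over all predicted lanes, and missed lanes over all ground-truth lanes, respectively, as in the tusimple-benchmark `lane.py` script), precision and recall fall out directly; the function name and the sample rates below are just illustrative:

```python
def precision_recall(fp_rate, fn_rate):
    """Derive precision/recall from TuSimple-style fp/fn rates.

    Assumes: fp_rate = FP / (TP + FP) over predicted lanes,
             fn_rate = FN / (TP + FN) over ground-truth lanes.
    """
    precision = 1.0 - fp_rate  # TP / (TP + FP)
    recall = 1.0 - fn_rate     # TP / (TP + FN)
    return precision, recall

# Hypothetical example values, not real benchmark output:
p, r = precision_recall(0.1, 0.2)
print(p, r)
```

Note that the `accuracy` field is a separate, point-level metric (correctly located lane points over ground-truth points), not the true positive rate, so it is not needed for this calculation.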