Closed — Aguin closed this issue 1 year ago
Hi @Aguin, thanks for your suggestions. Here are my responses to questions 1-4.
In the coming update, `num_match_mat` will only count `both_visible_indices`, i.e., it excludes the `both_invisible_indices` case.
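The counting rule above can be sketched as follows. This is a minimal illustration, not the repository's actual code; the variable names (`gt_vis`, `pred_vis`) are hypothetical:

```python
import numpy as np

# Hypothetical per-point visibility flags for one matched gt/pred lane pair.
gt_vis = np.array([1, 1, 0, 1, 0], dtype=bool)
pred_vis = np.array([1, 0, 0, 1, 1], dtype=bool)

# Only points visible in BOTH lanes contribute to the match count;
# the both-invisible case is excluded entirely.
both_visible = gt_vis & pred_vis
num_match = int(both_visible.sum())
print(num_match)  # 2
```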
We have excluded those `-1` values in lines 391-394. At present, there are no negative `x_error`/`z_error` values in our evaluation results:
```python
x_error_close_avg = np.average(laneline_x_error_close[laneline_x_error_close > -1 + 1e-6])
x_error_far_avg = np.average(laneline_x_error_far[laneline_x_error_far > -1 + 1e-6])
z_error_close_avg = np.average(laneline_z_error_close[laneline_z_error_close > -1 + 1e-6])
z_error_far_avg = np.average(laneline_z_error_far[laneline_z_error_far > -1 + 1e-6])
```
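To show how the `> -1 + 1e-6` comparison works, here is a small self-contained sketch (the sample values are made up): `-1` is a sentinel for unmatched lanes, and the epsilon guards the floating-point comparison against exactly `-1`:

```python
import numpy as np

# Made-up error array; -1.0 marks lanes with no valid match (sentinel).
laneline_x_error_close = np.array([0.3, -1.0, 0.5, -1.0, 0.2])

# Keep only real errors; the epsilon avoids comparing floats for equality.
valid = laneline_x_error_close > -1 + 1e-6
x_error_close_avg = np.average(laneline_x_error_close[valid])
print(round(x_error_close_avg, 6))  # average of [0.3, 0.5, 0.2]
```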
After being loaded from our JSON file, `pred_lanes` is a `list`. Each item in the list represents a single lane as a `numpy.ndarray` of shape (N, 3), where N is the number of points on the lane. During our evaluation, no additional type-conversion code needs to be added at this time.
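A minimal sketch of that expected format, with made-up coordinate values: a plain Python list whose elements are (N, 3) arrays, where N may differ between lanes (which is why the outer container is a list rather than a single array):

```python
import numpy as np

# Each lane is an (N, 3) array of (x, y, z) points; N varies per lane.
pred_lanes = [
    np.array([[0.0, 5.0, 0.1], [0.1, 10.0, 0.1], [0.2, 15.0, 0.2]]),  # 3 points
    np.array([[1.0, 5.0, 0.0], [1.1, 10.0, 0.0]]),                    # 2 points
]

assert isinstance(pred_lanes, list)
assert all(lane.ndim == 2 and lane.shape[1] == 3 for lane in pred_lanes)
```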
We will further consider whether `gt_visibility` should be introduced into the evaluation, and we will let you know promptly if there is any update in the future.
For issues 3 and 4, we hope you can provide more information for further discussion.
@zihanding819 I really appreciate how quickly you have resolved these issues, and I believe the evaluation metric will be more convincing once the first two issues are fixed.
For issue 3, I think it's okay to let users pass in lists of ndarrays or lists of lists, as long as the format is stated in the comments. I raised this only because I saw `np.array()` here https://github.com/OpenPerceptionX/OpenLane/blob/main/eval/LANE_evaluation/lane3d/eval_3D_lane.py#L98 and thought it might be a little confusing.
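To illustrate why that conversion can be confusing (a hedged sketch with made-up lanes, not the repository's code): lanes usually have different point counts, so `np.array()` over the whole list cannot form a regular 3-D array, whereas converting lane by lane is unambiguous:

```python
import numpy as np

# Made-up lanes with different point counts (3 vs 2).
lanes = [
    [[0.0, 5.0, 0.1], [0.1, 10.0, 0.1], [0.2, 15.0, 0.2]],
    [[1.0, 5.0, 0.0], [1.1, 10.0, 0.0]],
]

# Converting lane by lane yields a well-shaped (N, 3) float array for
# each lane, regardless of whether point counts differ between lanes.
lanes_nd = [np.array(lane) for lane in lanes]
print([lane.shape for lane in lanes_nd])  # [(3, 3), (2, 3)]
```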
Besides, I think some invisible points are actually noisy data, because they look weird, although I'm not sure how they would affect the evaluation. You can easily find these weird frames by sampling some data; here are some examples.
@Aguin The visibility attribute is supposed to deal with this noise, as shown in the middle figures. Invisible gt points will not affect the evaluation.
Hi @ChonghaoSima @zihanding819, I found some bugs in v1.1:

1. `both_invisible_indices` are counted in `num_match_mat`, but at https://github.com/OpenPerceptionX/OpenLane/blob/main/eval/LANE_evaluation/lane3d/eval_3D_lane.py#L236 the denominator only counts visible points.
2. `-1`s should be removed before computing the averages.
3. `pred_lanes` should be converted to an ndarray before https://github.com/OpenPerceptionX/OpenLane/blob/main/eval/LANE_evaluation/lane3d/eval_3D_lane.py#L9518
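The first bug amounts to a numerator/denominator mismatch. A minimal sketch (all variable names and values here are illustrative, not the repository's actual implementation) of keeping the two consistent: if the denominator counts only visible ground-truth points, the numerator must exclude both-invisible matches:

```python
import numpy as np

# Hypothetical per-point flags for one matched gt/pred lane pair.
gt_vis = np.array([1, 1, 0, 0, 1], dtype=bool)
pred_vis = np.array([1, 0, 0, 1, 1], dtype=bool)
matched = np.array([1, 1, 1, 0, 1], dtype=bool)  # point-wise distance test passed

# Numerator: matched points that are visible in both gt and prediction.
num_match = int((matched & gt_vis & pred_vis).sum())
# Denominator: visible gt points only, matching the numerator's convention.
num_gt_visible = int(gt_vis.sum())
recall = num_match / num_gt_visible
print(num_match, num_gt_visible)  # 2 3
```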