skyhehe123 / SA-SSD

SA-SSD: Structure Aware Single-stage 3D Object Detection from Point Cloud (CVPR 2020)

Unable to test on a very small dataset (ValueError: need at least one array to concatenate) #95

Open CharviVitthal opened 2 years ago

CharviVitthal commented 2 years ago

Hello, I want to test on a very small dataset. I reduced my validation set to only 3 point clouds (the same issue also occurs with only 1 or 2 point clouds). The folder has the required structure: labels, images, calib, velodyne, velodyne_reduced. create_data.py works fine, but test.py gives the following error. When I increase my dataset size, everything works fine again.

```
Traceback (most recent call last):
  File "test.py", line 209, in <module>
    main()
  File "test.py", line 204, in main
    result = get_official_eval_result(gt_annos, outputs, current_classes=class_names)
  File "/home/charvi/SA-SSD/mmdet/core/evaluation/kitti_eval.py", line 839, in get_official_eval_result
    gt_annos, dt_annos, current_classes, min_overlaps, compute_aos, difficultys)
  File "/home/charvi/SA-SSD/mmdet/core/evaluation/kitti_eval.py", line 708, in do_eval_v2
    min_overlaps, compute_aos)
  File "/home/charvi/SA-SSD/mmdet/core/evaluation/kitti_eval.py", line 586, in eval_class_v3
    rets = calculate_iou_partly(dt_annos, gt_annos, metric, num_parts)
  File "/home/charvi/SA-SSD/mmdet/core/evaluation/kitti_eval.py", line 376, in calculate_iou_partly
    gt_boxes = np.concatenate([a["bbox"] for a in gt_annos_part], 0)
  File "<__array_function__ internals>", line 6, in concatenate
ValueError: need at least one array to concatenate
```

Any help is appreciated. (At line 376, "gt_boxes = np.concatenate([a["bbox"] for a in gt_annos_part], 0)", gt_annos_part ends up being an empty list.)
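
From the traceback, the failure seems to happen because calculate_iou_partly splits the annotations into num_parts chunks before computing IoUs: with fewer samples than chunks, some chunks come out empty, and np.concatenate over an empty list raises exactly this ValueError. Below is a minimal sketch of that failure mode and a possible workaround. The split_sizes helper and the num_parts=1 suggestion are assumptions based on how similar KITTI eval code is typically structured, not a quote of the SA-SSD implementation.

```python
import numpy as np

def split_sizes(num_examples, num_parts):
    """Hypothetical partition scheme resembling what kitti_eval appears to do:
    num_parts equal-sized chunks plus a remainder chunk."""
    same_part = num_examples // num_parts
    remain = num_examples % num_parts
    return [same_part] * num_parts + ([remain] if remain else [])

# With only 3 samples and a larger default number of parts (e.g. 50),
# most chunk sizes are 0, so the corresponding annotation slices are empty.
sizes = split_sizes(3, 50)
print(sizes[:5], "...", sizes[-1])   # [0, 0, 0, 0, 0] ... 3

empty_chunk = []  # an annotation slice belonging to one of the empty chunks

try:
    # Same call shape as kitti_eval.py line 376, applied to an empty chunk.
    np.concatenate([a["bbox"] for a in empty_chunk], 0)
except ValueError as e:
    print(e)  # "need at least one array to concatenate"

# Possible workaround (assumption): if num_parts is exposed where eval_class_v3
# / calculate_iou_partly are called, setting it to 1 keeps every chunk non-empty
# for tiny datasets; alternatively, skip zero-sized chunks before concatenating.
```
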

Best regards, Charvi