tianweiy / MVP


Visualization #36

Open Adam1904 opened 1 year ago

Adam1904 commented 1 year ago

Thanks for the great work. I have some questions:

1- How can I visualize real points and virtual points in BEV (Bird's Eye View), both with and without an image, similar to Figure 1 and Figure 3c and 3d? Could you please explain how the figures were visualized?

2- Could you please explain how Figure 4 was visualized?

3- And one more question, please. In the paper:

For instance segmentation, we use CenterNet2 [73] which adds a cascade RoI heads [3] on top of the first stage proposal network. The overall network runs at 40 FPS and achieves 43.3 instance segmentation mAP on the nuScenes image dataset [2].

What hyperparameters did you use to get this result (43.3 instance segmentation mAP)? What learning rate? How many iterations? How many images per batch? How many epochs?

tianweiy commented 1 year ago
1. Figure 1 uses Open3D, and only the virtual points inside bounding boxes are highlighted. The right side of Figure 1 is a zoom-in (in the UI) plus screenshots. For Figure 3, a, b, and c are illustration plots I drew in Keynote; d is also a zoom-in + screenshot.

2. Figure 4 is just matplotlib or something similar. We collapsed the z-axis of all lidar points and drew the boxes on top (a rough sketch of the idea is below).

3. I don't remember the details now, but I think the config is similar to what we used here: https://github.com/xingyizhou/CenterNet2/blob/master/configs/nuImages_CenterNet2_DLA_640_8x.yaml
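
Roughly, the idea for Figure 4 looks like the sketch below (not my actual plotting script; the (x, y, dx, dy, yaw) box layout here is just an assumption):

import numpy as np
import matplotlib.pyplot as plt

def plot_bev(points, boxes=None):
    """points: (N, >=3) lidar array; boxes: iterable of (x, y, dx, dy, yaw) (assumed layout)."""
    fig, ax = plt.subplots(figsize=(8, 8))
    # Collapse the z-axis: scatter only the x and y columns of the point cloud.
    ax.scatter(points[:, 0], points[:, 1], s=0.2, c='gray')

    for x, y, dx, dy, yaw in ([] if boxes is None else boxes):
        # Box corners centered at the origin, then rotated by yaw and translated to (x, y).
        corners = np.array([[ dx / 2,  dy / 2], [ dx / 2, -dy / 2],
                            [-dx / 2, -dy / 2], [-dx / 2,  dy / 2]])
        rot = np.array([[np.cos(yaw), -np.sin(yaw)],
                        [np.sin(yaw),  np.cos(yaw)]])
        corners = corners @ rot.T + np.array([x, y])
        # Close the polygon and draw the box outline.
        ax.plot(*np.vstack([corners, corners[:1]]).T, c='green', linewidth=1)

    ax.set_aspect('equal')
    plt.show()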

Adam1904 commented 1 year ago

Is there any code for that, please? You mentioned that Figure 1 uses Open3D and that only the virtual points inside bounding boxes are highlighted. I tried using this code https://github.com/tianweiy/CenterPoint/blob/master/tools/visual.py, but it still doesn't work. I executed the following command:

python ./tools/visual.py --path ./dataa/nuScenes/samples/LIDAR_TOP_VIRTUAL/n008-2018-05-21-11-06-59-0400__LIDAR_TOP__1526915243047392.pcd.bin.pkl.npy

And I modified main in visual.py:

import argparse

import numpy as np
import open3d as o3d

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description="LIDAR_TOP_VIRTUAL")
    parser.add_argument('--path', help='path to visualization file', type=str)
    args = parser.parse_args()

    # The .pkl.npy file stores a pickled dict; .item() recovers it.
    data = np.load(args.path, allow_pickle=True).item()
    virtual_points = data['virtual_points']

    # Build an Open3D point cloud from the xyz columns of the virtual points.
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(virtual_points[:, :3])

    # Opens a blocking visualization window.
    o3d.visualization.draw_geometries([pcd])

However, it hangs and the result is not displayed. Also, how can I use the detections and scores to plot boxes? Any help would be appreciated. Thank you for your reply.
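
For reference, here is the kind of thing I have in mind for the boxes: a minimal Open3D sketch assuming 7-DoF detections (x, y, z, dx, dy, dz, yaw) with per-box scores. I'm not sure this layout matches MVP's actual output format:

import numpy as np
import open3d as o3d

def detection_to_obb(box, color=(0.0, 1.0, 0.0)):
    # Assumed 7-DoF layout: center (x, y, z), size (dx, dy, dz), heading yaw.
    x, y, z, dx, dy, dz, yaw = box
    rot = o3d.geometry.get_rotation_matrix_from_axis_angle(np.array([0.0, 0.0, yaw]))
    obb = o3d.geometry.OrientedBoundingBox([x, y, z], rot, [dx, dy, dz])
    obb.color = color
    return obb

def show_scene(real_points, virtual_points, boxes, scores, thresh=0.3):
    real = o3d.geometry.PointCloud()
    real.points = o3d.utility.Vector3dVector(np.asarray(real_points)[:, :3])
    real.paint_uniform_color([0.6, 0.6, 0.6])   # real points in gray

    virt = o3d.geometry.PointCloud()
    virt.points = o3d.utility.Vector3dVector(np.asarray(virtual_points)[:, :3])
    virt.paint_uniform_color([1.0, 0.0, 0.0])   # virtual points highlighted in red

    # Keep only boxes above the score threshold and draw everything together.
    obbs = [detection_to_obb(b) for b, s in zip(boxes, scores) if s > thresh]
    o3d.visualization.draw_geometries([real, virt] + obbs)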