-
### Detailed Description
What can the previous models recognise, and how well?
What other classes are included in the models?
Are there newer versions of the models in PyTorch / Ultralytics?
Should…
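In case the last question is about loading newer checkpoints through Ultralytics, here is a minimal sketch of how that is typically done and how the class list can be inspected. It assumes the `ultralytics` package is installed; `yolov8n.pt` and `image.jpg` are placeholders, not necessarily the models or data discussed here.

```python
# Minimal sketch: load a newer Ultralytics checkpoint and inspect its classes.
# Assumes the `ultralytics` package is installed; "yolov8n.pt" and "image.jpg"
# are placeholders, not necessarily the models/data from this issue.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")        # downloads the checkpoint if not cached
print(model.names)                # dict mapping class index -> class name

results = model.predict("image.jpg", conf=0.25)  # run inference on one image
for box in results[0].boxes:
    cls_id = int(box.cls)
    print(model.names[cls_id], float(box.conf))
```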
-
## 🚀 Feature
Interpreting LiDAR / point-cloud based models
## Motivation
I'm frustrated and tired of integrating [PointNet](https://github.com/yanx27/Pointnet_Pointnet2_pytorch) and [LIME-3D](https://githu…
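For context, a LIME-style explanation of a point-cloud classifier can be sketched roughly as below. This is only an illustrative outline under stated assumptions, not the linked LIME-3D implementation: `classify` is a hypothetical stand-in for a PointNet-like model, and the super-point clustering and ridge surrogate are my own choices.

```python
# Rough LIME-style sketch for a point-cloud classifier (illustrative only).
# `classify(points) -> prob of target class` is a hypothetical stand-in for a
# PointNet-like model; clustering into super-points and a ridge surrogate are
# assumptions, not the linked LIME-3D implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

def explain_point_cloud(points, classify, n_segments=16, n_samples=200, seed=0):
    rng = np.random.default_rng(seed)
    # 1) Group points into "super-points" that can be switched on/off together.
    labels = KMeans(n_clusters=n_segments, n_init=10, random_state=seed).fit_predict(points)
    # 2) Sample binary masks over segments and score the perturbed clouds.
    masks = rng.integers(0, 2, size=(n_samples, n_segments))
    preds = []
    for mask in masks:
        keep = np.isin(labels, np.flatnonzero(mask))
        perturbed = points[keep] if keep.any() else points[:1]  # avoid empty input
        preds.append(classify(perturbed))
    # 3) Fit a linear surrogate: one weight per segment ~ segment importance.
    surrogate = Ridge(alpha=1.0).fit(masks, np.asarray(preds))
    return surrogate.coef_
```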
-
On the dataset page, it says:
LIDAR:
- 2 x SICK LMS-151 2D LIDAR, 270° FoV, 50Hz, 50m range, 0.5° resolution
- 1 x SICK LD-MRS 3D LIDAR, 85° HFoV, 3.2° VFoV, 4 planes, 12.5Hz, 50m range, 0.125° resolut…
-
Hello, I am very interested in the WildScenes dataset. In the WildScenes2d folder, I found the camera_calibration.yaml file, which contains the extrinsic parameters between the LiDAR and the camera. S…
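In case it helps, a minimal sketch of how such extrinsics are typically used to project LiDAR points into the camera is shown below. It assumes the YAML stores a 4x4 LiDAR-to-camera transform and pinhole intrinsics; the key names are made up, not the actual WildScenes camera_calibration.yaml schema.

```python
# Minimal sketch: project LiDAR points into the camera using extrinsics from a
# YAML file. The key names ("T_lidar_to_cam", "fx", ...) and the file
# "lidar_points.npy" are hypothetical, not the actual WildScenes schema.
import numpy as np
import yaml

with open("camera_calibration.yaml") as f:
    calib = yaml.safe_load(f)

T = np.asarray(calib["T_lidar_to_cam"], dtype=float).reshape(4, 4)   # extrinsics
fx, fy, cx, cy = calib["fx"], calib["fy"], calib["cx"], calib["cy"]  # intrinsics

points = np.load("lidar_points.npy")                 # (N, 3) LiDAR-frame points
pts_h = np.hstack([points, np.ones((len(points), 1))])
cam = (T @ pts_h.T).T[:, :3]                         # points in camera frame
in_front = cam[:, 2] > 0                             # keep points ahead of camera
u = fx * cam[in_front, 0] / cam[in_front, 2] + cx    # pixel coordinates
v = fy * cam[in_front, 1] / cam[in_front, 2] + cy
```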
-
Thanks for your excellent work! It's amazing!
I have a question about how the GT 3D points are obtained. Are they generated from the GT depth, or from a depth sensor such as a LiDAR or RGB-D camera d…
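For reference, if the GT 3D points were derived from a GT depth map, the usual back-projection would look like the sketch below. This is the generic pinhole formulation, not necessarily how the authors generated their GT.

```python
# Generic pinhole back-projection sketch: turn a GT depth map into 3D points.
# This only illustrates the usual depth -> point conversion; whether the GT
# here actually comes from depth or from a LiDAR/RGB-D sensor is the question.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """depth: (H, W) metric depth map; returns (H*W, 3) camera-frame points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel grids, shape (H, W)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```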
-
Thank you for your great work! I wonder whether OmniRe could do this; it doesn't seem to be mentioned on the project page.
-
We present Wildcat, a novel online 3D lidar-inertial SLAM system with exceptional versatility and robustness. At its core, Wildcat combines a robust real-time lidar-inertial odometry module, utilising…
-
### Prerequisite
- [X] I have searched [Issues](https://github.com/open-mmlab/mmdetection3d/issues) and [Discussions](https://github.com/open-mmlab/mmdetection3d/discussions) but cannot get the expec…
-
In the code, the voxels are read directly from a file, and the paper doesn't explain how they are generated.
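For what it's worth, a common way to produce such voxels from raw points is the quantisation sketched below. This is a generic recipe under an assumed fixed voxel size, not taken from the paper or the repo, where the voxels are loaded precomputed.

```python
# Generic voxelisation sketch: quantise raw points onto a regular grid.
# Not taken from the paper/repo (which loads precomputed voxels from file);
# the voxel_size value and the min-corner origin are assumptions.
import numpy as np

def voxelize(points, voxel_size=0.05):
    """points: (N, 3) array; returns per-point voxel indices and occupied voxels."""
    origin = points.min(axis=0)                       # grid origin at min corner
    idx = np.floor((points - origin) / voxel_size).astype(np.int64)
    occupied = np.unique(idx, axis=0)                 # one row per occupied voxel
    return idx, occupied
```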