For example, there are lidar2rgb_center, rgb_highres_center_intrinsic, and rgb_highres_center2ego in osdar23_converter.py, so how can we get lidar2ir_left, ir_left_intrinsic, and ir_left2ego based on calibration.txt or another file? Many thanks!
```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Pose of the lidar in ego coordinates (identity rotation, zero translation)
lidar2ego = np.eye(4)
lidar2ego[:3, 3] = [0, 0, 0]
lidar2ego[:3, :3] = R.from_quat([0, 0, 0, 1]).as_matrix()

# Pose of rgb_left in ego coordinates
rgb_left2ego = np.eye(4)
rgb_left2ego[:3, 3] = [0.0529804, -0.104374, 3.50403]
rgb_left2ego[:3, :3] = R.from_quat([0.203761, 0.0530686, -0.0108971, 0.977521]).as_matrix()

# Compose: lidar -> ego -> rgb_left
lidar2rgb_left = np.linalg.inv(rgb_left2ego) @ lidar2ego

print("Calculated lidar2rgb_left:")
print(lidar2rgb_left)
```
```
Calculated lidar2rgb_left:
[[ 9.94129959e-01  3.22333549e-04 -1.08192055e-01  3.26472448e-01]
 [ 4.29308727e-02  9.16725489e-01  3.97204378e-01 -1.29840825e+00]
 [ 9.93104471e-02 -3.99517552e-01  9.11330435e-01 -3.24028993e+00]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
```
The `lidar2rgb_left` in osdar23_converter.py is:

```python
lidar2rgb_left = np.asarray(
    [
        [2.73611511e03, -3.90615474e03, 2.24407451e01, 6.00999552e02],
        [8.82958038e02, 3.26109605e02, -4.58687996e03, 9.34507256e03],
        [9.39295629e-01, 3.42440329e-01, 2.14089134e-02, -1.36006017e-01],
    ],
    dtype=np.float32,
)
```
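Note that the shapes differ: the converter's matrix is 3x4, which looks like a full projection matrix P = K @ [R|t] (intrinsics composed with the lidar-to-camera extrinsics), while the 4x4 computed above is extrinsics only. A minimal sketch of that composition, with placeholder intrinsics rather than the real OSDAR23 values:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Placeholder pinhole intrinsics -- NOT the real OSDAR23 calibration;
# substitute the camera's intrinsic matrix from the dataset.
K = np.array([
    [2400.0, 0.0, 1024.0],  # placeholder fx, skew, cx
    [0.0, 2400.0, 768.0],   # placeholder fy, cy
    [0.0, 0.0, 1.0],
])

lidar2ego = np.eye(4)  # identity pose, as in the snippet above

rgb_left2ego = np.eye(4)
rgb_left2ego[:3, 3] = [0.0529804, -0.104374, 3.50403]
rgb_left2ego[:3, :3] = R.from_quat([0.203761, 0.0530686, -0.0108971, 0.977521]).as_matrix()

lidar2cam = np.linalg.inv(rgb_left2ego) @ lidar2ego  # 4x4 extrinsics
P = K @ lidar2cam[:3, :]  # 3x4 projection matrix, same shape as the converter's
print(P)
```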
I am afraid I don't have those matrices, since I only worked with the lidar and the images. You may want to contact the OSDAR23 authors.
Thank you so much for your kind help! I will email the OSDAR23 authors for help.
I have emailed them and am waiting for a reply. By the way, can I derive matrices like the lidar2rgb_left you use in osdar23_converter.py from the calibration.txt file shipped with OSDAR23? Thanks!
This notebook could be useful.
Yes! This is exactly what I want. May I ask whether you calculated these parameters yourself based on calibration.txt, or whether they were published somewhere as part of the public OSDAR23 dataset? I haven't seen them in the public dataset. Thank you very much!
They might have added these matrices later on, but I had to calculate them while I was working on this project.
Thank you very much for taking the time to answer my questions despite your busy schedule; without your answers I might still be stuck. I visualized the feature heatmaps for 2-modal versus 4-modal fusion, and your code produces good results, while mine does not because my IR and radar images are not aligned. Thanks again for your answer!
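For reference, a minimal sketch of the kind of side-by-side heatmap comparison I mean, with random placeholder data standing in for the real feature tensors (which depend on the model):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder feature maps -- in practice these would be e.g. the
# channel-wise mean of a network's intermediate fused feature tensor.
feat_2mod = np.random.rand(128, 128)
feat_4mod = np.random.rand(128, 128)

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, feat, title in zip(axes, [feat_2mod, feat_4mod], ["2-modal fusion", "4-modal fusion"]):
    im = ax.imshow(feat, cmap="jet")
    ax.set_title(title)
fig.colorbar(im, ax=axes.ravel().tolist(), label="feature magnitude")
plt.show()
```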
Thanks for your answer. I checked later, and the problem is that these parameters are in the label JSON file, not in calibration.txt. Inspired by your code, I have managed to get the IR images aligned on my side.
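For anyone hitting the same issue, here is a minimal sketch of reading a sensor pose from the label JSON and composing lidar2ir_left from it. The key names (openlabel / coordinate_systems / pose_wrt_parent), the quaternion order, and the file name are assumptions based on an OpenLABEL-style layout, so verify them against your own label file:

```python
import json
import numpy as np
from scipy.spatial.transform import Rotation as R

def sensor2ego_from_labels(path, sensor="ir_left"):
    """Build a 4x4 sensor->ego matrix from a label JSON file.

    The key names and the [x, y, z, w] quaternion order are assumptions
    from an OpenLABEL-style layout -- check them against your own file.
    """
    with open(path) as f:
        data = json.load(f)
    pose = data["openlabel"]["coordinate_systems"][sensor]["pose_wrt_parent"]
    T = np.eye(4)
    T[:3, 3] = pose["translation"]
    T[:3, :3] = R.from_quat(pose["quaternion"]).as_matrix()
    return T

# Compose lidar -> ir_left the same way as for rgb_left above;
# the file name here is hypothetical.
lidar2ego = np.eye(4)
ir_left2ego = sensor2ego_from_labels("some_scene_labels.json", "ir_left")
lidar2ir_left = np.linalg.inv(ir_left2ego) @ lidar2ego
print(lidar2ir_left)
```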
Hi, thank you very much for the code. I have a question about the OSDAR23 dataset: I want to extend this code to four-modal fusion, so I need to combine IR images and radar images. In osdar23_converter.py there is only the transform matrix between the RGB images and the lidar; how can I get the transform matrix between the IR images and the lidar? Thanks!