Dumb6 closed this issue 2 months ago
Hi, thanks for your interest in our work. Could you provide some more details about what you did? Did you follow the steps in the README to prepare your own dataset? At which resolution are you running your lidar (i.e., 2048, 1024 or 512)?
Hi there,
Same issue here, trying to run with an OS1-32, at 1024x10 with 32 scan lines.
I followed the calibration process, resulting in a -24 image shift. I also compared the metadata files with those from the dataset you provide: no major changes except the number of lines, obviously. I had the ROI error mentioned in a closed issue, caused by the mask in the params.yaml file, so I removed it for now. Finally, I set up the mapping.launch file as mentioned in the .md and get the same error as Dumb6 when my bag starts playing.
I wonder whether 32 lines are enough, or if 64 are needed as in Dumb6's case.
Congrats again on your amazing work!
Hi @baptisteLynx, If you can provide an example rosbag and the metadata file, I can have a look at what's going on. Best, Patrick
Hi @patripfr, here is a WeTransfer link with the bag and the JSON file! Do you think the 32-line config should work normally?
I hope you'll figure it out!
Best, Baptiste
Thanks for providing the files.
I noticed that the code crashed because it wasn't expecting a negative column_shift parameter (-24 in your case). I fixed this in the newest commit.
However, I noticed that when I run the code using the parameter you proposed, the projection is incorrect. When I run the calibration using the provided bag, I get a value of 500 (which seems to give a correct projection when running the code). Did you change something in the calibration file?
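For anyone hitting the same crash: on a 360° cylindrical image a negative column shift is equivalent to a positive one modulo the image width, so the fix amounts to wrapping the shift before indexing. A minimal sketch (numpy; the function name is hypothetical, not the actual COIN-LIO code):

```python
import numpy as np

def apply_column_shift(image, column_shift):
    """Shift a cylindrical intensity image horizontally.

    A negative shift (e.g. -24) is valid on a 360-degree image:
    wrapping it modulo the image width turns it into an equivalent
    positive shift, so downstream indexing never goes out of bounds.
    """
    width = image.shape[1]
    shift = column_shift % width  # e.g. -24 % 1024 == 1000
    return np.roll(image, shift, axis=1)

img = np.arange(12, dtype=np.float32).reshape(3, 4)
shifted = apply_column_shift(img, -1)  # same as shifting by +3
```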
Additionally, I noticed that your point clouds are recorded in the os_lidar frame, different from the os_sensor frame that is typically used by the Ouster driver. This causes some issues, as my code explicitly compensates for the vertical offset of the os_sensor frame. To fix this, you can set line 219 in the metadata file to 0 (instead of 36.18). Additionally, you need to adjust the extrinsics between lidar and IMU.
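For reference, that 36.18 value is the vertical offset in the metadata's lidar-to-sensor transform. A sketch of zeroing it programmatically, assuming the legacy Ouster metadata layout where `lidar_to_sensor_transform` is a flat 16-element row-major 4x4 matrix (the z translation sits at flat index 11, i.e. row 2, column 3; the stub below is illustrative, not a real metadata file):

```python
import json

def zero_vertical_offset(meta: dict) -> dict:
    """Zero the z translation of lidar_to_sensor_transform.

    Assumes the legacy Ouster metadata layout: a flat 16-element
    row-major 4x4 matrix, with the vertical offset (36.18 mm on an
    OS1) at flat index 11.
    """
    meta["lidar_to_sensor_transform"][11] = 0.0
    return meta

# Minimal metadata stub containing only the field we touch.
stub = {"lidar_to_sensor_transform":
        [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 36.18,  0, 0, 0, 1]}
fixed = zero_vertical_offset(json.loads(json.dumps(stub)))  # deep copy
```

Remember this only makes sense when the clouds are already in the os_lidar frame, as above; otherwise leave the metadata alone.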
For the 32-beam setup I would also propose to use the following settings in the params.yaml:
image:
  patch_size: 3
  max_range: 30
  max_lifetime: 25
  min_range: 0.7
  suppression_radius: 3
  num_features: 60
  grad_min: 16.5
  ncc_threshold: 0.7075
  margin: 5
  intensity_scale: 0.25
  window: [21, 5]
  masks: []
Cheers, Patrick
It works!!
Quite right, I did the calibration with my lidar at 2048x10 and not 1024x10. I changed the format when reading the issues but forgot to recalibrate!
I'll change the pointcloud frame back to os_sensor as it was changed for other tests.
Thank you very much for your help, I'll continue my tests and keep you posted!
Hi! I found my own problem: my Ouster lidar has a resolution of 2048 and I forgot to change the parameters. But now I have another problem. When I debugged your code on its own, I found that the intensity map I generated was very dark. What could be the reason? In addition, filterBrightness did not seem to improve the brightness.
Hi,
You can increase the brightness of the intensity map by changing the intensity_scale parameter. However, this should not be necessary when using the brightness filter. I'm pretty sure that you either changed something in the code or used the wrong settings in the picture above. Attached you can see what the image should look like using the default code and settings:
Thank you very much for your quick reply! Yes, I studied your code step by step to understand it. I deleted all the other code and only did the intensity-map projection, but the result is as shown above. I had to increase the intensity_scale parameter from 0.25 to 2 to get the same result as before, but I can't find the problem.
In addition, even if I adjust intensity_scale, I can't reproduce the previous intensity image, yet I didn't see any other processing of the intensity image later in your code. I'd like to know what the problem is.
I'm not sure I understand what you are trying to achieve. The raw intensity values of the Ouster are in a range larger than what can be visualized in a regular grayscale image (more information here). Therefore, the intensity scale can be used to change the visualizable range. The brightness filter (which I think is missing in your images) is applied here.
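To make the range issue concrete, here is a minimal sketch of why a scale factor is needed at all (numpy; the function name is hypothetical and not the COIN-LIO implementation, which additionally applies the brightness filter):

```python
import numpy as np

def to_grayscale(raw_intensity, intensity_scale=0.25):
    """Map raw Ouster intensity values into a displayable [0, 255] range.

    Raw values can far exceed 255, so a plain 8-bit cast saturates
    everything bright; scaling first compresses the range into what a
    grayscale image can show. intensity_scale mirrors the params.yaml
    parameter (0.25 in the config above).
    """
    scaled = raw_intensity.astype(np.float32) * intensity_scale
    return np.clip(scaled, 0, 255).astype(np.uint8)

raw = np.array([100, 400, 2000], dtype=np.uint16)
img = to_grayscale(raw)  # -> [25, 100, 255]
```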
I totally agree with what you said, and I did use filterBrightness, but the image is really dark. I don't understand the problem. Here is all my code and the information printed in my terminal.
In your modified code the brightness filter is not effective because you are not writing the filtered image back into the input image. See: https://github.com/ethz-asl/COIN-LIO/blob/main/src/image_processing.cpp#L133
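The bug pattern is easy to reproduce: the filter returns a new image, and if the caller discards the return value the input stays dark. A sketch with a stand-in filter (the row-normalization here is illustrative only, not the actual COIN-LIO brightness filter):

```python
import numpy as np

def filter_brightness(img):
    """Stand-in brightness filter: normalize each row so dark rows
    get boosted. The real filter differs; only the calling pattern
    matters for this bug."""
    row_mean = img.mean(axis=1, keepdims=True) + 1e-6
    return np.clip(img / row_mean * 128.0, 0, 255)

img = np.full((2, 4), 10.0)  # uniformly dark image

# Bug: return value discarded, `img` stays dark.
filter_brightness(img)

# Fix: write the filtered image back into the variable you keep using.
img = filter_brightness(img)
```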
Thank you very much!! Your work is really great, and I gain new insights every day. One more question: when generating the projection image, each point of the cylindrical point cloud is projected to a pixel. How is the Jacobian of this projection derived?
Good to hear you figured it out :) I'm not sure what jacobian you are talking about specifically, but most derivations can be found in the paper. If you have further questions feel free to send me an email.
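In case it helps others reading along: a standard sketch of the column Jacobian for a cylindrical projection (sign and offset depend on the convention used; this is not taken from the paper, which also derives the row coordinate from the beam elevation):

```latex
% For a point $p = (x, y, z)$ projected to the image column
%   u(p) = \frac{W}{2\pi}\,\operatorname{atan2}(y, x),
% where $W$ is the image width, the Jacobian of $u$ w.r.t.\ $p$ is
\frac{\partial u}{\partial p}
  = \frac{W}{2\pi}\,\frac{1}{x^2 + y^2}
    \begin{bmatrix} -y & x & 0 \end{bmatrix}.
```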
Hi, thanks for your inspiring work and open-source code. I get good results when using the official dataset, but when I run on a dataset I collected myself with an OS1-128, the program crashes immediately. Do I need to modify anything to use my own dataset recorded with the same device?