lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0
247 stars 27 forks

Too many measurements at collect data step #498

Closed sensorCalib closed 1 year ago

sensorCalib commented 2 years ago

Hi,

At step 4, when trying to collect data, we seem to get a dimension mismatch. This is the error occurring:

[ERROR] [1656064461.734798, 1656052507.736720]: bad callback: <bound method InteractiveDataLabeler.sensorDataReceivedCallback of <atom_calibration.collect.interactive_data_labeler.InteractiveDataLabeler object at 0x7f3929d15d60>>
Traceback (most recent call last):
  File "/opt/ros/noetic/lib/python3/dist-packages/rospy/topics.py", line 750, in _invoke_callback
    cb(msg)
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 295, in sensorDataReceivedCallback
    self.labelData()  # label the data
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 464, in labelData
    self.labels, seed_point, inliers = labelPointCloud2Msg(self.msg, x_marker, y_marker, z_marker,
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/label_messages.py", line 45, in labelPointCloud2Msg
    points[:, 0] = pc['x'].flatten()  # flatten because some pcs are of shape (npoints,1) rather than (npoints,)
ValueError: could not broadcast input array from shape (65536) into shape (64)

We have 64 lines with 1024 measurement values each, which results in the 65536 measurements. However, only one measurement per coordinate is expected.
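For illustration, here is a minimal numpy sketch (hypothetical, not the actual ATOM code) that reproduces the same broadcast error with our shapes:

import numpy as np

pc_x = np.zeros((64, 1024))    # organized ouster cloud: 64 rows x 1024 columns
points = np.zeros((64, 3))     # buffer sized from the first dimension only
points[:, 0] = pc_x.flatten()  # ValueError: could not broadcast input array from shape (65536) into shape (64)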

Any idea how to fix this? Thank you very much.

miguelriemoliveira commented 2 years ago

Hello,

Before we help, could you please provide more information? What robotic system are you calibrating? How many sensors? Of which modality?

It seems you have just created this account on GitHub. Also, you do not identify yourself. If you could provide some more information, that would be very helpful.

sensorCalib commented 2 years ago

Hello,

thanks for the feedback, I'll gladly provide you with more information.

Robotic system: ROS Noetic, working with the noetic-devel branch
How many sensors: two (ouster os1, zed 2i); the error happens with the ouster
Which modality: the ouster os1 has modality lidar3d, the zed 2i has modality rgb

I am a commercial user and working on a research project about environment detection.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

I have never worked with this sensor before, but it should be very similar to velodyne.

To investigate I would need your calibration.yml file (or a link to your calibration repository) and also a bag file with the sensor data. It can be just a small 1 min bag or something.

Could you provide this please?

sensorCalib commented 2 years ago

Hello @miguelriemoliveira ,

yes, of course I can share these files with you. The bag files become quite huge quickly, so I only recorded several seconds. Hopefully, this is sufficient.

https://www.dropbox.com/scl/fo/o40e4s4vu32bpn99p1j2e/h?dl=0&rlkey=0ugua76pvhx1jsnxwbvanyz6p

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

I have downloaded the bag and yml. Thanks.

I created a calibration package of my own since you did not share yours, but now to configure the calibration package I need this xacro file:

"/home/user/ws/src/base/description/urdf/system.urdf.xacro"

Can you provide it?

Thanks

miguelriemoliveira commented 2 years ago

I have compared the ouster lidar ros messages that you have

header: 
  seq: 1284
  stamp: 
    secs: 1656052510
    nsecs: 181796381
  frame_id: "ouster_os1_0_base_link"
height: 64
width: 1024
fields: "<array type: sensor_msgs/PointField, length: 9>"
is_bigendian: False
point_step: 48
row_step: 49152
data: "<array type: uint8, length: 3145728>"
is_dense: True

with the ones we use from velodynes:

header: 
  seq: 6319
  stamp: 
    secs: 1649240013
    nsecs: 829686165
  frame_id: "lidar_1"
height: 1
width: 26808
fields: "<array type: sensor_msgs/PointField, length: 6>"
is_bigendian: False
point_step: 22
row_step: 589776
data: "<array type: uint8, length: 589776>"
is_dense: True

I think the problem is, as you discussed above, that the ouster point cloud is not flattened (i.e. height=1) like the velodyne point cloud is.
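For reference, a minimal sketch of what I mean (my assumption of the shapes involved, not the final fix), reading both layouts into an (npoints, 3) array:

import numpy as np
import ros_numpy  # converts a sensor_msgs/PointCloud2 into a numpy structured array

def toXYZ(msg):
    # toXYZ is a hypothetical helper, only to illustrate the idea
    pc = ros_numpy.numpify(msg)       # (64, 1024) for the ouster, (npoints,) for the velodyne
    npoints = pc['x'].size            # total number of points, whatever the layout
    points = np.zeros((npoints, 3))
    points[:, 0] = pc['x'].flatten()  # flatten handles both organized and flat clouds
    points[:, 1] = pc['y'].flatten()
    points[:, 2] = pc['z'].flatten()
    return points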

Let me investigate further.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

I changed the label messages code to be able to handle both velodynes and the ouster lidars.

In the standalone test I did it works:

https://github.com/lardemua/atom/blob/noetic-devel/atom_calibration/scripts/tests/test_ouster_lidar

Please test and let me know if it runs.

I suspect we will have additional problems with the ouster lidar in some other places, but without the xacro file I cannot test the complete pipeline.

sensorCalib commented 2 years ago

Hi @miguelriemoliveira ,

thank you very much for your help so far. I've retried the collect data step with the fix you added and I am getting past that bug mentioned above. As you suspected, there is an error further down the line:

[ERROR] [1656944209.650737, 1656052507.832743]: bad callback: <bound method InteractiveDataLabeler.sensorDataReceivedCallback of <atom_calibration.collect.interactive_data_labeler.InteractiveDataLabeler object at 0x7f7fb9fcfdf0>>
Traceback (most recent call last):
  File "/opt/ros/noetic/lib/python3/dist-packages/rospy/topics.py", line 750, in _invoke_callback
    cb(msg)
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 295, in sensorDataReceivedCallback
    self.labelData()  # label the data
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 464, in labelData
    self.labels, seed_point, inliers = labelPointCloud2Msg(self.msg, x_marker, y_marker, z_marker,
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/label_messages.py", line 156, in labelPointCloud2Msg
    x, y, z = pc['x'][idx], pc['y'][idx], pc['z'][idx]
IndexError: index 64 is out of bounds for axis 0 with size 64

I have included an urdf file with the data from my xacro files, hope this helps: https://www.dropbox.com/scl/fo/o40e4s4vu32bpn99p1j2e/h?dl=0&rlkey=0ugua76pvhx1jsnxwbvanyz6p

sensorCalib commented 2 years ago

To get around that crash in line 156, I flattened the pc variable. So instead of

x, y, z = pc['x'][idx], pc['y'][idx], pc['z'][idx]

I am using

x, y, z = pc['x'].flatten()[idx], pc['y'].flatten()[idx], pc['z'].flatten()[idx]

now. It's running without a crash for a while, but the output dir defined in the command-line parameter is only an empty dir. Eventually, the data collection will fail again:

[ERROR] [1657628706.062701, 1656052506.920389]: bad callback: <bound method InteractiveDataLabeler.sensorDataReceivedCallback of <atom_calibration.collect.interactive_data_labeler.InteractiveDataLabeler object at 0x7fcd9e6ef6a0>>
Traceback (most recent call last):
  File "/opt/ros/noetic/lib/python3/dist-packages/rospy/topics.py", line 750, in _invoke_callback
    cb(msg)
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 295, in sensorDataReceivedCallback
    self.labelData()  # label the data
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/interactive_data_labeler.py", line 464, in labelData
    self.labels, seed_point, inliers = labelPointCloud2Msg(self.msg, x_marker, y_marker, z_marker,
  File "/home/user/ws/src/atom-calibration/atom_calibration/src/atom_calibration/collect/label_messages.py", line 133, in labelPointCloud2Msg
    distances = abs((A * pts[:, 0] + B * pts[:, 1] + C * pts[:, 2] + D)) / \
UnboundLocalError: local variable 'A' referenced before assignment

A will be used for RANSAC iterations, but it will only be initialized if "the number of inliers is larger than the previous max".
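A self-contained sketch of that failure mode and one way to guard against it (an illustration only, not ATOM's actual labelPointCloud2Msg code):

import numpy as np

def ransac_plane(pts, n_iters=50, threshold=0.02):
    best_inliers = -1
    A = B = C = D = 0.0  # initializing here keeps the plane defined even if no iteration improves
    for _ in range(n_iters):
        p1, p2, p3 = pts[np.random.choice(len(pts), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        if np.linalg.norm(normal) < 1e-9:
            continue  # degenerate sample: the three points are collinear
        a, b, c = normal / np.linalg.norm(normal)
        d = -np.dot((a, b, c), p1)
        n_inliers = int((np.abs(pts @ (a, b, c) + d) < threshold).sum())
        if n_inliers > best_inliers:  # A, B, C, D are only assigned on improvement
            best_inliers, (A, B, C, D) = n_inliers, (a, b, c, d)
    return A, B, C, D

Without the initialization line, a run where the condition never fires raises exactly that UnboundLocalError.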

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

thank you for the urdf. However, the urdf calls for several stl files that you did not share.

Let me say that this is not my preferred way of working. You clearly created a GitHub account just to create this issue, apparently with the intent of hiding your identity or that of your company, and now you are providing information drop by drop, which makes progress very difficult.

Why don't you share the repository? If it's private, create temporary access for me, since you are asking for my help.

As you've seen before, I am willing to help, although I am taking a bit of time to respond. But if you want my help to make ATOM work in your system then we must have mutual trust.

sensorCalib commented 2 years ago

Hi @miguelriemoliveira , thank you very much again for your help, and please excuse that way of communicating. I did indeed try not to reveal more information than necessary, but you are absolutely right: if I am asking for your help, there should be some mutual trust. I have talked to my superior about this and got permission to share more information.

So, about my identity and the project I am working on: I am a developer from KRONE, an agricultural company from Germany. As a part of the Agri-Gaia project, we are working on several projects in the AI field, environmental detection being one of them: https://www.agri-gaia.de/

Please feel free to ask for more information if you like.

About the repository: our repositories can only be accessed from our company's network, so granting you access isn't easily done. Would it be alright to provide you with our complete catkin-ws instead? This should contain all necessary files.

https://www.dropbox.com/s/bnqf0ldd2fzloxs/agri-gaia-catkin-ws.tar.gz?dl=0

./src/atom-result/calib_test1 contains the attempted calibration, including the calib file
./data/rosbags contains the bag file with the pattern you already saw
./data/data-sets/calib_test1_out contains the (empty) output from the collect data step

Again, thank you very much for your patience.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

ok, thanks for the response. I will try to figure out what's going wrong and get back to you.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

so I was working on this today and was able to advance a bit.

I created a new private repo to which I invited you.

https://github.com/miguelriemoliveira/agri-gaia-test

Then fixed a few issues while trying to calibrate. The crash you were reporting was fixed and does not occur anymore.

However, to test, I am now going to need a bigger bag file (30 secs at least), because only 4 secs is too little.

Some tips on doing this bagfile:

image

  1. You placed the chessboard on the ground. This way it will not be possible to detect the pattern in the lidar data. The pattern must be away from the ground, e.g. try to put it on top of a bench or a tripod.
  2. Can you provide a dae file of the chessboard? Or at least its physical dimensions (width, height) so I can make a digital model of the chessboard? Right now we are using a charuco model while actually using a chessboard, which does not make sense.

https://github.com/miguelriemoliveira/agri-gaia-test/blob/5d16d57106456d22a2fd656fdbfea53a3c77fff8/agri-gaia_calibration/calibration/config.yml#L126

Thanks, Miguel

sensorCalib commented 2 years ago

Hi @miguelriemoliveira

thank you very much for fixing the crashes. (And also for fixing the problem with the config file being None. I had noticed this bug occasionally, but I thought it was a mistake in my setup or something like that.)

  1. A longer bag file has been uploaded here: https://drive.google.com/drive/folders/1cufWx50qeXVnndhDZpJkdV8986wDW84x?usp=sharing The pattern is standing slightly above the ground now. However, since the sensor is now mounted on top of the machine, the total distance might have been too long. Also, the pattern is just lying there; would it be favourable to move the pattern around during the recording? Or is that recording alright?

  2. I have pushed a dae file of the chessboard to the test repo, is this sufficient? In case it isn't, the measurements are: Length: 8x10cm (squares) + 2x5cm (border) = 90cm (total length); Width: 11x10cm (squares) + 2x5cm (border) = 120cm (total width)

I have retried the collect_data step with the pushed pattern and the longer recording: there are no crashes so far, but the process won't end and no ATOM dataset will be created.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

thanks. I will try to pick this up on Friday and get back to you.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

The pattern is standing slightly above the ground now. However, since the sensor is now mounted on top of the machine, the total distance might have been too long.

Right, this makes it impossible to segment the pattern automatically. But I will try to use the manual segmentation and see if it works.

image

Also, the pattern is just lying there; would it be favourable to move the pattern around during the recording?

Yes, you have to move it around to get several collections. Not sure if you have done so, but read through the documentation and check the ATOM YouTube playlist for examples.

https://lardemua.github.io/atom_documentation/

https://www.youtube.com/watch?v=BYs1-H9vh0s&list=PLQN09mzV5mbI4h5IQt3Eu9kugSk-08mnY

If you still have these questions after having read the documentation, then I should try to improve it.

Or is that recording alright?

No, but don't despair ... I am always saying to my students that no bagfile is ok until you've recorded 10 or 15 others before :-) Right now I think I can use it just to see if the whole pipeline is working. Later you can try to calibrate with a better bagfile.

miguelriemoliveira commented 2 years ago

So I traced the problem (one of them) down to the fact that this OS lidar, unlike the velodyne we use, outputs dense point clouds.

I will try to adjust the data labeller to tackle these as well.

I checked, and about 50% of the points in the ouster cloud are invalid:

(65536, 3) all points
(38079, 3) only valid points

It's probably because it is pointing to the sky? In any case, it was confusing the labeller. I will try to fix it.
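A minimal sketch of the filtering I have in mind (assuming invalid returns come out as NaN coordinates, as organized clouds typically do):

import numpy as np

points = np.random.rand(65536, 3)
points[::2] = np.nan                     # simulate ~50% invalid returns
valid = np.isfinite(points).all(axis=1)  # valid only if x, y and z are all finite
print(points.shape)                      # (65536, 3) all points
print(points[valid].shape)               # (32768, 3) only valid points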

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

There was a lot of work to do as you can see above, but now I think the collector is working fine also for ouster lidars. You can see in #513 that the lidar auto detection is now working.

As we discussed, in your bag file the pattern is too close to the ground, so the detection will not be accurate.

Then I tried to manually edit the lidar labels using the dataset playback functionality

https://lardemua.github.io/atom_documentation/procedures/#dataset-playback

Right now I have another bug, because this script was also not prepared to tackle dense point clouds (#520). I will work on this.

miguelriemoliveira commented 2 years ago

So I fixed #520, so now you can use dataset playback to manually label the chessboard.

You can try it out if you want. Don't forget to reconfigure your calibration package.

You should use the GitHub repository I set up for agri-gaia:

https://github.com/miguelriemoliveira/agri-gaia-test

If you give me a better bagfile I can try to calibrate.

Some important tips for the bagfile:

  1. The bag should contain the pattern away from the ground; we use a tripod for this.

  2. Also, the pattern is too small / far away from the robot. It is not possible to detect the chess corners in the camera this way. You have to put the pattern closer or get a bigger one. For large robots like cars we normally use a 800x600 mm pattern like this:

    https://calib.io/collections/products/products/charuco-targets?variant=9400455004207

You can try the test detector script at

https://github.com/lardemua/atom/blob/noetic-devel/atom_calibration/scripts/deprecated/view_pattern.py

BTW, did you put the correct number of corners in the calibration.yaml?

  3. Finally, you should move the pattern to be able to save collections with the pattern in different places.

sensorCalib commented 2 years ago

Hi @miguelriemoliveira , thank you for these fixes.

It's probably because it is pointing to the sky? In any case, it was confusing the labeller. I will try to fix it.

The ouster os1 has a horizontal field of view of 360°, so yes, it is also pointing towards the sky. https://data.ouster.io/downloads/datasheets/datasheet-rev06-v2p3-os1.pdf I guess that explains why so many points were invalid(?)

2. Also, the pattern is too small / far away from the robot. It is not possible to detect the chess corners in the camera this way.
   You have to put the pattern closer or get a bigger one. For large robots like cars we normally use a 800x600 mm pattern like this:

https://calib.io/collections/products/products/charuco-targets?variant=9400455004207

Our pattern is 1200x900 mm and quite big already, but obviously too far away. I will try to provide a recording in which the board is closer to the sensor.

BTW, did you put the correct number of corners in the calibration.yaml?

I might have made a mistake there: https://github.com/miguelriemoliveira/agri-gaia-test/blob/main/agri_gaia_calibration/calibration/config.yml#L140 The chessboard consists of 11x8 squares. This translates into 12 corners in x-direction and 9 corners in y-direction, right? (Instead of 11 and 8 as I previously wrote.)

You can try the test detector script at

https://github.com/lardemua/atom/blob/noetic-devel/atom_calibration/scripts/deprecated/view_pattern.py

I will give it a try, thank you.

As for #514: apparently, I have written the wrong frame name for the sensor data. I will have a look at this.

As for the calibration you set up on GitHub for agri-gaia: Could you explain why you changed the child and parent link names for the zed camera? https://github.com/miguelriemoliveira/agri-gaia-test/blob/main/agri_gaia_calibration/calibration/config.yml#L82 https://github.com/miguelriemoliveira/agri-gaia-test/blob/main/agri_gaia_calibration/calibration/config.yml#L83

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

Could you explain why you changed the child and parent link names for the zed camera?

As you had it before, ATOM should also be able to calibrate.

In any case, I changed it just to follow the ROS convention. Note that ROS uses an x forward, y left, z up convention for coordinate frames. The exception is the frames of camera sensors, which typically have a reference frame as per the ROS convention, and then an optical frame that follows camera conventions: z forward, x right, y down.

So the transformation between the camera frame and the camera optical frame is just a matter of rotating to have:

the new z along the previous x
the new x as the previous -y
the new y as the previous -z

This is a standard transformation, and it does not make sense to change it. That's why I prefer to configure ATOM to change the upper level transformation instead.

image

Think of it as telling ATOM to calibrate the left_camera_frame w.r.t. the camera_center, instead of telling ATOM to calibrate the left_camera_optical_frame w.r.t. the left_camera_frame, which would mess up a standard transformation.
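For reference, a quick numpy check of that standard rotation, assuming the URDF fixed-axis convention R = Rz(yaw) @ Ry(pitch) @ Rx(roll) and the usual camera-to-optical rpy of (-pi/2, 0, -pi/2):

import numpy as np

def rot_rpy(roll, pitch, yaw):
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

R = np.round(rot_rpy(-np.pi / 2, 0, -np.pi / 2))
print(R[:, 2])  # optical z expressed in the camera frame: [1. 0. 0.] = previous x
print(R[:, 0])  # optical x: [0. -1. 0.] = previous -y
print(R[:, 1])  # optical y: [0. 0. -1.] = previous -z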

sensorCalib commented 2 years ago

Hi @miguelriemoliveira thanks for that explanation.

You can try the test detector script at

https://github.com/lardemua/atom/blob/noetic-devel/atom_calibration/scripts/deprecated/view_pattern.py

Could you explain how to use this tool? This script accepts the following arguments:

[-t {charuco,chessboard}] [-d DICT] -x NUM_X -y NUM_Y -L LENGTH [-l INNER_LENGTH] topic

In my case, I would call:

-t chessboard
-x 12 (since there are 12 edges in the x direction)
-y 9 (since there are 9 edges in the y direction)
-L 100 (since each edge has a length of 100mm?)
-topic (don't know which topic to use)

As for https://github.com/lardemua/atom/issues/514 ("Apparently, I have written the wrong frame name for the sensor data. I will have a look at this."): I have identified the problem, but it is not fixed yet.

Some important tips for the bagfile:

The bag should contain the pattern away from the ground; we use a tripod for this.

Also, the pattern is too small / far away from the robot. It is not possible to detect the chess corners in the camera this way. You have to put the pattern closer or get a bigger one.

A new bagfile, in which the pattern is away from the ground, closer to the sensor, and moving, has been made. But as it seems, the data collect step still won't collect data, neither for the zed nor for the ouster. As mentioned in the description https://lardemua.github.io/atom_documentation/procedures/#rgb-camera-labeling ("You can check if the detection is working by observing the overlays on top of the images."), the rgb camera labeling has automatic pattern detection, so I expected some blinking on the squares of the pattern from the zed camera or something like that, but I didn't notice any of that.

I then had a look again at the getting started guide and tried to set up everything again in a clean workspace. So after cloning the atom repo, I also tried to install the requirements.txt (something I didn't do before). This failed since ros_numpy apparently doesn't exist:

ERROR: Could not find a version that satisfies the requirement ros_numpy (from -r requirements.txt (line 11)) (from versions: none)
ERROR: No matching distribution found for ros_numpy (from -r requirements.txt (line 11))

So, I did not install ros_numpy and went on, but somehow this was enough to mess up my setup, and now the data collect step crashes again:

calib_crash.txt

I am still trying to figure out what went wrong. Could you try to calibrate again with the new bag file in the meantime? https://kronegroup.sharepoint.com:443/:f:/s/KRONECloud/Eu4ooUr6AtFDlK6X-V65sQEBChlCXgY9Qk9HZTHIgYP-6A?e=5%3aWyo1lC&at=9

Thank you very much.

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

Could you explain how to use this tool? This script accepts the following arguments:

[-t {charuco,chessboard}] [-d DICT] -x NUM_X -y NUM_Y -L LENGTH [-l INNER_LENGTH] topic

In my case, I would call:

-t chessboard
-x 12 (since there are 12 edges in the x direction)
-y 9 (since there are 9 edges in the y direction)
-L 100 (since each edge has a length of 100mm?)
-topic (don't know which topic to use)

About the question you asked before, which I forgot to answer (how many corners should we say, 11 or 12?): I am never completely sure about this one. It says in the config.yaml file:

# The number of corners the pattern has in the X and Y dimensions.
  # Note: The charuco detector uses the number of squares per dimension in its detector.
  # Internally we add a +1 to Y and X dimensions to account for that.
  # Therefore, the number of corners should be used even for the charuco pattern.
  dimension: { "x": 9, "y": 6 }

So I would try first the number of corners (not the number of squares) in each direction.
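As a concrete check (simple arithmetic on my part, assuming your board really has 11x8 squares): the inner corner count is the square count minus one per dimension, which would give:

squares_x, squares_y = 11, 8                         # squares per dimension
corners_x, corners_y = squares_x - 1, squares_y - 1  # inner corners
print(corners_x, corners_y)                          # 10 7, i.e. dimension: { "x": 10, "y": 7 }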

-L 100 (since each edge has a length of 100mm ?)

Yes, the size of each square.

-topic (don't know which topic to use)

It is the topic in which your camera is being published. You can run a

rostopic list

to check all topics in the system. Also, you have defined it in the config.yaml:

https://github.com/miguelriemoliveira/agri-gaia-test/blob/8793e2d5a48b0c15e53b6a62cd3af2d408487d3d/agri_gaia_calibration/calibration/config.yml#L84

miguelriemoliveira commented 2 years ago

I then had a look again at the getting started guide and tried to set up everything again in a clean workspace. So after cloning the atom repo, I also tried to install the requirements.txt (something I didn't do before). This failed since ros_numpy apparently doesn't exist:

ERROR: Could not find a version that satisfies the requirement ros_numpy (from -r requirements.txt (line 11)) (from versions: none)
ERROR: No matching distribution found for ros_numpy (from -r requirements.txt (line 11))

So, I did not install ros_numpy and went on, but somehow this was enough to mess up my setup, and now the data collect step crashes again:

calib_crash.txt

Ok, please open a specific issue for this one and I will try to take a look.
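As a side note (an assumption on my part, untested here): ros_numpy is normally distributed as a ROS package rather than through PyPI, so on Noetic it should come from apt instead of pip:

sudo apt install ros-noetic-ros-numpy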

You can download and test https://github.com/miguelriemoliveira/agri-gaia-test. I did it so we can work together on the same repo.

All you have to do is configure the calibration package for your machine, and it should function the same way it does on my side.

miguelriemoliveira commented 2 years ago

A new bagfile, in which the pattern is away from the ground, closer to the sensor, and moving, has been made. But as it seems, the data collect step still won't collect data, neither for the zed nor for the ouster.

Data collection is a manually triggered step. You have to press the green sphere, which is somewhere in your 3D space in rviz, and then select "save collection" to save a collection.

As mentioned in the description https://lardemua.github.io/atom_documentation/procedures/#rgb-camera-labeling ("You can check if the detection is working by observing the overlays on top of the images."), the rgb camera labeling has automatic pattern detection, so I expected some blinking on the squares of the pattern from the zed camera or something like that, but I didn't notice any of that.

For this let's first try to run the view_frames script to make sure the configuration of the pattern is ok. I will try it and get back to you.

miguelriemoliveira commented 2 years ago

I am still trying to figure out what went wrong. Could you try to calibrate again with the new bag file in the meantime? https://kronegroup.sharepoint.com:443/:f:/s/KRONECloud/Eu4ooUr6AtFDlK6X-V65sQEBChlCXgY9Qk9HZTHIgYP-6A?e=5%3aWyo1lC&at=9

Sure, I will try it tomorrow and get back to you.

And thank you for being so patient with ATOM. It's really nice to have an outsider finding all sorts of bugs :-)

Perhaps I can get this to calibrate on your system, and we can arrange a Zoom meeting so I can show you how it works.

miguelriemoliveira commented 2 years ago

oops, I can't download the new bagfile. Not sure why, I just get this message:

image

I went to the folder from a link I received from a Roman Weisgerber, and there I found the bag

recording_2022-07-26-14-53-46_0.bag

is this the one?

sensorCalib commented 2 years ago

Hi @sensorCalib ,

[...]

About the question you asked before, which I forgot to answer (how many corners should we say, 11 or 12?): I am never completely sure about this one. It says in the config.yaml file:

# The number of corners the pattern has in the X and Y dimensions.
  # Note: The charuco detector uses the number of squares per dimension in its detector.
  # Internally we add a +1 to Y and X dimensions to account for that.
  # Therefore, the number of corners should be used even for the charuco pattern.
  dimension: { "x": 9, "y": 6 }

So I would try first the number of corners (not the number of squares) in each direction.

-L 100 (since each edge has a length of 100mm ?)

Yes, the size of each square.

-topic (don't know which topic to use)

It is the topic in which your camera is being published. You can run a

rostopic list

to check all topics in the system. Also, you have defined it in the config.yaml:

https://github.com/miguelriemoliveira/agri-gaia-test/blob/8793e2d5a48b0c15e53b6a62cd3af2d408487d3d/agri_gaia_calibration/calibration/config.yml#L84

Thanks. What should happen if the configuration of the pattern is correct? And what if it isn't? I called the script but didn't see any success message or anything like that. As it seems, the configuration is incorrect(?)

view_pattern

The screenshot shows the output for "-x 12 -y 9"; however, it is the same for "-x 11 -y 8". (And it is also the same for "-x 9 -y 12" and "-x 8 -y 11", in case the axes were swapped somehow.)

miguelriemoliveira commented 2 years ago

It should draw something on the image, I think ... but I will try it later today or tomorrow and get back to you ...

sensorCalib commented 2 years ago

oops, I can't download the new bagfile. Not sure why, I just get this message:

image

I went to the folder from a link I received from a Roman Weisgerber, and there I found the bag

recording_2022-07-26-14-53-46_0.bag

is this the one?

Yes, that is the latest bag file.

I am still trying to figure out what went wrong. Could you try to calibrate again with the new bag file in the meantime? https://kronegroup.sharepoint.com:443/:f:/s/KRONECloud/Eu4ooUr6AtFDlK6X-V65sQEBChlCXgY9Qk9HZTHIgYP-6A?e=5%3aWyo1lC&at=9

Sure, I will try it tomorrow and get back to you.

Thank you for having a look at it.

And thank you for being so patient with ATOM. It's really nice to have an outsider finding all sorts of bugs :-)

Perhaps I can get this to calibrate on your system, and we can arrange a Zoom meeting so I can show you how it works.

No problem. Although, I cannot deny that I would have preferred it to work immediately without having to find these bugs. :-)

miguelriemoliveira commented 2 years ago

Although, I cannot deny that I would have preferred it to work immediately without having to find these bugs. :-)

Me too :-)

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

I just tried the view pattern script and the result was this:

image

The figure on the right is created by the script and shows the circles on top of the corners, signaling that the chessboard is being detected.

I ran the script like this:

rosrun atom_calibration view_pattern.py -t chessboard -x 10 -y 7 -L 0.1 /zed2i_0/rgb_raw/image_raw_color

So it turns out you only need to count the number of corners you see in the horizontal (10) and the vertical (7).

I will see if I can collect data now ...

miguelriemoliveira commented 2 years ago

I just successfully collected a dataset

https://youtu.be/7EEnSdLBdyk

The dataset is here:

test1.zip

Tomorrow I can try the calibration to see how it goes.

Also, I will try to help you with your bugs.

miguelriemoliveira commented 2 years ago

Just pushed to agri_gaia_test.

After a git pull you should check the calibration.yaml, uncomment your paths and comment out mine, and reconfigure your calibration package.

miguelriemoliveira commented 2 years ago

I just tested the dataset playback and corrected the lidar automatic labels for two collections.

https://youtu.be/Xk56xqCWAj0

Here's the dataset also with the dataset_corrected.json

[Uploading test1.zip…]()

sensorCalib commented 2 years ago

I ran the script like this:

rosrun atom_calibration view_pattern.py -t chessboard -x 10 -y 7 -L 0.1 /zed2i_0/rgb_raw/image_raw_color

So it turns out you only need to count the number of corners you see in the horizontal (10) and the vertical (7).

I will see if I can collect data now ...

This worked for me too, thank you. Ok, I understood which parameters to choose for x and y. (Also, the parameter for -L has to be passed in meters, not in millimeters as I previously did.)

I just successfully collected a dataset

https://youtu.be/7EEnSdLBdyk

The dataset is here:

test1.zip

Tomorrow I can try the calibration to see how it goes.

Also, I will try to help you with your bugs.

Great to see that you were able to collect data. (It's still crashing for me, as mentioned in https://github.com/miguelriemoliveira/agri-gaia-test/issues/1 )

miguelriemoliveira commented 2 years ago

https://github.com/miguelriemoliveira/agri-gaia-test/issues/1

Should be solved now. I will try to run a calibration ...

miguelriemoliveira commented 2 years ago

To calibrate I need all collections correctly annotated, so I went back to the dataset playback and manually corrected the annotations. We now have:

Collection 000 image

Collection 001 image

Collection 002 image

Collection 003 image

Collection 004 image

Here's the dataset, with the last dataset_corrected.json file:

test1.zip

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

so here's the first calibration of agri-gaia.

image

Results were the following:

Errors per collection (anchored sensor,  max error per sensor, not detected as "---")
+------------+------------+---------+
| Collection | ouster_os1 | zed2i_0 |
+------------+------------+---------+
|    000     |   0.0215   |  0.4476 |
|    001     |   0.0247   | 19.2592 |
|    002     |   0.0418   | 17.3578 |
|    003     |   0.0562   | 28.4024 |
|    004     |   0.0396   | 85.9234 |
|  Averages  |   0.0368   | 30.2781 |
+------------+------------+---------+

miguelriemoliveira commented 2 years ago

The visualization is not working well. I will check the pattern. I think it's because of the position of the origin ... Created #526

The pixel errors of the zed are too high. I will investigate.

sensorCalib commented 2 years ago

Hi @miguelriemoliveira with the crash in https://github.com/miguelriemoliveira/agri-gaia-test/issues/1 being fixed, I was able to collect data as well. Thank you very much once again.

I then tried to calibrate as you did above, but ran into another error. This might have happened because I am not working directly on the machine and am instead using remote desktop software (x2go), but I haven't figured it out yet. Or do you think this is something related to ATOM?

rosrun atom_calibration calibrate -json /home/martelv/catkin_ws/data/ag_out/dataset.json -v -rv -si
Traceback (most recent call last):
  File "/home/martelv/catkin_ws/devel/lib/atom_calibration/calibrate", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/martelv/catkin_ws/src/atom/atom_calibration/scripts/calibrate", line 23, in <module>
    from atom_calibration.calibration.objective_function import errorReport, objectiveFunction
  File "/home/martelv/catkin_ws/src/atom/atom_calibration/src/atom_calibration/calibration/objective_function.py", line 27, in <module>
    from atom_core.vision import projectToCamera
  File "/home/martelv/catkin_ws/src/atom/atom_core/src/atom_core/vision.py", line 14, in <module>
    import open3d as o3d
  File "/home/martelv/.local/lib/python3.8/site-packages/open3d/__init__.py", line 56, in <module>
    _CDLL(str(next((_Path(__file__).parent / 'cpu').glob('pybind*'))))
  File "/usr/lib/python3.8/ctypes/__init__.py", line 373, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /home/martelv/.local/lib/python3.8/site-packages/open3d/cpu/pybind.cpython-38-x86_64-linux-gnu.so: undefined symbol: glGetQueryObjectui64v

miguelriemoliveira commented 2 years ago

I never had this before. What is your open3d version? Mine is 0.15.2 ...

image

miguelriemoliveira commented 2 years ago

Hi again @sensorCalib ,

Here they say they had to compile from source ... What machine do you have? A standard Ubuntu?

The remote desktop may have problems; I remember that using AnyViewer sometimes crashed with rviz. You could just use ssh -X to the machine, but the visualization is slower ...

sensorCalib commented 2 years ago

Hi @miguelriemoliveira ,

"pip show open3d" told me, that open3d was installed. However, calling python3 and trying to "import open3d" as you did, resulted in an error.

So, I tried to install open3d from source as you suggested, following this guide http://www.open3d.org/docs/release/compilation.html and creating the python package with make python-package. But I still got the same error when trying to "import open3d".

I then found out that there is a difference between "pip" and "python3 -m pip" (https://stackoverflow.com/questions/25749621/whats-the-difference-between-pip-install-and-python-m-pip-install), so I retried installing the python package using "python3 -m pip". This was successful, and "import open3d" was not crashing anymore:

$ python3
Python 3.8.10 (default, Jun 22 2022, 20:18:18)
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import open3d
>>> open3d.__version__
'0.15.2+b40282c9'

After that, I got another crash:

[ERROR] [1660134840.981366787]: PluginlibFactory: The plugin for class 'rviz_visual_tools/KeyTool' failed to load. Error: According to the loaded plugin descriptions the class rviz_visual_tools/KeyTool with base class type rviz::Tool does not exist. Declared types are rviz/FocusCamera rviz/Interact rviz/Measure rviz/MoveCamera rviz/PublishPoint rviz/Select rviz/SetGoal rviz/SetInitialPose rviz_plugin_selected_points_publisher/SelectedPointsPublisher rviz_plugin_tutorials/PlantFlag

Since I wanted to install open3d from source, I had to update cmake, which in turn probably messed up my ROS setup. Luckily, I was able to resolve that error by installing rviz_visual_tools again:

sudo apt-get install ros-noetic-rviz-visual-tools

No further crashes since then :)

With that out of the way, I retried the calibration process. So, I collected data again and went to the "Dataset playback" step, just as you did in the videos:

roslaunch agri_gaia_calibration dataset_playback.launch
clear && rosrun atom_calibration dataset_playback -json data/ag_out3/dataset.json -ow

However, I don't receive any images from rviz:

20220811_dataset_playback_rviz

Do you have an idea why no frames were received?

dataset_playback is reacting to button presses and created a dataset_corrected.json, so it seems only the visualisation is buggy for me:

20220811_dataset_playback_terminal

miguelriemoliveira commented 2 years ago

Hi,

Glad to hear you advanced.

To visualize in rviz you should activate the visualization flags. Try with

-v -rv -si

https://lardemua.github.io/atom_documentation/procedures/#calibrate

sensorCalib commented 2 years ago

Hi,

I am currently at the "Dataset playback" step to correct the labels. The flags you mentioned aren't available here(?).

But since that step is optional, I can skip it and go directly to the "Calibrate sensors" step. So, now I am calling:

roslaunch agri_gaia_calibration calibrate.launch
rosrun atom_calibration calibrate -json /home/martelv/catkin_ws/data/ag_out3/dataset.json -v -rv -si

But there still is no visualisation:

20220811_calibrate_rviz 20220811_calibrate_terminal

miguelriemoliveira commented 2 years ago

Hi @sensorCalib ,

sorry, that's what happens when you try to answer during lunch ...

Let's try again:

Do you have an idea why no frames were received? dataset_playback is reacting to button presses and created a dataset_corrected.json, so it seems only the visualisation is buggy for me:

Not sure why this happens. Can you confirm the messages are being sent? Just do a

rostopic echo /zed2i_0/rgb_raw/image_raw_color/labeled2d

Do you see any data being sent?

To me it works fine with my dataset. Can you try with this dataset to make sure it's not the dataset?

test1.zip

Also, can you share the details of your dataset, i.e.

rosrun atom_calibration inspect_atom_dataset -json <your dataset.json>

About the problem with calibrate, it sounds like the same one, so let's first figure this one out.

sensorCalib commented 2 years ago

Hi,

sorry, that's what happens when you try to answer during lunch ...

no problem, thanks for helping even during lunch break.

Let's try again:

Do you have an idea why no frames were received? dataset_playback is reacting to button presses and created a dataset_corrected.json, so it seems only the visualisation is buggy for me:

Not sure why this happens. Can you confirm the messages are being sent? Just do a

rostopic echo /zed2i_0/rgb_raw/image_raw_color/labeled2d

Do you see any data being sent?

To me it works fine with my dataset. Can you try with this dataset to make sure it's not the dataset?

test1.zip

There were no messages being sent:

rostopic echo /zed2i_0/rgb_raw/image_raw_color/labeled2d
WARNING: no messages received and simulated time is active. Is /clock being published?

However, that warning led me to https://answers.ros.org/question/12083/messages-being-blocked-from-publishing/ which told me how to fix the problem.

I don't know how and when I set "use_sim_time" to true and why this was blocking my messages from being published, but it's working now.

Going to correct the labels now.

miguelriemoliveira commented 2 years ago

There were no messages being sent:

rostopic echo /zed2i_0/rgb_raw/image_raw_color/labeled2d
WARNING: no messages received and simulated time is active. Is /clock being published?

However, that warning led me to https://answers.ros.org/question/12083/messages-being-blocked-from-publishing/ which told me how to fix the problem.

I don't know how and when I set "use_sim_time" to true and why this was blocking my messages from being published, but it's working now.

That's strange.

When you run a bag file, it's normal to set the use_sim_time parameter to true. It's in the automatically generated launch files:

https://github.com/miguelriemoliveira/agri-gaia-test/blob/4de14bfd4054c1d6c302dc97fc00e67f72500f23/agri_gaia_calibration/launch/playbag.launch#L67

In order for this to work, some node should publish the clock (not the real clock, but the bag file clock). This is done with the --clock flag, as in:

https://github.com/miguelriemoliveira/agri-gaia-test/blob/4de14bfd4054c1d6c302dc97fc00e67f72500f23/agri_gaia_calibration/launch/playbag.launch#L69-L76

So I am really lost as to why you had that problem. The agri-gaia/launch/playbag.launch should set the parameter to true ... Are you using this launch file (the dataset_playback.launch calls this one)?
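If it helps, you can check both sides of this with standard ROS tools (nothing ATOM-specific):

rosparam get /use_sim_time
rostopic hz /clock

The first should print true while the bag is playing, and the second should show the bag clock being published.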

miguelriemoliveira commented 2 years ago

Going to correct the labels now.

Don't forget that for that you need to:

  1. Compile the ATOM repo. There's a small piece of C++ code that runs as an rviz plugin. Actually, this is the only C++ code in ATOM.

    cd catkin_ws
    catkin_make

  2. Install a special fork of rviz as explained here

https://lardemua.github.io/atom_documentation/getting_started/#clone-rviz-fork