lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0

`interactive_pattern` needs to be adapted to work with ODOM as reference frame #548

Open JorgeFernandes-Git opened 1 year ago

JorgeFernandes-Git commented 1 year ago

The script interactive_pattern works fine when performing calibrations where the robot is linked to the world frame, but for calibrations where the main link is odom the script didn't work for me.

Rviz presented an error saying:

Cannot get tf info for init message with sequence number 1. Error: "world" passed to lookupTransform argument source_frame does not exist.

I tried to replace world with odom in the code, but the error persisted. I ended up positioning the pattern in Gazebo. For my calibration it wasn't a big deal, because the pattern was static; I just had to position it in front of the sensors at the beginning. Even so, it may be needed in other scenarios.
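To show what I mean, here is a minimal sketch of the kind of change I was attempting, with the reference frame read from a parameter instead of a hardcoded "world". The names and structure here are illustrative, not the actual interactive_pattern code:

```python
#!/usr/bin/env python
import rospy
from interactive_markers.interactive_marker_server import InteractiveMarkerServer
from visualization_msgs.msg import InteractiveMarker, InteractiveMarkerControl, Marker

rospy.init_node('interactive_pattern_sketch')

# Read the reference frame from a parameter instead of hardcoding "world",
# so an odom-based calibration could pass _frame:=odom.
reference_frame = rospy.get_param('~frame', 'world')

server = InteractiveMarkerServer('interactive_pattern_sketch')

marker = InteractiveMarker()
marker.header.frame_id = reference_frame  # this was the hardcoded "world" in my case
marker.name = 'pattern'
marker.scale = 0.5

# Simple box as a stand-in for the pattern mesh.
box = Marker()
box.type = Marker.CUBE
box.scale.x, box.scale.y, box.scale.z = 0.8, 0.6, 0.02
box.color.r = box.color.g = box.color.b = box.color.a = 1.0

control = InteractiveMarkerControl()
control.interaction_mode = InteractiveMarkerControl.MOVE_3D
control.always_visible = True
control.markers.append(box)
marker.controls.append(control)

server.insert(marker, lambda feedback: None)
server.applyChanges()
rospy.spin()
```

With something along these lines the same node could be launched with `_frame:=odom` for odom-based calibrations, but in my attempt the Rviz error persisted.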

manuelgitgomes commented 1 year ago

Hello @JorgeFernandes-Git,

When calibrating AtlasCar2, I used the odom frame and no problem arose. Have you changed the config.yml where the world frame is defined?

JorgeFernandes-Git commented 1 year ago

Hi @manuelgitgomes.

Yes, I did. I used odom for both the frame of reference and the parent link of the pattern. I followed this file:

https://github.com/lardemua/atlascar2/blob/master/atlascar2_calibration/calibration/config.yml

But were you able to change the pattern position by moving the marker in Rviz? I could see the pattern, but the marker didn't work.

It's like they say "it works on my machine" 😅

manuelgitgomes commented 1 year ago

Honestly, I have no idea if it moved. I believe it stayed still during the whole calibration process.

But from the code you posted before, it seems the interactive pattern depends on the world frame. I suggest moving the interactive pattern to atom and making it general. What do you think, @miguelriemoliveira @JorgeFernandes-Git?
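Something along these lines, perhaps: read the reference frame from the calibration config instead of assuming world. A rough sketch only, assuming the frame is stored under a world_link key in config.yml (the key name may differ):

```python
import yaml

def load_reference_frame(config_path):
    """Return the calibration reference frame from config.yml.

    Assumes the frame lives under a 'world_link' key (key name may differ);
    falls back to 'world' if it is missing.
    """
    with open(config_path, 'r') as f:
        config = yaml.safe_load(f)
    return config.get('world_link', 'world')

# Inside the interactive pattern node it could then be used as:
#   reference_frame = load_reference_frame(rospy.get_param('~config'))
#   marker.header.frame_id = reference_frame
```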

JorgeFernandes-Git commented 1 year ago

You are right, I didn't move the pattern while recording the bagfile either. And when using odom as the reference frame, the pattern will probably almost always be static, I think.

But maybe there is some specific case where the pattern and the robot both need to move?

I just happened to notice this behavior because I needed to reposition the pattern to a location where it could be seen by both sensors, and wasn't able to do so in Rviz.

I suggest moving the interactive pattern to atom and making it general

I think it's a great idea. Not only that, but I also talked to @miguelriemoliveira about automating the movement of the pattern to help while recording a bagfile. We can send random translations and rotations to the pattern within thresholds. In the case of cameras, we can use the detection of the pattern corners as feedback, to verify whether the reached position is good for recording a collection. If it is good, the pattern stops; if not, it moves to a new pose. Just an idea, but I think it's feasible.
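Roughly what I have in mind, as a sketch only: teleport the pattern model in Gazebo to random poses within thresholds and keep the pose once a detection succeeds. It assumes the Gazebo model is called pattern and uses a placeholder pattern_detected() callback instead of ATOM's actual detectors:

```python
#!/usr/bin/env python
import random
import rospy
from gazebo_msgs.msg import ModelState
from gazebo_msgs.srv import SetModelState
from tf.transformations import quaternion_from_euler

def random_pattern_pose(center, max_translation=0.3, max_rotation=0.3):
    """Build a ModelState with a random offset around a nominal position."""
    state = ModelState()
    state.model_name = 'pattern'  # assumed name of the pattern model in Gazebo
    state.pose.position.x = center[0] + random.uniform(-max_translation, max_translation)
    state.pose.position.y = center[1] + random.uniform(-max_translation, max_translation)
    state.pose.position.z = center[2] + random.uniform(-max_translation, max_translation)
    roll, pitch, yaw = [random.uniform(-max_rotation, max_rotation) for _ in range(3)]
    q = quaternion_from_euler(roll, pitch, yaw)
    (state.pose.orientation.x, state.pose.orientation.y,
     state.pose.orientation.z, state.pose.orientation.w) = q
    return state

def move_pattern(state):
    """Teleport the pattern in Gazebo through /gazebo/set_model_state."""
    rospy.wait_for_service('/gazebo/set_model_state')
    set_state = rospy.ServiceProxy('/gazebo/set_model_state', SetModelState)
    return set_state(state).success

def reposition_until_visible(center, pattern_detected, max_tries=20):
    """Move to random poses until pattern_detected() reports a good detection.

    pattern_detected is a placeholder; in practice it would check the corner
    detections of every camera for the current pose.
    """
    for _ in range(max_tries):
        if move_pattern(random_pattern_pose(center)) and pattern_detected():
            return True  # good pose, stop and record a collection here
    return False

if __name__ == '__main__':
    rospy.init_node('pattern_mover_sketch')
    reposition_until_visible(center=(2.0, 0.0, 1.0), pattern_detected=lambda: True)
```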