lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0

Ideas for next agrob bagfile #157

Closed miguelriemoliveira closed 4 years ago

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

This issue is just to collect the ideas of what we can do better in the next bag file.

  1. longer time (4 or 5 mins)

  2. record also joint_states (input for the robot_state_publisher to publish tfs). Do you have some in the system?

  3. pattern in landscape orientation

  4. fix the vlp16_frame / velodyne link

X. ... more stuff?? (Please add!)

aaguiar96 commented 4 years ago

Also:

  1. Pattern closer to the camera

aaguiar96 commented 4 years ago

@miguelriemoliveira this week I'll try to have a first version of the optimization working properly.

I'll keep in touch! :)

miguelriemoliveira commented 4 years ago

Great! Let me know if you need help.

Regards, Miguel

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

I'm trying with uvc_camera but always getting this error:

pixfmt 0 = 'YUYV' desc = 'YUYV 4:2:2'                                                                 
  discrete: 2560x720:   1/60 1/30 1/15                                                               
  discrete: 1344x376:   1/100 1/60 1/30 1/15                                                          
  discrete: 3840x1080:   1/30 1/15                                                                    
  discrete: 4416x1242:   1/15                                                                         
terminate called after throwing an instance of 'std::runtime_error'                                   
  what():  pixel format unavailable                                                                   
Aborted (core dumped)  

Do you know any other stereo camera driver for ROS?
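As a possible workaround, the ZED is a standard UVC device that outputs the left and right images side by side in one frame, so it can be read without the SDK. A minimal sketch, assuming a device index of 0 (the 2560x720 mode is taken from the error log above):

```python
import numpy as np

def split_sbs(frame):
    """Split a side-by-side stereo frame into its left and right halves."""
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:]

def open_zed_uvc(device=0, width=2560, height=720, fps=30):
    """Open the ZED as a plain UVC webcam, requesting one of the modes it
    advertises (2560x720 @ 30 fps appears in the driver error log)."""
    import cv2  # imported here so split_sbs works without OpenCV installed
    cap = cv2.VideoCapture(device)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    cap.set(cv2.CAP_PROP_FPS, fps)
    return cap
```

This bypasses the SDK, but the images are unrectified, so a manual calibration is still needed.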

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

I searched around and found nothing. I think the only way is to follow the instructions from the manufacturer: https://www.stereolabs.com/blog/use-your-zed-camera-with-ros/

but as you said this implies installing the sdk. Is it very cumbersome?

aaguiar96 commented 4 years ago

I'm installing it on my computer... Then I'll try to connect the velodyne to my computer as well...

If I calibrate the camera, can we use the new calibration with an old dataset, or do we have to record a new one? Can we use the old bagfile, or do we need a new one?

eupedrosa commented 4 years ago

We can use the old dataset, but you have to modify by hand the data_collected.json and replace the camera info with the new information.
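The hand edit can be scripted. A minimal sketch, assuming the dataset keeps per-sensor camera info under a `sensors` key (check your own data_collected.json for the real field names):

```python
import json

def replace_camera_info(dataset_path, sensor_name, new_camera_info, out_path=None):
    """Replace the stored camera info of one sensor in a dataset file.
    NOTE: the 'sensors' / 'camera_info' keys are assumptions about the schema."""
    with open(dataset_path) as f:
        dataset = json.load(f)
    dataset["sensors"][sensor_name]["camera_info"] = new_camera_info
    if out_path is None:
        out_path = dataset_path  # overwrite in place by default
    with open(out_path, "w") as f:
        json.dump(dataset, f, indent=2)
    return dataset
```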

aaguiar96 commented 4 years ago

Ok, I got the ZED running on my PC. I'm going to calibrate it now

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

as @eupedrosa says, we can use it with an old dataset with some hand copying and a bit of patience.

But again, the old bagfiles and datasets we have are not very good, so I see no point in trying to recover them ... might as well spend that time recording a new better bagfile and collecting a new better dataset.

aaguiar96 commented 4 years ago

Agrob ran out of battery, and we do not have a charger here right now...

I'm trying to come up with a solution.

aaguiar96 commented 4 years ago

I'll come back again tomorrow to record the bag file.

A co-worker will bring the charger. In any case, today I was able to calibrate the ZED.

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

were you able to record the bagfile today?

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira

I did record one. However, the data is really out of sync, the tegra keeps shutting down, and the agrob computer does not have a good enough GPU to run the zed_ros_wrapper... Filipe will help me tomorrow to solve all these problems, and hopefully we'll have a bag file! :)

miguelriemoliveira commented 4 years ago

Hi,

Ok. That's good to hear.

Let us know once you have a bag file.

Miguel

aaguiar96 commented 4 years ago

Let us know once you have a bag file.

Ok! I hope tomorrow I'll have it.

Btw guys, I checked and we were using raw image data. :)

miguelriemoliveira commented 4 years ago

Raw data is what we need!

aaguiar96 commented 4 years ago

Hi @miguelriemoliveira, are you there?

I think I was able to synchronize both machines using chrony. I'll record a bagfile now and post it here. Do you have time to test it and tell me if it is ok? If not, I'm still in the lab, so maybe I can correct the issues.

eupedrosa commented 4 years ago

Hi @aaguiar96. I already spoke with him today, so he is around.

I can also test the bagfile :) The more people test the code, the better.

aaguiar96 commented 4 years ago

Nice @eupedrosa

I'll post it here, I'm uploading it

aaguiar96 commented 4 years ago

Here it is. Let me know what you think of it!

https://drive.google.com/file/d/1lXQbztSSLRaLkqvej67IcM7tG-QpTYZX/view?usp=sharing

eupedrosa commented 4 years ago

Got it. Once I have something, I will report back.

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 and @eupedrosa ,

thanks for helping @eupedrosa . @aaguiar96 I can try to test but I am with the kids this afternoon so don't count on me :(. I will try.

Great that @eupedrosa can help.

aaguiar96 commented 4 years ago

The data displayed on rviz seems really synchronized... The timestamps have the secs equal, but the nsecs different:

/velodyne_points/header

seq: 1421
stamp: 
  secs: 1592485671
  nsecs: 680405000
frame_id: "velodyne"

/zed_nano/zed_node/left/image/compressed/header

seq: 1227
stamp: 
  secs: 1592485671
  nsecs: 297353320
frame_id: "zed_left_camera_optical_frame"
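A quick way to quantify the sync is to convert both stamps to seconds and take the difference; for the two headers above it comes out to roughly 0.38 s:

```python
def stamp_to_sec(secs, nsecs):
    """Convert a ROS header stamp to float seconds."""
    return secs + nsecs * 1e-9

def stamp_delta(stamp_a, stamp_b):
    """Absolute time difference (in seconds) between two (secs, nsecs) stamps."""
    return abs(stamp_to_sec(*stamp_a) - stamp_to_sec(*stamp_b))

# The two stamps printed above:
velodyne = (1592485671, 680405000)
zed_left = (1592485671, 297353320)
print(stamp_delta(velodyne, zed_left))  # ~0.383 s
```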

eupedrosa commented 4 years ago

@aaguiar96, you have missing dependencies in your agrob_description/package.xml. I usually do a clean install (inside docker) so I can catch these "It works on my machine" :tm: moments.

Do you want some help in this?

aaguiar96 commented 4 years ago

That file is generated automatically, right?

aaguiar96 commented 4 years ago

Can you report the error? I can't meet right now...

eupedrosa commented 4 years ago

I already solved it, just add this to the agrob_description/package.xml

<exec_depend>hector_xacro_tools</exec_depend>
<exec_depend>hector_sensors_description</exec_depend>
<exec_depend>husky_description</exec_depend>

Do you want me to push it?
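For what it's worth, a small script can list the declared `<exec_depend>` entries of a package.xml, which makes missing ones easier to spot. A minimal sketch using only the standard library (the snippet below is abridged, not a complete package.xml):

```python
import xml.etree.ElementTree as ET

def exec_depends(package_xml_text):
    """Return the <exec_depend> package names declared in a package.xml."""
    root = ET.fromstring(package_xml_text)
    return [e.text for e in root.findall("exec_depend")]

snippet = """<package format="2">
  <name>agrob_description</name>
  <exec_depend>hector_xacro_tools</exec_depend>
  <exec_depend>hector_sensors_description</exec_depend>
  <exec_depend>husky_description</exec_depend>
</package>"""
print(exec_depends(snippet))
```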

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

The data displayed on rviz seems really synchronized... The timestamps have the secs equal, but the nsecs different:

I could not download the bag yet. The synchronized timestamps are a definite improvement. How long is the bag file? How many (distinct) collections do you think you could get from it?

eupedrosa commented 4 years ago

@aaguiar96 The data collector is not working. It blocks when I trigger a save.

aaguiar96 commented 4 years ago

Do you want me to push it?

Yes, please!

aaguiar96 commented 4 years ago

How long is the bag file? How many (distinct) collections do you think you could get from it?

5 minutes! I think it is possible to take 30/40 collections (just an estimate).

eupedrosa commented 4 years ago

@aaguiar96 The data collector is not working. It blocks when I trigger a save.

I located where the problem is, but did not fix it yet. Anyhow, I was able to get 31 collections and could have more if needed.

I will now test the calibration.

aaguiar96 commented 4 years ago

@aaguiar96 The data collector is not working. It blocks when I trigger a save.

I did not test the data collector with the new version of the repo. With this error, how did you collect a dataset?

miguelriemoliveira commented 4 years ago

Hi @aaguiar96 ,

I had this error a couple of days ago with the ur10e system.

The problem was running the bagfile too fast (because the file was also about 2GB).

Can you try adding bag_rate:=0.5 or lower to your command? Does that solve it?

aaguiar96 commented 4 years ago

Hi @aaguiar96 ,

I had this error a couple of days ago with the ur10e system.

The problem was running the bagfile too fast (because the file was also about 2GB).

Can you try adding bag_rate:=0.5 or lower to your command? Does that solve it?

Hi @miguelriemoliveira

I had a problem with my computer in the last few days... I'm reinstalling all the stuff.

I'm almost done. I will test it soon and let you know.

eupedrosa commented 4 years ago

@aaguiar96 @miguelriemoliveira I am almost able to do the calibration. I did this with a clean install. Almost everything went smoothly, except the calibration part.

  1. Installing the dependencies was time-consuming
  2. It is so slow... Try 30 collections, what a nightmare.
  3. Too many outputs: it prints information that is irrelevant to me (the user).

The optimization has a huge problem. The objective function is called a bazillion times to calculate the derivatives. There is no way around this, therefore it has to be optimized as much as possible.
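To illustrate the cost: with plain forward differences, each Jacobian needs one objective evaluation per parameter, plus one at the current point. A toy sketch (not ATOM's actual optimizer):

```python
import numpy as np

def num_jacobian(f, x, eps=1e-6):
    """Forward-difference Jacobian: 1 call for f(x) plus 1 call per parameter.
    With n parameters this is n+1 objective evaluations per Jacobian, which is
    why a slow objective function dominates the optimization runtime."""
    f0 = np.asarray(f(x))
    J = np.empty((f0.size, len(x)))
    for i in range(len(x)):
        xp = np.array(x, dtype=float)
        xp[i] += eps
        J[:, i] = (np.asarray(f(xp)) - f0) / eps
    return J

calls = {"n": 0}
def residuals(x):
    calls["n"] += 1
    return [x[0] ** 2 - 1.0, x[1] - 2.0]

J = num_jacobian(residuals, [1.0, 2.0])
print(calls["n"])  # 3 evaluations: f(x) plus one per parameter
```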

miguelriemoliveira commented 4 years ago

Hi Eurico,

you are right. It has to be optimized. But first we need to make it work. It is full of debug output for now.

You should first try with 5 or 6 collections just to see if it works.

Another thing: before, we were optimizing using just the 4 corners of the pattern. I reverted that to use all the corners, which creates a lot more residuals and makes the process run slower. Perhaps I can add a command line argument to use one or the other ...
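That argument could be a simple corner-selection switch. A hypothetical sketch (the function name is made up; for an 11x8 pattern, all corners give 88 residual points versus 4):

```python
def select_corners(nx, ny, only_extremes=False):
    """Return the (i, j) grid indices of the pattern corners used as residuals.
    only_extremes=True keeps just the 4 outer corners, trading accuracy for speed."""
    if only_extremes:
        return [(0, 0), (nx - 1, 0), (0, ny - 1), (nx - 1, ny - 1)]
    return [(i, j) for j in range(ny) for i in range(nx)]

print(len(select_corners(11, 8)))                      # 88 residual points
print(len(select_corners(11, 8, only_extremes=True)))  # 4 residual points
```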

miguelriemoliveira commented 4 years ago

Installing the dependencies was time-consuming

How long did it take, more or less? This is a one-time thing, so it is not so crucial...

eupedrosa commented 4 years ago

How long did it take, more or less? This is a one-time thing, so it is not so crucial...

With 5 or 6 collections it is acceptable, 35 with camera and velodyne is more expensive...

[results screenshot]

Optimizing with the Velodyne, the problem remains: it is not calibrating correctly. Cameras-only is working, 35 collections with no problem. There must be something wrong with the objective function of the 3D laser.

aaguiar96 commented 4 years ago

Optimizing with the Velodyne, the problem remains: it is not calibrating correctly. Cameras-only is working, 35 collections with no problem. There must be something wrong with the objective function of the 3D laser.

Can you show the result @eupedrosa ? (maybe in #179)

eupedrosa commented 4 years ago

We can start with this: [Screenshot from 2020-06-18 19-44-57]

The laser beams do not correspond to the full pattern...

aaguiar96 commented 4 years ago

Weird, it was not happening in the last dataset... Did you set the correct pattern dimensions on the config file?

aaguiar96 commented 4 years ago

It should be something like this:

#
#           █████╗ ████████╗ ██████╗ ███╗   ███╗
#          ██╔══██╗╚══██╔══╝██╔═══██╗████╗ ████║
#          ███████║   ██║   ██║   ██║██╔████╔██║
#          ██╔══██║   ██║   ██║   ██║██║╚██╔╝██║
#   __     ██║  ██║   ██║   ╚██████╔╝██║ ╚═╝ ██║    _
#  / _|    ╚═╝  ╚═╝   ╚═╝    ╚═════╝ ╚═╝     ╚═╝   | |
#  | |_ _ __ __ _ _ __ ___   _____      _____  _ __| | __
#  |  _| '__/ _` | '_ ` _ \ / _ \ \ /\ / / _ \| '__| |/ /
#  | | | | | (_| | | | | | |  __/\ V  V / (_) | |  |   <
#  |_| |_|  \__,_|_| |_| |_|\___| \_/\_/ \___/|_|  |_|\_\
#  https://github.com/lardemua/atom

# This yaml file describes your calibration!

# You can start by defining your robotic system.
# This is the URDF file (or xacro) that describes your robot.
# Every time a path to a file is requested you can use
#
#   - Absolute Path
#       Example 1: /home/user/ros_workspace/your_package/urdf/description.urdf.xacro
#       Example 2: file://home/user/ros_workspace/your_package/urdf/description.urdf.xacro
#
#   - Path Expansion
#       Example 1: ${HOME}/user/${YOUR_VARIABLE}/your_package/urdf/description.urdf.xacro
#       Example 2: ~/user/ros_workspace/your_package/urdf/description.urdf.xacro
#
#       NOTE: It is up to you to guarantee the environment variable exists.
#
#   - ROS Package Reference
#       Example: package://your_package/urdf/description.urdf.xacro
#
description_file: "package://agrob_description/urdf/agrob.urdf.xacro"

# The calibration framework requires a bagfile to extract the necessary data for the calibration.
bag_file: "$ROS_BAGS/agrob_third_test.bag"

# You must define a frame of reference for the optimization process.
# It must exist in the transformation chains of all the sensors which are being calibrated.
world_link: "base_link"

# ATOM will calibrate the extrinsic parameters of your sensors.
# In this section you should discriminate the sensors that will be part of the calibrations.
sensors:
    # Each key will define a sensor and its name, which will be used throughout the calibration.
    # Each sensor definition must have the following properties:
    #       link:
    #           The frame of the sensor's data (i.e. the header.frame_id).
    #
    #       parent_link:
    #           The parent link of the transformation (i.e. link) to be calibrated.
    #
    #       child_link:
    #           This is the transformation (i.e. link) that will be optimized.
    #
    #       topic_name:
    #           Name of the ROS topic that contains the data produced by this sensor.
    #           If you are calibrating a camera, you should use the raw image produced by the
    #           sensor. Additionally, if the topic is an image, it will automatically use the
    #           respective `camera_info` topic.
    # Example:
    right_camera: 
          link : "zed_right_camera_optical_frame"
          parent_link : "zed_camera_center"
          child_link : "zed_right_camera_frame"
          topic_name : "/zed_nano/zed_node/right/image"

    left_camera: 
          link: "zed_left_camera_optical_frame"
          parent_link: "zed_camera_center"
          child_link : "zed_left_camera_frame"
          topic_name : "/zed_nano/zed_node/left/image"

    vlp16: 
         link: "velodyne"
         parent_link: "tower_link"
         child_link: "vlp16_frame"
         topic_name: "/velodyne_points"

# The calibration requires a detectable pattern.
# This section describes the properties of the calibration pattern used in the calibration.
calibration_pattern:

    # The frame id (or link) of the pattern.
    # This link/transformation will be optimized.
    link: "chessboard_link"

    # The parent frame id (or link) of the pattern.
    # For example, in hand-eye calibration the parent link
    # of the pattern can be the end-effector or the base of the arm
    parent_link: "base_link"

    # Defines if the pattern link is the same in all collections (i.e. fixed=true),
    # or each collection will have its own estimate of the link transformation.
    fixed: false

    # The type of pattern used for the calibration.
    # Supported patterns are:
    # - chessboard
    # - charuco
    pattern_type: "charuco"

    # If the pattern type is "charuco" you need to define
    # the aruco dictionary used by the pattern.
    # See https://docs.opencv.org/trunk/dc/df7/dictionary_8hpp.html
    dictionary: "DICT_5X5_100"

    # Mesh file (collada.dae or stl) for showing pattern on rviz. URI or regular path.
    # See: description_file
    mesh_file: "package://atom_calibration/meshes/charuco_5X5_800x600.dae"

    # The border width from the edge corner to the pattern physical edge.
    # Used for 3D sensors and lidars.
    # It can be a scalar (same border in x and y directions), or it can be {'x': ..., 'y': ...}
    border_size: {'x': 0.04, 'y': 0.03}

    # The number of corners the pattern has in the X and Y dimensions.
    # Note: The charuco detector uses the number of squares per dimension in its detector.
    # Internally we add a +1 to Y and X dimensions to account for that.
    # Therefore, the number of corners should be used even for the charuco pattern.
    dimension: {"x": 11, "y": 8}

    # The length of the square edge.
    size: 0.06

    # The length of the charuco inner marker.
    inner_size: 0.045

# Miscellaneous configuration

# If your calibration problem is not fully constrained, you should anchor one of the sensors.
# This makes it immovable during the optimization.
# This is typically referred to as gauge freedom.
anchored_sensor: "right_camera"

# Max time delta (in milliseconds) between sensor data messages when creating a collection.
max_duration_between_msgs: 1000
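Typos in this file (for example, a misplaced quote on a key) can break parsing or fail silently, so a small sanity check of the parsed config can save time. A hypothetical sketch (the required-key lists are assumptions based on the file above, not an official schema):

```python
REQUIRED_TOP = ["description_file", "bag_file", "world_link",
                "sensors", "calibration_pattern", "anchored_sensor"]
REQUIRED_SENSOR = ["link", "parent_link", "child_link", "topic_name"]

def check_config(cfg):
    """Return a list of problems found in a parsed calibration config dict."""
    problems = [f"missing top-level key '{k}'" for k in REQUIRED_TOP if k not in cfg]
    for name, sensor in cfg.get("sensors", {}).items():
        for k in REQUIRED_SENSOR:
            if k not in sensor:
                problems.append(f"sensor '{name}' missing '{k}'")
    if cfg.get("anchored_sensor") not in cfg.get("sensors", {}):
        problems.append("anchored_sensor must name one of the sensors")
    return problems
```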

eupedrosa commented 4 years ago
        # The border width from the edge corner to the pattern physical edge.
        # This is used for 3D sensors and lidars.
        "border_size": {'x': 0.04, 'y': 0.03},
        # The number of corners the pattern has in the X and Y dimensions.
        # Note: The charuco detector uses the number of squares per dimension in
        # its detector. Internally we add a +1 to Y and X dimensions to account for
        # that. Therefore, the number of corners should be used even for the charuco
        # pattern.
        "dimension": {"x": 11, "y": 8},
        # The length of a square edge.
        "size": 0.06,
        # The length of the charuco inner marker.
        "inner_size": 0.045

This is what I have. It is working, the pattern detection is working correctly.

aaguiar96 commented 4 years ago

Ok, thanks @eupedrosa

I will try to fix it! :)

aaguiar96 commented 4 years ago

I'm stuck with this error while running the data collector...

import cv2
ImportError: libhdf5.so.101: cannot open shared object file: No such file or directory

@eupedrosa did you get it too?

eupedrosa commented 4 years ago

No, I did not get that error. That is a missing library, libhdf5... use apt to install it.

miguelriemoliveira commented 4 years ago

@aaguiar96 , can you share a dataset of agrob once you have one? You can now record a dataset, right? Or do you still have some problems?

I recommend setting the parameter

# Max time delta (in milliseconds) between sensor data messages when creating a collection.
max_duration_between_msgs: 1000

to some value from 0.3 to 1 or something.

aaguiar96 commented 4 years ago

So, with the problems stated in #179 I cannot save a single valid collection from the entire bagfile. Tomorrow I'll record a new bagfile, with the following ideas in mind:

Any other ideas @miguelriemoliveira and @eupedrosa ?

eupedrosa commented 4 years ago

Be careful with the pattern being too far from the sensors... too far and the corner detection will fail.