lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0
253 stars 26 forks

unclear error during optimisation #850

Closed KankaJan closed 8 months ago

KankaJan commented 8 months ago

Hi. I'm testing ATOM for lidar-RGB camera calibration, and during the calibration procedure an error occurred that I'm not sure how to fix:

```
  File "/home/roboton/ws/devel/lib/atom_calibration/calibrate", line 15, in <module>
    exec(compile(fh.read(), python_script, 'exec'), context)
  File "/home/roboton/ws/src/atom/atom_calibration/scripts/calibrate", line 699, in <module>
    main()
  File "/home/roboton/ws/src/atom/atom_calibration/scripts/calibrate", line 652, in main
    opt.startOptimization(optimization_options=options)
  File "/home/roboton/ws/src/atom/atom_core/src/atom_core/optimization_utils.py", line 372, in startOptimization
    self.getNumberOfFunctionCallsPerIteration(optimization_options)
  File "/home/roboton/ws/src/atom/atom_core/src/atom_core/optimization_utils.py", line 412, in getNumberOfFunctionCallsPerIteration
    = least_squares(self.internalObjectiveFunction, self.x, verbose=0, jac_sparsity=self.sparse_matrix,
  File "/usr/local/lib/python3.8/dist-packages/scipy/optimize/_lsq/least_squares.py", line 830, in least_squares
    f0 = fun_wrapped(x0)
  File "/usr/local/lib/python3.8/dist-packages/scipy/optimize/_lsq/least_squares.py", line 825, in fun_wrapped
    return np.atleast_1d(fun(x, *args, **kwargs))
  File "/home/roboton/ws/src/atom/atom_core/src/atom_core/optimization_utils.py", line 284, in internalObjectiveFunction
    self.vis_function_handle(self.data_models)  # call visualization function
  File "/home/roboton/ws/src/atom/atom_calibration/src/atom_calibration/calibration/visualization.py", line 713, in visualizationFunction
    cv2.line(image, (x, y), (x, y), color, int(6E-3 * diagonal))
cv2.error: OpenCV(4.9.0) :-1: error: (-5:Bad argument) in function 'line'
Overload resolution failed:
 - Can't parse 'pt1'. Sequence item with index 0 has a wrong type
 - Can't parse 'pt1'. Sequence item with index 0 has a wrong type
```
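For context, this "Can't parse 'pt1'" error typically means `cv2.line` received point coordinates that are not plain Python integers (e.g. NumPy floats coming out of an optimizer's parameter vector). A minimal sketch of a sanitizing helper (the helper name `to_pixel` is hypothetical, not ATOM code):

```python
# Sketch, not ATOM code: OpenCV drawing functions such as cv2.line require
# plain integer pixel coordinates. NumPy floats trigger
# "Can't parse 'pt1'. Sequence item with index 0 has a wrong type".
def to_pixel(pt):
    """Round an (x, y) pair to plain Python ints suitable for cv2 drawing calls."""
    x, y = pt
    return (int(round(x)), int(round(y)))

# Usage (hypothetical): cv2.line(image, to_pixel((x, y)), to_pixel((x, y)),
#                                color, thickness)
```

Whether this is the root cause here depends on where `x` and `y` lose their integer type upstream in `visualizationFunction`.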

What can be done to run the optimisation procedure? Thank you, best wishes, Jan

miguelriemoliveira commented 8 months ago

Hi @KankaJan ,

I suspect it may have something to do with the OpenCV version. It's not the first time they have changed the format of the input arguments to the functions.

Can you try with OpenCV 4.6.0? That's the one I have ...

KankaJan commented 8 months ago

Unfortunately, the same error appears with OpenCV 4.6.0.

miguelriemoliveira commented 8 months ago

Hi @KankaJan ,

can you please provide the necessary data so that I can try from my side?

I would need the package with the description file, a bagfile and a dataset and the calibrate command you are running ...

KankaJan commented 8 months ago

Hello @miguelriemoliveira, I've sent the files to mriem@ua.pt; I hope it is all you need. Thank you.

miguelriemoliveira commented 8 months ago

Hi @KankaJan ,

thanks for the data, but in the format you sent it, it would be too much work to test.

There are no ROS packages; I would have to create them myself.

If you want me to help, it's better to share a GitHub repository where everything is ready to run, and tell me which command I should run to reproduce the error.

Right now I do not have time to setup all that's needed to test your case.

miguelriemoliveira commented 8 months ago

I am closing this one for now as @KankaJan did not reply.

KankaJan commented 8 months ago

> I am closing this one for now as @KankaJan did not reply.

Hi, I've sent the requested data. Thanks, Jan

miguelriemoliveira commented 8 months ago

Hi @KankaJan ,

Yes, I know. I replied, see above. You sent the data in a format I cannot use immediately, and it would be too much work to set up.

Why don't you share your GitHub repo instead? See the atom examples for what a robotic system should contain, try to set up those ROS packages, and share them with me.

KankaJan commented 8 months ago

@miguelriemoliveira I do not have the repo public for NDA reasons, so I've sent you a copy of the workspace with data and description today.

miguelriemoliveira commented 8 months ago

Hi @KankaJan ,

thanks. But you could share the repo with only me by making me a collaborator (for as long as I am helping).

In any case, I think I found one of your problems.

Your calibration config is not correct.

Look at the summary for the calibration:

(screenshot: calibration summary)

The strange part is that you have the transform from `base_link` to `camera_holder` of type "dynamic". The camera is not moving with respect to the base link and the lidar, right? So that transformation should be static (and calibrated).

The error is in the `calibration_patterns` part of the config.yml:

```yaml
# The calibration requires at least one detectable pattern.
# This section describes the properties of the calibration pattern(s) used in the calibration.
calibration_patterns:

  pattern_1:
    # The frame id (or link) of the pattern.
    # This link/transformation will be optimized.
    link: "camera_holder"

    # The parent frame id (or link) of the pattern.
    # For example, in hand-eye calibration the parent link
    # of the pattern can be the end-effector or the base of the arm
    parent_link: "base_link"

    # Defines if the pattern link is the same in all collections (i.e. fixed=true),
    # or each collection will have its own estimate of the link transformation.
    # Note: if you plan to have the pattern fixed while moving the rigidly attached sensors,
    # this is equivalent to having the sensors fixed and the pattern moving, so you should use fixed=false.
    fixed: False

    # etc ...
```

The pattern link cannot be the same as the camera link (the camera and the pattern are not rigidly attached), and the parent of the pattern should be the world link in this case.
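To illustrate the point, a corrected fragment might look like the following (a sketch only; the link names `pattern_link` and `world` are assumptions and must match the frames of the actual setup):

```yaml
calibration_patterns:

  pattern_1:
    # Give the pattern its own frame, distinct from any sensor link.
    link: "pattern_link"

    # The pattern hangs off the world frame, not off a sensor holder.
    parent_link: "world"

    # The pattern moves between collections, so its pose
    # is estimated per collection.
    fixed: False
```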

My suggestion is that you take a look at the atom examples we recently added.

https://github.com/lardemua/atom/tree/noetic-devel/atom_examples

Those should help you figure out how you can configure your calibration.

If you want, when you have a new config.yml, let me see it and I will let you know if it makes sense.

KankaJan commented 8 months ago

@miguelriemoliveira thank you for the suggestion. The setup, as you can see in the model, is a fixed lidar and a camera on a rotatable holder, so my guess was to use a dynamic link to the camera_holder in order to calibrate the camera-lidar translation and rotation. Maybe I understand the linking in the wrong way?

miguelriemoliveira commented 8 months ago

> The setup, as you can see in the model, is a fixed lidar and a camera on a rotatable holder, so my guess was to use a dynamic link to the camera_holder in order to calibrate the camera-lidar translation and rotation. Maybe I understand the linking in the wrong way?

Perhaps. I am not sure. Do you have a photo of the system you can share?

KankaJan commented 8 months ago

(photo: P_20240318_153226)

miguelriemoliveira commented 8 months ago

I think in this case the camera and the lidar are rigidly attached, i.e., there cannot exist any dynamic transformations between them.
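In URDF terms, a rigid attachment like this is usually modelled with a joint of type `fixed` (a sketch only; the link/joint names and origin values below are placeholders, not taken from the actual description):

```xml
<!-- Sketch: rigid camera-lidar mounting expressed as a fixed URDF joint.
     Link/joint names and origin values are placeholders. -->
<joint name="lidar_to_camera_joint" type="fixed">
  <parent link="lidar_link"/>
  <child link="camera_link"/>
  <origin xyz="0.05 0.0 0.10" rpy="0.0 0.0 0.0"/>
</joint>
```

A fixed joint publishes a static transform, which is what ATOM then estimates during calibration.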

KankaJan commented 8 months ago

so, how does it affect the solution?

miguelriemoliveira commented 8 months ago

> so, how does it affect the solution?

Well, if your configuration of the calibration is incorrect the solution will be incorrect as well.

Use the atom examples as a guide to see how config.yml files are written.

KankaJan commented 8 months ago

Well, I did, but none of the examples fits my setup. Could you recommend one?

miguelriemoliveira commented 8 months ago

Hum, if the camera and the lidar do not move with respect to each other, and the pattern is moving around in the scene, that maps directly to the rlbot example, does it not?

KankaJan commented 8 months ago

You're right, I'll check it out. I had not found that example! Thanks.

upaltamentept commented 4 months ago

I'm having the same error, even though I was not getting it before in the same workspace. Could it be caused by the TFs or the orientation of the sensors?

This is my summary:

(screenshot: calibration summary)