valeoai / carrada_dataset

Cannot run "run_annotation_pipeline.sh" properly #11

Closed chenhengwei1999 closed 1 year ago

chenhengwei1999 commented 2 years ago

Hi, when I run "bash run_annotation_pipeline.sh", the following error occurs:

I haven't made any changes to the code or the dataset. What's going on? Looking forward to your reply, and thank you in advance!

ArthurOuaknine commented 2 years ago

Hello. I am not fully available these days and I don't have direct access to the dataset. I apologize in advance for late replies or difficulties in answering.

As far as I understand the error, the script works for the first two steps and then crashes at step 3, right?

Have you carefully followed the instructions in the README file, in particular the step where you set the path to the dataset while running the script? The code uses files located in the dataset folder.

The first two steps should have generated files that are used by the third step. Have you checked whether they were generated? If so, do they correspond to the original files provided in the dataset?
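
If it helps, here is a rough sketch of how you could list the generated JSON files and spot empty ones (the dataset path, and the assumption that the intermediate outputs are JSON files under the sequence folders, are mine; adjust them to the actual layout):

```python
import json
from pathlib import Path

# Hypothetical path to the CARRADA dataset root; adjust to your own setup.
CARRADA_ROOT = Path("/datasets/Carrada")

# Walk the dataset and report each JSON file with the number of top-level
# entries it contains, so empty or truncated outputs stand out immediately.
for json_path in sorted(CARRADA_ROOT.rglob("*.json")):
    try:
        with open(json_path) as fp:
            content = json.load(fp)
        size = len(content) if hasattr(content, "__len__") else "scalar"
    except (OSError, json.JSONDecodeError) as err:
        size = f"unreadable ({err})"
    print(f"{json_path}: {size} entries")
```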

I hope this helps. Let me know if you still have problems.

chenhengwei1999 commented 2 years ago

Hi, author. Sorry for not replying in time; I tested the whole process again today. I used the method without Docker. First, I set the path to the dataset with set_path.py. Then I ran generate_instances.py, generate_rd_points.py, and generate_centroids.py separately (in the order of run_annotation_pipeline.sh), but I still encountered the same error as above.

So I debugged generate_centroids.py; the following is the line of code that raises the index error during debugging (line 244 in centroid_tracking.py).

[screenshots of the debugging session]

From the debugging results, we can see that the world_points.json file has been generated, but only a few frames have data fields. Where should I start to troubleshoot this problem?
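
For reference, this is roughly how I checked the file (the path and the per-frame structure shown here are placeholders for my local setup):

```python
import json

# Placeholder path; point this at the world_points.json of one sequence.
WORLD_POINTS = "path/to/world_points.json"

with open(WORLD_POINTS) as fp:
    world_points = json.load(fp)

# Assuming a mapping from frame identifier to a (possibly empty) container
# of points, report how many frames are populated versus empty.
empty = [frame for frame, data in world_points.items() if not data]
print(f"{len(world_points)} frames total, {len(empty)} without data")
print("first empty frames:", empty[:10])
```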

After downloading your code, which function parameters need to be modified? For example, in generate_instances.py:

[screenshot of the generate_instances.py parameters]

Finally, I hope you can give me some tips, and I will be very grateful. Wish you a happy life!

ArthurOuaknine commented 2 years ago

Hello.

Sorry again for the late reply. It looks like the "world points" are not generated correctly, meaning that the problem is in the script generating the world_points.json file. These points correspond to the centroids of the segmented instances projected into the real world using the camera calibration. The problem is either in the segmentation or in the projection into the real world.

In both cases, please verify that you have the correct versions of the required packages (see the Dockerfile).
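
A quick, generic way to compare your installed versions against the ones pinned in the Dockerfile (the package names below are only examples; the authoritative list is in the Dockerfile itself):

```python
from importlib.metadata import PackageNotFoundError, version

# Example package names; check the repository's Dockerfile for the exact
# list and the pinned versions.
for pkg in ("numpy", "opencv-python", "scikit-image"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```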

If the problem is related to the segmentation, you can see it by visualizing the results of the loaded Mask R-CNN, setting the "save_masks" parameter to True. This way, you can easily see whether a point should be generated for an instance at a given timestamp.
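
If you go that route, here is a minimal sketch of overlaying a saved binary mask on the corresponding camera frame with OpenCV to eyeball the segmentation (the file names and the mask format are assumptions, not the exact output format of the script):

```python
import cv2

# Hypothetical file names; use one camera frame and its saved mask.
image = cv2.imread("frame_000123.png")
mask = cv2.imread("mask_000123.png", cv2.IMREAD_GRAYSCALE)

# Paint the masked region in red at 50% opacity to check visually that an
# instance (and therefore a world point) should exist at this timestamp.
overlay = image.copy()
overlay[mask > 0] = (0, 0, 255)
blended = cv2.addWeighted(overlay, 0.5, image, 0.5, 0)
cv2.imwrite("overlay_000123.png", blended)
```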

If the segmentation is good, the issue may lie in the computation of the centroid of the shape and its projection onto the ground. But these steps are simple, so I don't think the problem comes from there.
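
For completeness, a generic way to compute an instance centroid from a binary mask with OpenCV image moments (not necessarily the exact computation used in the repository):

```python
import cv2

# Hypothetical binary mask of one segmented instance (non-zero = instance).
mask = cv2.imread("mask_000123.png", cv2.IMREAD_GRAYSCALE)

# Image moments give the centroid in pixel coordinates as (m10/m00, m01/m00).
moments = cv2.moments(mask, binaryImage=True)
if moments["m00"] > 0:
    centroid_x = moments["m10"] / moments["m00"]
    centroid_y = moments["m01"] / moments["m00"]
    print("centroid (x, y):", (centroid_x, centroid_y))
else:
    print("empty mask, no centroid")
```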

Otherwise, the problem can come from the projection of the centroid into the real world. This step uses OpenCV, so please be sure that you use the correct version. You have to verify that a world point is correctly generated for each centroid of the segmented instances.
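
For reference, here is a generic sketch of the kind of image-to-ground projection to sanity-check; I am not claiming this is the exact implementation in the repository, and the homography and pixel coordinates below are placeholders:

```python
import cv2
import numpy as np

# Placeholder 3x3 homography mapping image pixels to ground-plane (world)
# coordinates; in the real pipeline it comes from the camera calibration.
homography = np.eye(3, dtype=np.float64)

# Placeholder centroids of segmented instances, in pixel coordinates,
# shaped (N, 1, 2) as required by cv2.perspectiveTransform.
centroids = np.array([[[320.0, 410.0]], [[512.0, 395.0]]], dtype=np.float32)

# Each centroid should yield exactly one world point; if some come back
# missing or degenerate, the projection step is the place to investigate.
world_points = cv2.perspectiveTransform(centroids, homography)
print(world_points.reshape(-1, 2))
```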

I don't have proper access to the code or the dataset right now, so I cannot go into the details. Please let me know if you need further help.