HViktorTsoi / ACSC

Automatic Calibration for Non-repetitive Scanning Solid-State LiDAR and Camera Systems
GNU General Public License v3.0

ValueError: vector::_M_default_append on segmentation_ext.region_growing_kernel #15

Closed: VisionaryMind closed this issue 3 years ago

VisionaryMind commented 3 years ago

We have been using this tool on multiple projects, and it has been working splendidly. Recently, we switched to new laptops running Ubuntu 20.04 and Python 3.8. I have installed all of the required libraries and can import them in the Python interpreter; however, there seems to be an issue, either with the new libraries or with the size of our dataset (we are now using 6K images for calibration).

Everything works as usual up to the point in the calibration script (calibration.py) where utils.voxelize(pc, voxel_size=configs['calibration']['RG_VOXEL']) is assigned to the "pc" variable. This downsamples a 1,173,359-point cloud to 90,052 points. As soon as segmentation_ext.region_growing_kernel runs, calibration.py spawns 5 additional threads and immediately throws the following error:

Calculating frame: 0 / 22
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/home/visionarymind/anaconda3/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "calibration.py", line 780, in corner_detection_task
    ROI_pc = locate_chessboard(pc)
  File "calibration.py", line 392, in locate_chessboard
    segmentation = segmentation_ext.region_growing_kernel(
ValueError: vector::_M_default_append
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "calibration.py", line 943, in <module>
    calibration(keep_list=None)
  File "calibration.py", line 873, in calibration
    corners_world, final_cost, corners_image = detection_result[idx].get()
  File "/home/visionarymind/anaconda3/lib/python3.8/multiprocessing/pool.py", line 771, in get
    raise self._value
ValueError: vector::_M_default_append

I have heard of this happening before with very large datasets, but a 90k-point cloud should not be a problem. Would you have any idea how to get around this? It happens even if we set up a Conda environment with Python 2.7 and allow it to solve all dependencies.
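For what it is worth, my understanding is that vector::_M_default_append is the libstdc++ helper behind std::vector::resize, so the C++ extension is presumably being asked to allocate a negative or absurdly large size computed from our input. To rule out bad data on our side, I added a sanity check before the failing call; the checks and the dtype assumption below are my own guesses, not anything taken from the ACSC sources:

import numpy as np

def check_cloud(pc: np.ndarray) -> np.ndarray:
    """Sanity-check a point cloud before handing it to the C++ extension.

    These checks are guesses at what could make the extension compute an
    invalid std::vector size; they are not taken from this repository.
    """
    assert pc.ndim == 2 and pc.shape[1] >= 3, f"unexpected shape {pc.shape}"
    assert pc.shape[0] > 0, "empty cloud after voxelization"
    assert np.isfinite(pc[:, :3]).all(), "NaN/Inf coordinates in the cloud"
    # A dtype mismatch between NumPy and the compiled extension can be
    # reinterpreted as a garbage size, so force contiguous float64.
    return np.ascontiguousarray(pc, dtype=np.float64)

# Placed just before the failing call in locate_chessboard:
#     pc = check_cloud(pc)
#     segmentation = segmentation_ext.region_growing_kernel(...)

If the extension derives buffer sizes from the coordinates themselves, any of these conditions could plausibly produce the resize failure.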

Perhaps you could offer a pre-configured Conda environment YAML file that we could use to ensure all the right libraries are installed? I do not think this is a problem with library contention, but I want to make sure. I have already spent nearly a week attempting to get this working with various library setups.
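To be concrete, something along these lines is what I have in mind; the package list below is only my guess at the dependencies from the README, not an official file from this repository:

# environment.yml (hypothetical sketch; names and versions are my guesses)
name: acsc
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.8
  - numpy
  - pip
  - pip:
      - opencv-python
      - open3d

An environment like this, pinned to versions known to work, would save others the week of trial and error.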

VisionaryMind commented 3 years ago

This problem is the result of this project not being compatible with Ubuntu 20.04, even with PCL 1.8 installed. We set up a virtual machine with 18.04 Bionic, minding dependencies, and data collection now proceeds without a problem. Unfortunately, after changes you have made within the last 6 months, the tool is no longer able to detect corners or produce extrinsics. I will follow up with an additional issue for this.