mgmike opened this issue 4 months ago
I found a workaround. I added a .compute() call to lidar_calibration_df, which materializes the dask dataframe into a regular pandas DataFrame:
lidar_calibration_df = lidar_calibration_df[lidar_calibration_df['key.laser_name'] == 1].compute()  # keep only the TOP lidar (laser_name == 1) and materialize to pandas
and rebuilt the LiDARCalibrationComponent, adding .tolist() to the extrinsic transform and beam_inclination values:
# Construct fresh instances (rather than assigning attributes on the classes
# themselves), converting the pandas Series values to plain Python lists first.
temp_tfm = v2.column_types.Transform(
    transform=lidar_calibration.extrinsic.transform.tolist()[0])
temp_bic = v2.perception.context.BeamInclination(
    min=lidar_calibration.beam_inclination.min,
    max=lidar_calibration.beam_inclination.max,
    values=lidar_calibration.beam_inclination.values.tolist()[0])
lc2 = v2.perception.context.LiDARCalibrationComponent(
    key=lidar_calibration.key,
    extrinsic=temp_tfm,
    beam_inclination=temp_bic)
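With the values converted like this, the rebuilt component can be handed to the conversion. A minimal sketch of the call, assuming a LiDARComponent named lidar for the same laser has already been loaded from the lidar parquet files (that loading isn't shown here):
# `lidar` is assumed to be a LiDARComponent for the same laser, loaded separately.
points = v2.perception.utils.lidar_utils.convert_range_image_to_point_cloud(
    lidar.range_image_return1,  # first-return range image
    lc2)                        # the calibration component rebuilt above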
This doesn't seem like the correct solution, though. I must be missing something; there's got to be a better way to do this, right?
Hi, I am trying to get the points from lidar data and am having issues with v2.perception.utils.lidar_utils.convert_range_image_to_point_cloud.
I am using Python 3.10 and TensorFlow 2.12 in a Docker container on an Ubuntu 20.04 host. TensorFlow is able to see the host GPU; I verified this by running an operation on two tensors on the GPU.
It looks like even after converting lidar_calibration from a dataframe to a LiDARCalibrationComponent, the extrinsic transform is still a Series. The error below says that it failed to convert a NumPy array to a Tensor. Also below, the type of lidar_calibration.extrinsic is a Transform, which looks correct according to context.py, but lidar_calibration.extrinsic.transform is a Series. I don't have any experience with dask dataframes, but could this be the issue: that tensorflow is expecting a numpy array but is getting a dask Series?
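For reference, this is roughly the check that shows the mismatch (a minimal sketch of the type inspection described above):
print(type(lidar_calibration.extrinsic))            # Transform, as expected
print(type(lidar_calibration.extrinsic.transform))  # a Series, not a list or numpy array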
My code to get the LiDARCalibrationComponent is as follows:
I have another cell, which is where the issue appears:
And the output is: