unmannedlab / LiDAR-reflectivity-segmentation

Official implementation of "Reflectivity is all you need!: Advancing LiDAR semantic segmentation"
MIT License

questions about calculating angles and 'eta_fit_grass.npy' #2

Open hanin97 opened 7 months ago

hanin97 commented 7 months ago

Hi, thank you for your interesting work, I have some questions.

  1. Since the angles can be calculated as Intensity(R, rho, alpha) / maxIntensity(r, rho, alpha) at every range, given the points and their labels, why do we need an alpha_predictor to predict the angles? (A rough sketch of what I mean is below this list.)
  2. We can see `p = np.poly1d(fit)` at line 99 of `data_generator.py`, but the variable `p` does not seem to be used anywhere in the following code; the same holds for `data_generator_sem_poss.py` and `data_generator_kitti.py`.
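
For context, here is a minimal sketch of what I mean in point 1; `max_intensity_lut` and the arccos model are my own placeholders, not code from this repo:

```python
import numpy as np

def incidence_angle(intensity, rng, label, max_intensity_lut):
    """Sketch only: recover alpha from the intensity ratio, given the class label.

    max_intensity_lut(label, rng) is a hypothetical lookup of the maximum
    (normal-incidence) intensity of that class at that range, which is exactly
    why the labels are needed for this computation.
    """
    i_max = max_intensity_lut(label, rng)
    ratio = np.clip(intensity / i_max, 0.0, 1.0)   # I / I_max in [0, 1]
    return np.arccos(ratio)                        # alpha = arccos(I / I_max)

# e.g. incidence_angle(0.4, 12.0, 9, lambda lab, r: 0.8) -> ~1.047 rad (60 deg)
```
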
kasiv008 commented 7 months ago

Thank you for the question.

1. Using the equation, the angles can be computed only when the classes/labels are known. If we are given an unlabelled point cloud to segment, we need the alpha predictor to estimate the angles, since no labels/classes are available.

2. The polyfit is there to remove the near-range effect. Apparently the LiDAR used in SemanticKITTI (HDL-64E) has already been calibrated for the near-range effect, and the same goes for the Pandora LiDAR used in SemanticPOSS. Hence the fitting step has been commented out.
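
For anyone reading along, this is roughly what the (commented-out) `p = np.poly1d(fit)` would be used for; the threshold, polynomial degree, and function name below are placeholders, not the repo's actual code:

```python
import numpy as np

def near_range_correction(ranges, intensities, deg=3, near_thresh=5.0):
    """Assumed sketch: fit a polynomial to close-range intensity and divide it out."""
    corrected = intensities.astype(float).copy()
    near = ranges < near_thresh                    # placeholder near-range cutoff
    if near.sum() <= deg:                          # not enough points to fit reliably
        return corrected
    fit = np.polyfit(ranges[near], intensities[near], deg)
    p = np.poly1d(fit)                             # evaluable correction curve
    corrected[near] /= np.maximum(p(ranges[near]), 1e-6)
    return corrected
```
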

hanin97 commented 7 months ago

Thank you for your reply, but I still have some questions about point 1:

  1. I have checked the inference code in `tasks/semantic/modules/user2.py`. The function `infer_subset` does not use the `near_range` value returned by the dataset, and `proj` contains neither `proj_reflectivity` nor `proj_near_range` when `reflectivity_flag=False` and `near_range_flag=False` in `salsanextearly_rxyzi2.yaml`. Does this mean that, when inferring on a point cloud, there is no need to compute the angles? (A sketch of how I understand the flag gating is below.)
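
To make my reading concrete, this is how I assume the two flags gate the projected input channels (placeholder logic, not taken from `user2.py`):

```python
def build_input_channels(cfg):
    # Assumed sketch: the flag names come from salsanextearly_rxyzi2.yaml,
    # everything else is my placeholder, not the repo's actual code.
    channels = ["range", "x", "y", "z", "remission"]   # base 5-channel input
    if cfg.get("reflectivity_flag", False):            # False in that config
        channels.append("reflectivity")                # would require predicted angles
    if cfg.get("near_range_flag", False):              # also False
        channels.append("near_range")
    return channels

print(build_input_channels({"reflectivity_flag": False, "near_range_flag": False}))
# -> ['range', 'x', 'y', 'z', 'remission']
```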