SKrisanski / FSCT


TypeError: 'numpy.float64' object cannot be interpreted as an integer #38

Open BoonlueKac opened 5 months ago

BoonlueKac commented 5 months ago

I ran FSCT on Windows 10 under Anaconda and tried to use the .las file from the test data, but I get `TypeError: 'numpy.float64' object cannot be interpreted as an integer`. Do you have any suggestions?

(FSCT-main) C:\FSCT\scripts>python run.py
Current point cloud being processed: C:/FSCT/data/test/example.las
Using default number of CPU cores (all of them).
Processing using 16 / 16 CPU cores.
Loading file... C:/FSCT/data/test/example.las
Saving file: C:\FSCT\data\test/example_FSCT_output/working_point_cloud.las
Saved.
Pre-processing point cloud...
Preprocessing took 4.8439555168151855 s
Preprocessing done

Is CUDA available? True
Performing inference on device: cuda
0/108 2/108 4/108 6/108 8/108 10/108 12/108 14/108 16/108 18/108 20/108 22/108 24/108 26/108 28/108 30/108 32/108 34/108 36/108 38/108 40/108 42/108 44/108 46/108 48/108 50/108 52/108 54/108 56/108 58/108 60/108 62/108 64/108 66/108 68/108 70/108 72/108 74/108 76/108 78/108 80/108 82/108 84/108 86/108 88/108 90/108 92/108 94/108 96/108 98/108 100/108 102/108 104/108 106/108 108/108
Loading file... C:/FSCT/data/test/example_FSCT_output/working_point_cloud.las
Choosing most confident labels...
Saving file: C:/FSCT/data/test/example_FSCT_output/segmented.las
Saved.
Semantic segmentation took 37.000956535339355 s
Semantic segmentation done
Loading segmented point cloud...
Loading file... C:/FSCT/data/test/example_FSCT_output/segmented.las
Making DTM...
DTM Done
Saving file: C:/FSCT/data/test/example_FSCT_output/DTM.las
Saved.
Plot area is approximately 0.003600000000003154 ha
Getting heights above DTM...
Saving file: C:/FSCT/data/test/example_FSCT_output/terrain_points.las
Saved.
Saving file: C:/FSCT/data/test/example_FSCT_output/stem_points.las
Saved.
Saving file: C:/FSCT/data/test/example_FSCT_output/vegetation_points.las
Saved.
Saving file: C:/FSCT/data/test/example_FSCT_output/cwd_points.las
Saved.
Saving file: C:/FSCT/data/test/example_FSCT_output/segmented_cleaned.las
Saved.
Post-processing took 1.8837203979492188 seconds
Post processing done.
Loading file... C:/FSCT/data/test/example_FSCT_output/stem_points.las
stempoints ['x', 'y', 'z', 'red', 'green', 'blue', 'label', 'height_above_DTM']
Loading file... C:/FSCT/data/test/example_FSCT_output/DTM.las
Loading file... C:/FSCT/data/test/example_FSCT_output/terrain_points.las
Loading file... C:/FSCT/data/test/example_FSCT_output/vegetation_points.las
Saving file: C:/FSCT/data/test/example_FSCT_output/ground_veg.las
Saved.
Loading file... C:/FSCT/data/test/example_FSCT_output/cwd_points.las
Canopy Cover Fraction: 0.7836691410392365
Understory Veg Fraction: 0.8886532343584306
Coarse Woody Debris Fraction: 0.40402969247083775
Making and clustering slices...
0 / 485
Traceback (most recent call last):
  File "C:\FSCT\scripts\run.py", line 54, in <module>
    FSCT(
  File "C:\FSCT\scripts\run_tools.py", line 44, in FSCT
    measure1.run_measurement_extraction()
  File "C:\FSCT\scripts\measure.py", line 833, in run_measurement_extraction
    cluster, skel = MeasureTree.slice_clustering(new_slice, self.parameters["min_cluster_size"])
  File "C:\FSCT\scripts\measure.py", line 778, in slice_clustering
    new_slice = cluster_hdbscan(new_slice[:, :3], min_cluster_size)
  File "C:\FSCT\scripts\tools.py", line 218, in cluster_hdbscan
    cluster_labels = hdbscan.HDBSCAN(min_cluster_size=min_cluster_size).fit_predict(points[:, :3])
  File "C:\Users\Boonlue\anaconda3\envs\FSCT-main\lib\site-packages\hdbscan\hdbscan.py", line 1243, in fit_predict
    self.fit(X)
  File "C:\Users\Boonlue\anaconda3\envs\FSCT-main\lib\site-packages\hdbscan\hdbscan.py", line 1205, in fit
    ) = hdbscan(clean_data, **kwargs)
  File "C:\Users\Boonlue\anaconda3\envs\FSCT-main\lib\site-packages\hdbscan\hdbscan.py", line 884, in hdbscan
    _tree_to_labels(
  File "C:\Users\Boonlue\anaconda3\envs\FSCT-main\lib\site-packages\hdbscan\hdbscan.py", line 78, in _tree_to_labels
    condensed_tree = condense_tree(single_linkage_tree, min_cluster_size)
  File "hdbscan\_hdbscan_tree.pyx", line 43, in hdbscan._hdbscan_tree.condense_tree
  File "hdbscan\_hdbscan_tree.pyx", line 109, in hdbscan._hdbscan_tree.condense_tree
TypeError: 'numpy.float64' object cannot be interpreted as an integer
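The traceback suggests that `min_cluster_size` reaches hdbscan's Cython `condense_tree` as a `numpy.float64`, and recent NumPy releases no longer accept a float where an integer is required. Besides upgrading hdbscan (see below), a local workaround could be to cast the parameter to a built-in int before the clustering call. This is only a minimal sketch, assuming the value arrives as a NumPy float; the helper name `cluster_hdbscan_safe` is illustrative and not FSCT's actual function:

```python
# Hedged workaround sketch, not the maintainers' fix: force min_cluster_size
# to a plain Python int so older hdbscan builds don't pass a numpy.float64
# into Cython code that newer NumPy rejects.
import hdbscan
import numpy as np

def cluster_hdbscan_safe(points: np.ndarray, min_cluster_size) -> np.ndarray:
    """Cluster the XYZ columns of a point array with HDBSCAN."""
    min_cluster_size = int(min_cluster_size)       # e.g. np.float64(30.0) -> 30
    clusterer = hdbscan.HDBSCAN(min_cluster_size=min_cluster_size)
    labels = clusterer.fit_predict(points[:, :3])  # one label per point, -1 = noise
    return labels
```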

(FSCT-main) C:\FSCT\scripts>

BoonlueKac commented 5 months ago

I updated hdbscan to the latest version, and it works now.
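For anyone hitting the same error: the thread doesn't state which hdbscan release fixes it, but a quick way to see what is installed in the active environment is below; upgrading (e.g. `pip install --upgrade hdbscan`, or the conda equivalent) should then pull in a newer build.

```python
# Print the hdbscan version installed in the currently active environment.
from importlib.metadata import version
print(version("hdbscan"))
```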

BoonlueKac commented 4 months ago

Thanks.