Open ghost opened 3 years ago
There are a few options here. You could go with conservative parameters that under-select trees and then manually add the missing ones with the map.pick function, or you could use nearest-neighbor metrics from the fastPointMetrics function to pre-select likely tree regions. My workflow is: normalize, voxelize, apply fastPointMetrics with Verticality and Eigentropy, filter the voxelized point cloud to high Verticality (between 80 and 95) and low Eigentropy (below 0.03), then pass the filtered point cloud to the treeMap function with map.hough.
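A minimal sketch of that pipeline, assembled from the calls discussed in this thread (the file name and parameter values are placeholders; tune them to your data):

```r
library(TreeLS)

# read the cloud and normalize, dropping classified ground points
tls = readTLS("cloud.laz")  # placeholder file name
tls = tlsNormalize(tls, keep_ground = FALSE)

# thin by voxelization to speed up the metrics
thin = tlsSample(tls, smp.voxelize(spacing = 0.05))

# compute only the two metrics of interest
all_metrics = fastPointMetrics.available()
my_metrics  = all_metrics[c(11, 16)]  # Eigentropy and Verticality
thin = fastPointMetrics(thin, ptm.knn(10), my_metrics)

# keep near-vertical, low-entropy points (probable stems)
stems = filter_poi(thin, Verticality > 80, Verticality < 95, Eigentropy < 0.03)

# map tree positions from the filtered cloud
map = treeMap(stems, map.hough(min_h = 1.5, max_h = 4))
```

The key idea is that treeMap runs on the filtered (stem-only) cloud, not the full one, which makes the Hough detection far less noisy.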
If you want to get rid of the ground points, it would be: tls = tlsNormalize(tls, keep_ground = F)
Pro tip on the normalization: after normalizing, filter out the points below zero with the following (tls being the original point cloud):
tls <- filter_poi(tls, Z > 0L)
@spokswinski Thank you for your contribution.
Even when using tls = tlsNormalize(tls, keep_ground = F), the ground points are not removed.
Could you please share your code with me as an example?
I didn't understand how to use these parameters:
INDEX METRIC ENABLED
1 1 N TRUE
2 2 MinDist FALSE
3 3 MaxDist FALSE
4 4 MeanDist TRUE
5 5 SdDist FALSE
6 6 Linearity FALSE
7 7 Planarity FALSE
8 8 Scattering FALSE
9 9 Omnivariance FALSE
10 10 Anisotropy FALSE
11 11 Eigentropy FALSE
12 12 EigenSum FALSE
13 13 Curvature TRUE
14 14 KnnRadius FALSE
15 15 KnnDensity FALSE
16 16 Verticality TRUE
17 17 ZRange FALSE
18 18 ZSd FALSE
19 19 KnnRadius2d FALSE
20 20 KnnDensity2d FALSE
21 21 EigenSum2d FALSE
22 22 EigenRatio2d FALSE
23 23 EigenValue1 FALSE
24 24 EigenValue2 FALSE
25 25 EigenValue3 FALSE
26 26 EigenVector11 FALSE
27 27 EigenVector21 FALSE
28 28 EigenVector31 FALSE
29 29 EigenVector12 FALSE
30 30 EigenVector22 FALSE
31 31 EigenVector32 FALSE
32 32 EigenVector13 TRUE
33 33 EigenVector23 TRUE
34 34 EigenVector33 TRUE
This is my code:
las = readLAS(file)
x = plot(las)
tls = tlsNormalize(las, keep_ground = F)
tls <- filter_poi(tls, Z > 0L)
thin = tlsSample(tls, smp.voxelize(spacing = 0.05))
all_metrics = fastPointMetrics.available()
my_metrics = all_metrics[c(11, 16)]
tls = fastPointMetrics(thin, ptm.knn(10), my_metrics)
map = treeMap(thin, map.hough(min_h = 1.5, max_h = 4, min_density = 0.00001), positions_only = F)
a = treeMap.merge(map, d = 0.5)
map
length(a$TreeID)
add_treeMap(x, map, color = 'yellow', size = 4)
tls = treePoints(tls, map, trp.crop())
add_treePoints(x, tls, size = 4)
add_treeIDs(x, tls, cex = 2, col = 'yellow')
tls = stemPoints(tls, stm.hough(min_density = 0.00001))
add_stemPoints(x, tls, color = 'red', size = 1)
inv = tlsInventory(
  tls,
  d_method = shapeFit(shape = "circle", algorithm = "ransac")
)
seg = stemSegmentation(tls, sgt.ransac.circle(n = 15))
add_stemSegments(x, seg, color = 'white', fast = T)
df = data.frame(inv)
df
Thanks
Okay, let's add a few things to this.
Some points are left below the ground as an artifact of how the ground points are classified. We use the following to get rid of them:
tls = tlsNormalize(tls, keep_ground = F)
tls <- filter_poi(tls, Z > 0L)
Then, to use two of the parameters you listed from all_metrics, we pick out 16 (Verticality) and 11 (Eigentropy) to highlight probable stem locations; the result is a point cloud called "Stems". We use the 50 closest points in ptm.knn, which may be overkill and may slow down your processing depending on your machine.
all_metrics = fastPointMetrics.available()
my_metrics = all_metrics[c(16, 11)]
Stems = fastPointMetrics(thin, ptm.knn(50), my_metrics)
After that, we filter by the values of interest (high Verticality and low Eigentropy) and create a tree map using the treeMap function.
tlsfilter <- filter_poi(Stems, Verticality > 80, Verticality < 95)
tlsPlot(tlsfilter)
tlsfilter <- filter_poi(tlsfilter, Eigentropy < 0.03)
tlsPlot(tlsfilter)
map = treeMap(tlsfilter, map.hough(min_h = 2, max_h = 4, min_votes = 1), merge = 0)
From there, we plot the point cloud, classify tree regions, and then classify stem points.
x = plot(tls)
add_treeMap(x, map, color = 'yellow', size = 2)
# classify tree regions
tls = treePoints(tls, map, trp.crop())
treepoints <- add_treePoints(x, tls, size = 4)
plot(treepoints)
add_treeIDs(x, tls, cex = 2, col = 'yellow')
# classify stem points
tls = stemPoints(tls, stm.hough(h_step = 0.2, h_base = c(0.05, 0.25), min_votes = 1))
add_stemPoints(x, tls, color = 'red', size = 8)
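Once the stem points are classified, the inventory step from your own script should work on the result; a minimal sketch reusing your own calls (the RANSAC circle fit is the same one from your original code):

```r
# estimate per-tree position, height and diameter via RANSAC circle fits
inv = tlsInventory(
  tls,
  d_method = shapeFit(shape = "circle", algorithm = "ransac")
)
df = data.frame(inv)
df
```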
@tiagodc @caiohamamura
I'm new here and I'm having a lot of difficulty.
I don't work with high point density, so I tested several parameters to try to find the best result.
The classifier often finds more than one tree at the same location. That's why I applied the treeMap.merge trick, but now it always identifies one fewer tree than it should.
At the same time, setting minimum and maximum heights in map.hough (min_h, max_h, min_density) makes it select even fewer trees, so I need to reduce min_density even more, and it still doesn't identify everything.
Could you suggest an example so that I can better understand this process?
Another thing I find interesting: I do the normalization, but when I ask it to remove the ground points, it doesn't do it. In addition, an error message appears; apparently the point cloud was not normalized.