bowlerbear / distributionChange

a project to explore the multi-dimensional nature of species' distribution changes
MIT License

hull calculations #3

Open bowlerbear opened 2 years ago

bowlerbear commented 2 years ago

There are several ways to calculate the extent; so far I have used:

concaveman::concaveman(ydat)

adehabitatHR::mcp(ydat, percent=99)

alphahull::ahull(ydat, alpha = dist), where dist <- max(ydat$y_MTB) - min(ydat$y_MTB)
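For reference, here is a minimal runnable sketch of how those three calls fit together; the toy coordinates and the x_MTB / y_MTB column names are just stand-ins for the real data:

# Hedged sketch: the three extent measures on one set of occurrence points.
# 'ydat' with x_MTB / y_MTB columns is assumed; toy coordinates stand in
# for the real dragonfly data.
library(concaveman)
library(adehabitatHR)
library(alphahull)
library(sp)

set.seed(1)
ydat <- data.frame(x_MTB = runif(100, 0, 10),
                   y_MTB = runif(100, 0, 10))

# concave hull: matrix of coordinates in, matrix of hull vertices out
ch <- concaveman(as.matrix(ydat))

# minimum convex polygon on 99% of points (expects a SpatialPoints object)
mcp99 <- mcp(SpatialPoints(ydat), percent = 99)

# alpha hull, with alpha set to the latitudinal span of the points, as in the call above
dist <- max(ydat$y_MTB) - min(ydat$y_MTB)
ah <- ahull(ydat$x_MTB, ydat$y_MTB, alpha = dist)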

We should stick to the IUCN guidelines as much as possible. They say (https://support.ala.org.au/support/solutions/articles/6000208436-area-of-occupancy-and-extent-of-occurrence):

"Extent of occurrence can often be measured by a minimum convex polygon (the smallest polygon in which no internal angle exceeds 180 degrees and which contains all the sites of occurrence)"

but also

"The alpha hull value is used when calculating the Extent of Occupancy (EOO) for assessing conservation risk. It examines the distance between occurrence points and modifies the area to be included in the EOO based on a multiple (alpha - selected by the user) of the average distance between points. The lower the alpha value, the tighter the EOO will be around the occurrence points. For further information on alpha hulls, see: ICUN Standards and Petitions Subcommittee. 2017. Guidelines for Using the IUCN Red List Categories and Criteria. Version13. Prepared by the Standards and Petitions Subcommittee. Downloadable from https://www.iucnredlist.org/documents/RedListGuidelines.pdf (pp. 46-47)"

Can you help figure out which is the best R package to use?

coreytcallaghan commented 2 years ago

Hmm. This is pretty interesting actually. And definitely a gray area based on my reading of the guidelines.

This part, specifically, seems a bit wishy-washy: [screenshot of the relevant passage from the IUCN guidelines]

I guess the question becomes whether you want to go down the road of disjunct polygons (referred to as discrete patches in the guidelines) summed up to make the EOO, i.e., alpha hulls, which is what is 'recommended' to some extent. I guess it depends on how common this would be in the dragonfly data. I just looked through some of the 'data' of the maps on your app (the estimate doesn't seem to be displaying for the few species I tried initially) and it doesn't look like it is super common that, for example, a range is clearly in two disjunct parts of Germany. PLUS, you did a ton of work smoothing over noise and bias to produce the occupancy maps, and this probably is more likely to lead to 'contiguous ranges' for the majority of species? And maybe this is even some sort of deep implicit assumption? No idea. But especially given the relatively small spatial scale of Germany, alpha hulls are maybe not as helpful in this instance?

So, all that being said, I personally would vote against the alpha hull method, with its possibility of more than one disjunct polygon for a species' EOO.

And thus, I would go with either a convex hull or a concave hull. But, as the guidelines mention, the convex hull can have biases associated with it when comparing time points, which is why I actually really like the concave hull approach. To me it is a happy medium.

However, if you want to be slightly more in line with the IUCN guidelines, the convex hull is potentially best, and our justification could be this sentence:

Nevertheless, given the paucity of practical methods applicable to all spatial distributions, and the need to estimate EOO consistently across taxa, minimum convex polygon remains a pragmatic measure of the spatial spread of risk.

Either way, I would use the concaveman implementation. You can adjust the concavity, and a very high concavity (in principle, Inf) effectively gives you a convex hull.

Here is an example:

library(concaveman)

# example point data shipped with the package
data(points)
plot(points)

# tight concave hull
plot(concaveman(points, concavity = 1))
plot(points, add = TRUE)

# looser concave hull
plot(concaveman(points, concavity = 4))
plot(points, add = TRUE)

# it doesn't actually take Inf as a value,
# but you can just put some massive number to approximate a convex hull
plot(concaveman(points, concavity = 1000000))
plot(points, add = TRUE)

I guess a starting point would be to plot a scatterplot of the areas at concavity=2 against concavity=1000000000? I assume they will be strongly correlated in this instance.
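Something like the sketch below, assuming a hypothetical named list species_coords of two-column coordinate matrices (one element per species; the name and structure are stand-ins for the real data):

# Hedged sketch: hull area at a low vs. a very high concavity for each species.
# 'species_coords' is a hypothetical list of two-column coordinate matrices.
library(concaveman)
library(sf)

hull_area <- function(xy, concavity) {
  v <- concaveman(as.matrix(xy), concavity = concavity)   # matrix of hull vertices
  if (any(v[1, ] != v[nrow(v), ])) v <- rbind(v, v[1, ])  # close the ring if needed
  as.numeric(st_area(st_sfc(st_polygon(list(v)))))        # planar polygon area
}

areas <- t(sapply(species_coords, function(xy) {
  c(concave = hull_area(xy, concavity = 2),
    near_convex = hull_area(xy, concavity = 1e9))
}))

plot(areas[, "concave"], areas[, "near_convex"],
     xlab = "EOO area, concavity = 2",
     ylab = "EOO area, concavity = 1e9 (~ convex hull)")
abline(0, 1, lty = 2)  # 1:1 line for reference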

coreytcallaghan commented 2 years ago

I feel like that is a bit of a classic Corey rambling non-answer... but maybe my 2 cents are helpful. idk.

I've used the mcp approach before too, but presumably these things would have little effect on your findings.