Closed echelleburns closed 1 year ago
All seems good except:

- `attribute variables are assumed to be spatially constant throughout all geometries` - we can get rid of this by only using the sf object's geometry (`st_geometry()`), since we are not interested in the attributes anyway. Ideally, clean up the code so that we don't get the message/warning at all rather than just suppressing it!

I was able to update the `classify_bathymetry()` and `get_enviro_regions()` scripts to have geometry as the last column (this is because I used `terra::separate()` to create a stack).
For the warnings/errors issue, I was able to remove the following errors/messages:

- `attribute variables are assumed to be spatially constant throughout all geometries` (in `get_knolls()`, `get_seamount_peaks()`, `get_seamounts_buffered()`, `get_geomorphology()`): fixed using the `st_geometry` method you suggested.
- `Warning message: [rast] the first raster was empty and was ignored` (in `get_geomorphology()`): fixed by creating a raster from the first element instead of binding to a blank raster.

There are additional errors/warnings that were popping up that I'm not sure there's a way around, so I've suppressed them for now, but we can revisit if needed:
- `although coordinates are longitude/latitude, st_intersection assumes that they are planar` (in `get_knolls()`, `get_seamount_peaks()`, `get_seamounts_buffered()`): this pops up because the raw data we're pulling is, by default, in lat/lon. I think the only way around this would be to first reproject both the raw data and the planning grid to a local CRS/projection. We tend to do that later in the script because we allow for options where the planning grid is not specified. Even so, users might still choose to keep data in WGS84. Thoughts on this?
- `Warning message: In st_is_longlat(x) : bounding box has potentially an invalid value range for longlat data` (in `get_knolls()`): this also comes from the raw data we pull, as the bounding box extends into the -181 to 181 value range. I'm not entirely sure how we want to tackle this, since the code still runs properly without having to manually reset the bounding box or crop the data to +/- 180.

Nice work on this. And I like your method for getting round the empty first element of the raster stack - I get this warning all the time but have never bothered figuring out a fix!
For the remaining messages: I think suppressing them is fine. As you said, we need the data in WGS84 and the function works ok. And I think the -181 to 181 is just some kind of quirk of the outputs that we can live with and won't cause issues as far as I can see.
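If blanket `suppressWarnings()` ever feels too broad, one option is to muffle only the known, benign messages so anything unexpected still surfaces. A sketch, assuming a hypothetical `suppress_known_warnings()` helper (base R only):

```r
# Hypothetical helper: muffle only warnings whose message matches a
# known, benign pattern; any other warning propagates as usual.
suppress_known_warnings <- function(expr, patterns) {
  withCallingHandlers(expr, warning = function(w) {
    msg <- conditionMessage(w)
    if (any(vapply(patterns, grepl, logical(1), x = msg))) {
      invokeRestart("muffleWarning")
    }
  })
}

res <- suppress_known_warnings(
  {
    warning("bounding box has potentially an invalid value range for longlat data")
    42
  },
  patterns = c("invalid value range for longlat")
)
# res is 42; the matched warning is silenced, unmatched ones still fire
```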
Unfortunately I did realise there is a problem with the `classify_depths` and `get_enviro_regions` sf outputs: the depth or environmental regions should not overlap with one another, but they do, e.g.
```r
depth_zones_square_sum <- dplyr::mutate(depth_zones_square, depth_zones_sum = rowSums(dplyr::pick(1:4), na.rm = TRUE))
plot(depth_zones_square_sum["depth_zones_sum"], border = FALSE)
```
```r
enviro_square <- get_enviro_regions(area_polygon = area, planning_grid = planning_square, num_clusters = 3)
enviro_square_sum <- dplyr::mutate(enviro_square, enviro_sum = rowSums(dplyr::pick(1:3), na.rm = TRUE))
plot(enviro_square_sum["enviro_sum"], border = FALSE)
```
Looks like there should be some kind of threshold for how much of a raster cell intersects the planning units??
Ok, just pushed an update:

- Now using `exactextractr::exact_extract()` to convert from raster into the sf planning grid; it only grabs the value that has the highest coverage in each cell (this also introduces a status bar in the console and takes a little longer than before).
- Updated `get_bathymetry()` for an extra buffer around the planning area so that we didn't get any edge effects in the depth zones.

Here's what the outputs should look like now:
```r
devtools::load_all()
# Housekeeping
area <- get_area(area_name = "Bermuda")
projection <- 'PROJCS["ProjWiz_Custom_Lambert_Azimuthal", GEOGCS["GCS_WGS_1984", DATUM["D_WGS_1984", SPHEROID["WGS_1984",6378137.0,298.257223563]], PRIMEM["Greenwich",0.0], UNIT["Degree",0.0174532925199433]], PROJECTION["Lambert_Azimuthal_Equal_Area"], PARAMETER["False_Easting",0.0], PARAMETER["False_Northing",0.0], PARAMETER["Central_Meridian",-64.5], PARAMETER["Latitude_Of_Origin",32], UNIT["Meter",1.0]]'
bathymetry <- get_bathymetry(area_polygon = area)
# Create planning area options
planning_square <- get_planning_grid(area, projection, option = "sf_square")
# Depth zones
depth_zones_square <- classify_depths(bathymetry, planning_grid = planning_square)
depth_zones_square_sum <- dplyr::mutate(depth_zones_square, depth_zones_sum = rowSums(dplyr::pick(1:4), na.rm = TRUE))
plot(depth_zones_square_sum["depth_zones_sum"], border = FALSE)
# Environmental zones
enviro_square <- get_enviro_regions(area_polygon = area, planning_grid = planning_square, num_clusters = 3)
enviro_square_sum <- dplyr::mutate(enviro_square, enviro_sum = rowSums(dplyr::pick(1:3), na.rm = TRUE))
plot(enviro_square_sum["enviro_sum"], border = FALSE)
```
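The `exact_extract()` "highest coverage" step boils down to picking, for each planning unit, the raster value whose summed coverage fraction is largest. A base-R sketch of that selection (hypothetical `pick_dominant()` helper, not the actual package call):

```r
# For one planning unit: given the raster values intersecting it and
# each intersection's coverage fraction, keep the value with the
# largest total coverage.
pick_dominant <- function(values, coverage) {
  totals <- tapply(coverage, values, sum)   # sum coverage per value
  names(totals)[which.max(totals)]
}

# "slope" covers 0.3 + 0.4 = 0.7 of the unit vs 0.5 for "shelf"
pick_dominant(c("shelf", "slope", "slope"), c(0.5, 0.3, 0.4))  # "slope"
```

Because each unit keeps exactly one dominant value, the resulting zone columns are mutually exclusive, which is what removes the overlap seen above.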
Nice work! Your code works fine for me. I had to fix `classify_depths` for raster because the masking moved and the depth zone naming was giving errors; I've pushed the fix.
A couple of extra to-dos/points:
Whoops, thanks for catching/fixing the raster error. I had meant to check that it still worked but it slipped my mind.
Just pushed changes for the order of depth and enviro regions.
Can confirm that I am classifying first, then masking (for rasters) / using `exact_extract()` (for sf).
Reordering works fine for me. I'm going to close this and start another issue for another sf related problem....
Please test the following functions to ensure they are working with the new sf capabilities.
Functions:
- `get_planning_grid()`
- `classify_depths()`
- `get_coral_habitat()`
- `get_enviro_regions()`
- `get_geomorphology()`
- `get_knolls()`
- `get_seamounts_buffered()`
Local tests using the following code yielded appropriate results on my end.