mdsumner opened 6 years ago
That search interface is unfortunately a bit too intractable for our current methods.
I put the current collection into
data_local/www.avl.class.noaa.gov/saa/products/AVHRR_L1b/
AVHRR_L1b is a directory name I made up; it contains 1978/, 1979/ and 1994/, based on the .D%y token in the file names, i.e. NSS.GHRR.ND.D94182 is 1994, Julian day 182.
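The year/day-of-year decoding can be checked at the shell (a sketch; `date -d` is GNU-specific):

```shell
# decode the D94182 token: %y is the 2-digit year (94 -> 1994),
# the remaining digits are the day of year (182)
# with GNU date: day 182 is 1994-01-01 plus 181 days
date -u -d "1994-01-01 +181 days" +%Y-%m-%d   # -> 1994-07-01
```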
The extensions are WI or GC, which are the receiving stations (you can read this stuff in the gdalinfo output).
As far as I know we must use the website search to get a file listing, and this can only be done one year at a time. I think it's about 32 GB per year for everything, so a fun rainy-day task.
This code is enough to build the gdalwarp command for a target raster. I believe GDAL 2.3.2 handles this better than 2.2.3, and possibly the facility is still unfinished for some problems. The GCPs are a subsample of the grid itself.
GDAL uses the internal GCPs (also obtainable with vapour::vapour_raster_gcp) in pixel, line, x, y, z form.
#' Generate gdalwarp command
#'
#' A raster object 'x' is used to input the extent, projection (CRS), and
#' resolution (in metres) of the target grid.
#'
#' The output is a templated string, which can be sprintf'ed to apply
#' input and output file name.
#'
#' Resolution is taken from 'x' (it was previously hardcoded at 1000x1000 m),
#' so the input can be any general grid that we are familiar with (WIP).
#'
#' @param x raster object, the reprojection target
#' @param resampling resampling method, default is 'bilinear'
#'
#' @return a templated gdalwarp command string, to be completed with sprintf(cmd, input, output)
#' @export
#'
#' @examples
generate_gdalwarp_l1b <- function(x, resampling = "bilinear") {
  ## requires the raster package for projection(), xmin() etc.
  p4 <- projection(x)
  te <- c(xmin(x), ymin(x), xmax(x), ymax(x))
  tr <- res(x)
  ## template slots: input output; then resampling t_srs te tr
  paste("gdalwarp %s %s",
        sprintf("-geoloc -r %s -t_srs '%s' -te %f %f %f %f -tr %f %f -co COMPRESS=LZW -dstnodata 0",
                resampling,
                p4,
                te[1], te[2], te[3], te[4],
                tr[1], tr[2]))
}
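For reference, a fully expanded command from this template might look like the following. This is a sketch only: the input name, output name, target CRS and extent are all placeholders, not values from this issue (real file names carry more tokens than shown), and the command is echoed rather than run.

```shell
# sketch: expand the template for one hypothetical granule
input=NSS.GHRR.ND.D94182.GC                   # placeholder file name
output=D94182.tif
t_srs='+proj=stere +lat_0=-90 +datum=WGS84'   # hypothetical polar target CRS
cmd="gdalwarp $input $output -geoloc -r bilinear -t_srs '$t_srs' -te -3000000 -3000000 3000000 3000000 -tr 1000 1000 -co COMPRESS=LZW -dstnodata 0"
echo "$cmd"
```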
#' Run GDAL warp
#'
#' All bands in the input are written to the output. The file is
#' compressed using LZW. Warping is done via geolocation arrays, which
#' GDAL just handles on its own.
#' @param input NOAA AVHRR filename
#' @param target raster object (composed of extent, crs, and resolution)
#' @param output output file name (*.tif) which is GeoTIFF
#' @param resampling resampling method, default is 'bilinear' (gdalwarp's own default is 'near')
#' @param overwrite if TRUE, pass '-overwrite' to gdalwarp; otherwise an existing output is an error
#'
#' @return raster object from 'output' file
#' @export
#'
gdalwarp <- function(input, target, output, resampling = "bilinear", overwrite = FALSE) {
  cmd <- sprintf(generate_gdalwarp_l1b(target, resampling = resampling), input, output)
  if (overwrite) {
    cmd <- paste(cmd, "-overwrite")
  } else {
    if (file.exists(output)) stop("output file exists, use 'overwrite = TRUE' or delete it")
  }
  status <- system(cmd)
  if (status != 0) stop("gdalwarp failed with status ", status)
  raster::brick(output)
}
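The R wrapper above could equally be driven from the shell over a batch of granules. A minimal sketch, with placeholder file names following the naming scheme above; the 'echo' is left in so nothing actually runs, drop it to execute gdalwarp:

```shell
# sketch: emit one warp command per granule (names are placeholders);
# ${f%.??} strips the two-letter station extension (.GC or .WI)
files="NSS.GHRR.ND.D94182.GC NSS.GHRR.ND.D94183.WI"
for f in $files; do
  echo gdalwarp "$f" "${f%.??}.tif" -geoloc -r bilinear -co COMPRESS=LZW -dstnodata 0
done
```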
I would like to arrange access to these products:
https://www.avl.class.noaa.gov/saa/products/search?datatype_family=AVHRR
The target is (see screenshot below)
The files look like
I don't yet know how to parse out target files for a given region, but maybe there's not so much data that it matters: we have 98 files that cross near Mawson in 1979 and they total 60 MB (AFraser found them by searching with the website).
I don't know what GC and WI stand for, or any of the other file tokens, except that "D94182" is "1994, day 182".
This is the GDAL driver: https://gdal.org/frmt_l1b.html. The files work well with gdalwarp, by specifying a target region and projection with the "-geoloc" option.
The application is fast ice detection (manual) from visual inspection of band 4 or 5.
gdalinfo -nogcp yields: