Open mzarowka opened 1 year ago
Can you confirm that the below are the relevant steps in your script to get to the error? Here I am using a SpatRaster with no cell values, so the memory problem will not occur.
wref <- rast(ncol=1600, nrow=35000, nlyr=3, xmin=0, xmax=1600, ymin=0, ymax=35000)
roi <- ext(c(340, 1560, 700, 34775))
e <- c(as.vector(roi)[1:2], as.vector(ext(wref)[1:2]))
x <- crop(wref, e)
x <- aggregate(x, fact=c(nrow(x), 1), fun = "mean")
ext(x) <- roi
y <- disagg(x, fact=c(ymax(x), 1))
#class : SpatRaster
#dimensions : 34775, 1220, 3 (nrow, ncol, nlyr)
#resolution : 1, 0.9798706 (x, y)
#extent : 340, 1560, 700, 34775 (xmin, xmax, ymin, ymax)
#coord. ref. :
And can you please share the output of show(whiteref), so that it is clearer what you are dealing with?
Hi,
I ran your code with my whiteref and, as you wrote, these are the relevant steps leading to the std::bad_alloc:
wref <- terra::rast("...capture/WHITEREF_PRI_22_PRI_22_AB_2023-01-16_13-44-50.raw")
roi <- ext(c(340, 1560, 700, 34775))
e <- c(as.vector(roi)[1:2], as.vector(ext(wref)[1:2]))
x <- crop(wref, e)
x <- aggregate(x, fact=c(nrow(x), 1), fun = "mean")
ext(x) <- roi
y <- disagg(x, fact=c(ymax(x), 1))
Error: std::bad_alloc
The raw white reference looks like this, per show(whiteref):
class : SpatRaster
dimensions : 100, 2184, 476 (nrow, ncol, nlyr)
resolution : 1, 1 (x, y)
extent : 0, 2184, 0, 100 (xmin, xmax, ymin, ymax)
coord. ref. :
source : WHITEREF_PRI_22_PRI_22_AB_2023-01-16_13-44-50.raw
names : 397.65, 398.84, 400.02, 401.21, 402.39, 403.58, ...
Hi, a disclaimer: I'm not actually an R programmer, so I may be missing something simple. Perhaps I'm disaggregating in the wrong way?
I'm using terra to work with hyperspectral rasters of geological material (sediment cores). These are pretty big files, with around 400 to almost 500 wavelengths (i.e., layers), that can easily exceed 60 GB. terra is a perfect solution, as operations on data of that size in memory are rather hard and we are targeting different hardware setups. Much appreciation for your hard work, devs!
At one point the so-called white reference (a small white object) has to exactly match the extent of the captured data (a long sediment core). We achieve this by first cropping the reference to the same width, then aggregating it into a single mean row, and finally disaggregating that mean row to match the captured data's extent and cell size.
This works fine with smaller subsets and with datasets of approx. 400 layers, but with datasets of approx. 500 layers it cannot allocate space. With subsets of around 50-100 layers it works really fast.
I'm not sure how to provide a more reproducible example, because the failure only occurs with very large datasets.
I tried using different temp directories on SSD drives (making sure more than the required space is available) and increasing the number of steps with:
terra::terraOptions(steps = 1000, tempdir = "e:/tmp")
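Other terraOptions arguments also influence memory use; this is just a sketch of the settings I know of, and exact availability may depend on the installed terra version:

```r
library(terra)

# Lower the fraction of available RAM terra will try to use (default 0.6),
# force intermediate results to be processed via temp files on disk,
# and point the temp files at a fast SSD.
terraOptions(memfrac = 0.4,      # use at most ~40% of RAM
             todisk  = TRUE,     # always write intermediate results to disk
             tempdir = "e:/tmp") # same SSD temp directory as above

# Print the current settings to verify
terraOptions()
```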
I'm working on a Windows 11 machine with 32 GB RAM.