Closed fab4ap closed 3 years ago
Thank you for reporting this. This may be caused by using terra objects that have been saved to disk, e.g. with a previous session. Can you reproduce it in a fresh R session --- that is without loading data from previous sessions? If you can reproduce it, it would be great if you can share the data required or point to it (you can email me).
Will do and let you know. Trying to switch to sf and terra! Thanks very much. Prasad
Yes - it looks like that was the problem. I could not reproduce the error when I started fresh. The only difference is that I used R 4.0.4 this time (that should not make a difference, I think, since I'm using the same version of terra). Looks like the error is documented for stored terra objects. I will let you know if I encounter it again. Also, I still have to crisscross among raster, sp, sf and terra in my work. Terra's speed compared to raster is amazing. Here are my notes that may help terra:
1) elva = raster::projectRaster(elv, res=1000, crs=usaea, method="bilinear")
   # Project lcc to usaea; wish there was an equivalent in terra's project()

2) cntyElv_cv = terra::extract(elvausc, uscntysv, fun=function(x, ...) { round((sd(x)/mean(x))*100) }, method="simple", touches=F)
   # Need to join the dataframe to the main cover; wish there was an sp=T option like raster::extract

Thanks very much! :)
Thanks for letting me know and for your suggestions
Instead of this
elva = raster::projectRaster(elv, res=1000, crs=usaea, method="bilinear")
# Project lcc to usaea; Wish there was this equivalent in terra's project()
You can do
elva <- terra::project(elv, usaea, method="bilinear")
I could of course add the res= argument. I did not initially do that because I think it is better to explicitly set the extent and resolution by providing another SpatRaster as the second argument. Ensuring that raster data is well aligned is of prime importance, and a lot of trouble ensues from not doing so.
Right now, you can do
# create an empty template
temp <- terra::project(rast(elv), usaea)
res(temp) <- 1000
# project using template
elva <- terra::project(elv, temp, method="bilinear")
But if you do not care about alignment, that is three steps instead of one...
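For later readers: newer terra releases did eventually add a res= argument to project(), making the one-step call possible (this is an assumption about the installed version --- check ?terra::project). A minimal sketch:

```r
library(terra)

# sample elevation raster shipped with terra (lon/lat)
elv <- rast(system.file("ex/elev.tif", package="terra"))

# one-step reprojection with an explicit target resolution;
# the res= argument is only available in more recent terra versions
elva <- project(elv, "EPSG:32631", res=1000, method="bilinear")
```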
This one
cntyElv_cv = terra::extract(elvausc, uscntysv, fun=function(x, ...) { round((sd(x)/mean(x))*100)}, method="simple", touches=F)
# Need to join the dataframe to the main cover - wish there was a sp =T option like raster::extract
Seems unnecessary as you can do
uscntysv$cntyElv_cv <- terra::extract(elvausc, uscntysv, fun=function(x, ...) { round((sd(x)/mean(x))*100)}, method="simple", touches=F)
Thanks so much. Yes, sp=T is unnecessary when you can do something nifty like what you have suggested! Also, I'd like to report another issue, this time with terra::zonal(); it looks like a bug. I report the output from terra::zonal and raster::zonal below, with comments. Please let me know if you need more details. Also, the CppMethod__invoke_notvoid error pops up every time I save an image to disk and retrieve it in the next session, forcing me to use raster --- is there a workaround/solution?
Elevation raster:

elvusam
# class       : SpatRaster
# dimensions  : 5804, 9224, 1  (nrow, ncol, nlyr)
# resolution  : 500, 500  (x, y)
# extent      : -2354530, 2257470, -1336771, 1565229  (xmin, xmax, ymin, ymax)
# coord. ref. : +proj=aea +lat_0=37.5 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs
# source      : memory
# name        : elv
# min value   : -102.2507
# max value   : 4244.14

Zonal raster:

uscnty0p5k
# class       : SpatRaster
# dimensions  : 5804, 9224, 1  (nrow, ncol, nlyr)
# resolution  : 500, 500  (x, y)
# extent      : -2354530, 2257470, -1336771, 1565229  (xmin, xmax, ymin, ymax)
# coord. ref. : +proj=aea +lat_0=37.5 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs
# source      : memory
# name        : cntytr
# min value   : 1001
# max value   : 56045

cntyavgelv = terra::zonal(elvusam, uscnty0p5k, mean)
# FLAG: takes ~5+ minutes; raster::zonal is much faster - pls. see below

summary(cntyavgelv)
#      zone            elv
#  Min.   : 1001   Min.   :   5.576
#  1st Qu.:19048   1st Qu.: 174.823
#  Median :29217   Median : 295.595
#  Mean   :30699   Mean   : 472.160
#  3rd Qu.:46012   3rd Qu.: 518.807
#  Max.   :56045   Max.   :3466.306
#                  NA's   :276
#### FLAG: has lots of NAs

#### Try raster::zonal
cntyavgelvr = raster::zonal(raster(elvusam), raster(uscnty0p5k), mean)
#### NOTE: takes less than a minute!

summary(cntyavgelvr)
#      zone           value
#  Min.   : 1001   Min.   :   1.756
#  1st Qu.:19048   1st Qu.: 144.489
#  Median :29217   Median : 278.044
#  Mean   :30699   Mean   : 442.102
#  3rd Qu.:46012   3rd Qu.: 485.775
#  Max.   :56045   Max.   :3466.306
#  NA's   :1       NA's   :1
#### NOTE: the output from raster::zonal looks more reasonable
Could it be due to different default NA handling? What do you get if you do: cntyavgelv = terra::zonal(elvusam, uscnty0p5k, mean, na.rm=TRUE) ?
Aha! You are right --- raster::zonal has na.rm=TRUE by default and terra::zonal doesn't, and that tripped me up. However, raster::zonal is faster. I am also finding some issues with terra::extract, with all the edge polygons reporting NA (in spite of the touches argument). I will probe into that more (probably a simple issue related to NAs). Thanks very much.
I am using na.rm=FALSE because that is what base R functions use --- but I have my doubts; you would almost never want to use zonal or extract with na.rm=FALSE.
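The consequence of the differing defaults can be illustrated with base R's mean() itself (a minimal sketch; the defaults noted here are as reported in this thread, so check the documentation of your installed versions):

```r
# base R functions (and terra::zonal at the time) default to na.rm=FALSE,
# so a single NA cell makes the whole zone NA:
mean(c(1, 2, NA))                 # NA
mean(c(1, 2, NA), na.rm = TRUE)   # 1.5

# raster::zonal reportedly defaults to na.rm=TRUE,
# which is why its output had far fewer NA zones
```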
Thanks for pointing out that terra::zonal is slower than raster::zonal --- that should not be necessary; and I will have a look.
As for terra::extract, it was much more efficient (in terms of time) for me to just extract the pixel values (in my case from a RasterStack) using one .shp file with hundreds of polygons/features, and then apply an aggregate function to calculate statistics based on polygon ID. Maybe you will find this helpful. It was faster than using a function to calculate the statistic within terra::extract, but in general (regardless of the statistic-calculation approach) terra::extract was much faster than raster::extract.
extract_data <- terra::extract(stack, buffer)
mean_values <- apply(extract_data[, 2:11], 2, function(x) {
  aggregate(x ~ ID, data = extract_data, FUN = mean, na.rm = TRUE, na.action = na.pass)
})
m_v_df <- data.frame(Reduce(cbind, mean_values))
# because I worked on a rasterstack and wanted to get away from a list of lists
Aha, and I was only interested in pixels with a center point within the polygon so no 'touches' argument.
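The extract-then-aggregate pattern described above can be sketched self-contained with terra's sample data (the object and layer names here are illustrative, not from this thread):

```r
library(terra)

# build a small two-layer stand-in for the RasterStack
r <- rast(system.file("ex/elev.tif", package="terra"))
s <- c(r, r * 2)
names(s) <- c("b1", "b2")
v <- vect(system.file("ex/lux.shp", package="terra"))

# extract raw pixel values; the first column is the polygon ID
vals <- terra::extract(s, v)

# then aggregate per polygon ID outside of extract()
means <- aggregate(. ~ ID, data = vals, FUN = mean,
                   na.rm = TRUE, na.action = na.pass)
```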
Thank you so much for these helpful responses. This motivates me to continue using terra and report any issues --- I was about to give up on it! It certainly is poised to replace raster. I will encourage more of our team to use it, to expose any hidden bugs.
In my tests, terra::zonal is faster than raster::zonal when there are few zones --- in the example below, about 13 times.
library(terra)
f <- system.file("ex/elev.tif", package="terra")
r <- rast(f)
r <- disaggregate(r, 25, filename="temp.tif", overwrite=TRUE)
f <- system.file("ex/lux.shp", package="terra")
v <- vect(f)
z <- rasterize(v, r, "NAME_2")
system.time(zonal(r, z, mean, na.rm=TRUE))
# user system elapsed
# 0.12 0.11 0.24
library(raster)
rr <- raster(r)
zz <- raster(z)
system.time(zonal(rr, zz, mean, na.rm=TRUE))
# user system elapsed
# 3.16 0.04 3.20
But this changes when there are many zones (compare US states with US counties). raster is faster for the counties --- and not affected by the number of zones.
# remotes::install_github("rspatial/geodata")
library(geodata)
elev <- geodata::elevation_30s("USA", path=".")
cnts <- geodata::gadm("USA", level=2, path=".")
# US states
zones <- rasterize(cnts, elev, "NAME_1")
system.time(z1 <- zonal(elev, zones, mean, na.rm=TRUE))
# user system elapsed
# 2.20 0.29 2.48
relev <- raster(elev)
rzones <- raster(zones)
system.time(z2 <- zonal(relev, rzones, "mean"))
# user system elapsed
# 15.03 0.27 15.32
# US counties
zones <- rasterize(cnts, elev, "NAME_2")
system.time(z <- zonal(elev, zones, mean, na.rm=TRUE))
# user system elapsed
# 55.83 0.24 56.19
rzones = raster(zones)
system.time(z2 <- zonal(relev, rzones, "mean"))
# user system elapsed
# 15.80 0.36 16.20
raster::zonal is significantly faster in my test. Pls. see below. Is it because elevation is not in memory?
elvusam  # Elevation in m
class       : SpatRaster
dimensions  : 5804, 9224, 1  (nrow, ncol, nlyr)
resolution  : 500, 500  (x, y)
extent      : -2354530, 2257470, -1336771, 1565229  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=aea +lat_0=37.5 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs
source      : elvusamr2.tif
name        : elv
min value   : -102.2507
max value   : 4244.14

uscnty0p5k  # US counties (FIPS)
class       : SpatRaster
dimensions  : 5804, 9224, 1  (nrow, ncol, nlyr)
resolution  : 500, 500  (x, y)
extent      : -2354530, 2257470, -1336771, 1565229  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=aea +lat_0=37.5 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs
source      : memory
name        : cntytr
min value   : 1001
max value   : 56045

system.time(terra::zonal(elvusam, uscnty0p5k, mean, na.rm=T))  # NOTE: na.rm=F by default
   user  system elapsed
 282.20    1.03  283.96

system.time(raster::zonal(raster(elvusam), raster(uscnty0p5k), mean))  # NOTE: na.rm=T by default
   user  system elapsed
  41.37    1.93   43.42
Is it because elevation is not in memory?
I am pretty sure it is because you have many zones; there are more than 3000 US counties. I need to fix the code to prevent this from happening.
In my test, terra is now a little bit faster than raster when using many zones in zonal. No doubt further improvements are possible, but I cannot yet spend much time on code optimization.
# remotes::install_github("rspatial/geodata")
library(geodata)
elev <- geodata::elevation_30s("USA", path=".")
cnts <- geodata::gadm("USA", level=2, path=".")
## US states
## terra
zones <- rasterize(cnts, elev, "NAME_1")
system.time(z1 <- zonal(elev, zones, mean, na.rm=TRUE))
# user system elapsed
# 1.30 0.26 1.56
## raster
relev <- raster(elev)
rzones <- raster(zones)
system.time(z2 <- zonal(relev, rzones, "mean"))
# user system elapsed
# 15.42 0.34 15.79
## US counties
zones <- rasterize(cnts, elev, "NAME_2")
## terra
system.time(z <- zonal(elev, zones, mean, na.rm=TRUE))
# user system elapsed
# 13.15 0.47 13.66
## raster
rzones = raster(zones)
system.time(z2 <- zonal(relev, rzones, "mean"))
# user system elapsed
# 15.61 0.42 16.05
Returning to the origin of this issue: today I got the same error (the complete error is below), just after installing and using terra yesterday. Removing the package, restarting R, and reinstalling terra solves the issue. However, IMHO this is not a sustainable workflow. It is difficult to reproduce the error now, but I will keep using the package over the next days and, if it happens again, I will report more details about it.
Error in .External(list(name = "CppMethod__invoke_notvoid", address = <pointer: (nil)>, :
NULL value passed as symbol address
In addition: Warning message:
In class(object) <- "environment" :
Setting class(x) to "environment" sets attribute to NULL; result will no longer be an S4 object
I received this same error. After a lot of searching and head scratching, I deleted the .Rproj.user folder for the project I was working on, and that appears to have resolved the error. In Windows, .Rproj.user is hidden.
R version 4.2.1 (2022-06-23)
Platform: x86_64-w64-mingw32
Edition: Windows 10 Pro, Version 21H2
Installed on 2/6/2022, OS build 19044.2130
@rhijmans, you are right that the issue is due to terra objects that have been saved to disk with a previous session. But I believe opening the R project in a fresh R session every time should not be the only solution.
@JeffreyASmith, deleting the .Rproj.user folder for the project did not resolve the issue in my case.
So, please suggest a sustainable solution. Thank you.
Spat* objects saved in an R session cannot be recovered. I do not think that is a problem, because saving R sessions is really bad practice. If you wanted to make this possible anyway, you would need to propose changes to the base R save and load functions.
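For anyone hitting the CppMethod__invoke_notvoid error with saved sessions: terra does offer a supported serialization route via wrap(), which packs a Spat* object (otherwise just a pointer into C++ memory) into a plain R object that survives saveRDS()/readRDS(). A sketch, assuming a reasonably recent terra version:

```r
library(terra)

r <- rast(system.file("ex/elev.tif", package="terra"))

# wrap() converts the SpatRaster into a packed, serializable object
saveRDS(wrap(r), "elev_packed.rds")

# in a later R session, unpack it back into a SpatRaster
r2 <- rast(readRDS("elev_packed.rds"))
```

Writing rasters to a file format such as GeoTIFF with writeRaster() and re-reading them with rast() remains the most robust option for large data.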
@rhijmans, thank you for the clear and convincing answer.
I'm using R 4.0.2 and terra 1.0.10 in Windows 10.

ntymsdelvt = terra::extract(elvausc, uscntysv, fun=sd, method="bilinear")
# Extracting elevation based on US counties
Error in .External(list(name = "CppMethod__invoke_notvoid", address = <pointer: (nil)>, :
  NULL value passed as symbol address
After this, the error cropped up randomly for other operations. Also, some of the terra objects appear to have been corrupted, as they report the same error. Looks like a bug.