Hello there!
I figured out that adding the filepath variable to the twdtwApplyParallel and twdtwApply functions removes the error, but I am now having another issue. The code fails at the same functions (twdtwApply and twdtwApplyParallel) with the error:
Error in rng[1, ] : incorrect number of dimensions.
Any idea what might be wrong?
Thanks again,
Natalie
@n-verde can you provide the data again so that I can reproduce the error?
Hello Victor,
I have restored the folder with the data on Google Drive, so I hope you can find it with the same link. As I mentioned in a previous post, the errors Error in `[<-`(`*tmp*`, rows, , value = d$value$value[[l]]) : subscript out of bounds and Error in `[<-`(`*tmp*`, rows, , value = B[[l]]) : subscript out of bounds in twdtwApply were solved by adding the filepath variable to the function.
Regarding the other error (Error in rng[1, ] : incorrect number of dimensions), I found out that my data (which I had downloaded using a custom script from Sentinel Hub) contained images with NA values and lots of clouds, since I was trying to use the whole time series to see what impact clouds would have and how much the GAM function could smooth the series. So I went through all the data, removed the defective images, and also limited the time series by removing all images with cloud cover > 10%. After doing all this, the function could run.
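In case it helps anyone, a rough sketch of the kind of screening I mean (not my exact code; cloud_cover here stands in for a hypothetical vector of per-image cloud percentages taken from the download metadata, aligned with the layers and with dates):
# sketch: drop layers with NA pixels or more than 10% clouds before twdtwRaster()
# 'cloud_cover' is a hypothetical numeric vector aligned with the layers of 'ndvi'/'ndwi' and with 'dates'
library(raster)
na_per_layer <- sapply(1:nlayers(ndvi), function(i) sum(is.na(getValues(ndvi[[i]]))))
keep <- na_per_layer == 0 & cloud_cover <= 10
ndvi  <- subset(ndvi, which(keep))
ndwi  <- subset(ndwi, which(keep))
dates <- dates[keep]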
Hi @n-verde ,
I know this is an old post, but I am facing the same issue you mentioned earlier, and I did not quite get what you meant by "adding the filepath variable to the function". If you still happen to have the above code around, would you mind explaining that portion to me? I am not very familiar with R myself.
Thanks for the trouble!
@Randy6668
I had changed the twdtwApply call in my code (see my answer of 11 Apr. 2019) to:
twdtw_dist = twdtwApplyParallel(x=raster_timeseries, y=temporal_patterns, weight.fun=log_fun,
filepath=exportFolder, format="GTiff", overwrite=TRUE, progress='text')
Hope that helps!
@n-verde
That resolved the subscript issue, thanks! I'm now having the same problem with the number of dimensions, but since I'm pretty sure I do not have any clouds or NA data (it is a custom raster), I'll look for solutions elsewhere. Thanks for the help!
Hi @n-verde, I read your comments above and they are very useful. I am currently also working with this package. When you ran the twdtwApply function, did you get an error like: Error in (function (classes, fdef, mtable) : unable to find an inherited method for function ‘trim’ for signature ‘"list"’? If yes, please let me know how you fixed it. Thank you, looking forward to your reply. Vicky
@Vickyle I'm sorry, as far as I can recall I didn't have such an error... It looks like a data dimension issue. Good luck, I hope you find a way!
Hi @n-verde, it's me again, Vicky. How did you deal with the error related to "incorrect number of dimensions" last time? I already removed the images with more than 10% cloud cover and still get the error. Please let me know what you did. Looking forward to your reply, thank you very much. Vicky
@Vickyle I removed the images that contained NaN values because of cloud masking.
Hey Victor and all, I had the same problem as described in this post and was able to solve it by adding the file path (thanks for your advice @n-verde)! But later, when this step is executed, I get an error in the twdtwClassify function: "Error in vv[, i] : incorrect number of dimensions". I am sure I do not have any NAs in my data (I changed them to 0 beforehand). Does anyone know what is causing this error? I would appreciate your help very much! Ursa.
Please install the new version from GitHub with remotes::install_github("vwmaus/dtwSat")
and see the example in: https://github.com/vwmaus/dtwSat/blob/89d5db9ff27999ef259258dc6502c1d63da4a1e8/examples/fast_twdtw.R#L53
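For reference, installing the development version would look roughly like this (assuming the remotes package is not installed yet):
install.packages("remotes")               # install 'remotes' first if it is not available
remotes::install_github("vwmaus/dtwSat")  # development version of dtwSat from GitHub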
Dear Victor,
I am trying to run your code for a TWDTW classification in a sample study area. I'm using Sentinel-2 L2A data (NDVI and NDWI) for 144 images throughout a year.
All works fine until I use the function "twdtwApplyParallel". It throws an error:
Error in `[<-`(`*tmp*`, rows, , value = d$value$value[[l]]) : subscript out of bounds
When running with the "twdtwApply" function, again, it fails with the error:
Error in `[<-`(`*tmp*`, rows, , value = B[[l]]) : subscript out of bounds
Do you have any idea what might be causing this? I suspect that there is something going on with my data, but I can't understand what. When I run the same script with your sample MODIS data it runs smoothly.
I have attached my script (text) along with the data I'm using (link to Google Drive).
Looking forward to your answer.
Thank you in advance,
Natalie
Data: https://drive.google.com/drive/folders/13t2G88J9zRp-KRRWBXI4Z5jvZCSVSBmx?usp=sharing
Code:
# This script is used for .....................
# remove all variables from workspace
rm(list = ls())

# ================= SETTINGS ========================================
# set strings
# set folder which has the spectra txt files
dataFolder <- "D:\\DTW_Data\\"

# ================= IMPORTS ==========================================
# import necessary libraries needed for the code
require(dtwSat)
require(caret)
require(sp)
require(tidyverse)

# ================= MAIN PROGRAM =====================================
print("start")

# ----------------- Read Data ---------------------------------------
# read the data
ndvi = brick(paste(dataFolder, "NDVIs.tif", sep = ""))
ndwi = brick(paste(dataFolder, "NDWIs.tif", sep = ""))
dates = scan(paste(dataFolder, "timeline", sep = ""), what = "dates")

# Products from sensors such as the Sentinels and Landsat usually have all pixels with
# the same date, therefore the argument doy is not needed.

# read csv sample file and the projection of the samples
# UTF-8 is because samples contain Greek characters!
field_samples = read.csv(paste(dataFolder, "samples.csv", sep = ""), encoding = "UTF-8")
field_samples = as.data.frame(field_samples, stringsAsFactors = FALSE)

# correct the column names
columnNames = c("longitude", "latitude", "from", "to", "label")
colnames(field_samples) = columnNames

# ATTENTION! SAMPLES MUST BE IN THE SAME PROJECTION AS THE IMAGES!!
proj_str = proj4string(ndvi)
proj_str                         # show the projection for check
head(field_samples, 5)           # show first 5 lines for check
print("Type and number of samples:")
table(field_samples[["label"]])

# -------------------------------------------------------------------
# produce an object with the time series of multiple satellite bands
raster_timeseries = twdtwRaster(ndvi, ndwi, timeline = dates)

# -------------------------------------------------------------------
# assess the separability of patterns before classifying the large data set -->
# first modify the field_samples table to not contain UTF-8 characters in label
label_values = unique(field_samples$label)
field_samples_mod = field_samples
field_samples_mod$label = as.character(field_samples_mod$label)
for (i in 1:length(label_values)) {
  field_samples_mod$label = gsub(unlist(as.character(label_values[i])), i, field_samples_mod$label)
}

# a) extract the time series of each field sample from our raster time series
field_samples_ts = getTimeSeries(raster_timeseries, y = field_samples_mod, proj4string = proj_str)

# b) perform the cross-validation
set.seed(1)
# Function 'twdtwCrossValidate' splits the sample time series into training and validation sets using stratified
# sampling with a simple random sampling within each stratum. A Generalized Additive Model (GAM)
# generates the smoothed temporal patterns based on the training samples.
# i) logistic weight function with steepness 'alpha' and midpoint 'beta', 'beta' is number of days
log_fun = logisticWeight(alpha = -0.1, beta = 50)
# ii) GAM smoothing formula
GAMformula = y ~ s(x, bs = "cc")  # function s sets up a spline model, with x the time and y a satellite band
# 'times' = different data partitions, each with 'p' percentage of the samples for training and '1-p' for validation
# 'freq' = frequency of the temporal patterns in days
cross_validation = twdtwCrossValidate(field_samples_ts, times = 10, p = 0.1, freq = 8,
                                      formula = GAMformula, weight.fun = log_fun)
cross_validation  # print cross validation results

# --------------------------------------------------------------------
# create temporal patterns in order to classify the raster time series -->
# i) randomly select p % of our samples for training and keep the remaining for validation
set.seed(1)
I = unlist(createDataPartition(field_samples_mod[, "label"], p = 0.5))
training_ts = subset(field_samples_ts, I)
validation_samples = field_samples_mod[-I, ]
# ii) create temporal patterns
temporal_patterns = createPatterns(training_ts, freq = 8, formula = GAMformula)
plot(temporal_patterns, type = "patterns") + theme(legend.position = c(.8, .25))

# ------------------------------------------------------------------
# classify the image time series with twdtwApply
tic <- proc.time()  ## start clocking global time
# (Run parallel TWDTW analysis)
beginCluster()
twdtw_dist = twdtwApply(x = raster_timeseries, y = temporal_patterns, weight.fun = log_fun,
                        overwrite = TRUE, format = "GTiff")
endCluster()
# clock global time
toc <- proc.time() - tic[3]
print("TIME:")
capture.output(toc)

plot(x = twdtw_dist, type = "distance")
land_cover_maps = twdtwClassify(twdtw_dist, format = "GTiff", overwrite = TRUE)
# plot results
plot(x = land_cover_maps, type = "maps")
plot(x = land_cover_maps, type = "area")
plot(x = land_cover_maps, type = "changes")
plot(x = land_cover_maps, type = "distance")

# ------------------------------------------------------------------
# assess classification accuracy
maps_assessment = twdtwAssess(land_cover_maps, y = validation_samples, proj4string = proj_str, conf.int = .95)
maps_assessment  # print maps assessment results