For imageRes, the higher the better. The rest depends on the quality of the Google Earth model and the application the results will be used for.
Thanks Dimitris. Agreed. The above is a brain dump and a work-in-progress evaluation!
I am hoping to get away with a lower resolution, otherwise I anticipate requiring a lot of storage space. Given the limited detail of the GE model, I will try out all of these permutations across a complex high-rise area and evaluate using the highest-resolution result.
Shall we compare the GE results against an RGB drone-based result, if that is possible?
Sounds like a very good idea. I was just checking whether Heraklion has GE 3D data, and it does, so this comparison is possible. Can you send a shapefile or some coordinates of the drone data's spatial extent?
I will prepare and send it soon. Can you send a reminder by email on Thursday?
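For reference, a minimal sketch of pulling the extent out of a shapefile in R, in case that is the easiest way to share it. The sf package and the file name are assumptions, not part of this thread:

```r
library(sf)

# hypothetical path to the drone survey extent shapefile
extent <- st_read("heraklion_drone_extent.shp")

# bounding box in WGS84 lon/lat, ready to paste into an email
st_bbox(st_transform(extent, 4326))
```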
Updated the advised configuration for tough areas (e.g. dense high-rise): https://github.com/kitbenjamin/googleEarthImageCollection/commit/88ce8c2533ce92cf5562eba9e58ac66fd18adeb6. Benchmarking will be quite qualitative (no time for in-depth analysis at the moment).
Qualitative results:

- Run 9 (`100 TRUE 5 3800`) is the benchmark (smallest interval length, highest image resolution, most images per interval).
- Run 10 (`150 TRUE 5 3800`) is a reasonable image sample density (interval length of 150 m vs 100 m, 5 images per interval).
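To put the density difference in numbers, a minimal sketch using the tiling formula from the script below (images = intervals per side squared, times images per interval), assuming a 9 x 9 km region like the collection reported further down:

```r
# image counts for runs 9 and 10 over an assumed 9 x 9 km region
region_m <- 9000                     # assumed square region side length
imgs_per_interval <- 5               # both runs take 5 images per interval
n_run9  <- ceiling(region_m / 100)^2 * imgs_per_interval  # 100 m interval
n_run10 <- ceiling(region_m / 150)^2 * imgs_per_interval  # 150 m interval
n_run9   # 40500 images
n_run10  # 18000 images, ~2.25x fewer than the benchmark
```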
Back-of-envelope calculations of the image collection time and storage space requirements, turned into a quick script for reference/updating later:
```r
library(dplyr)
library(ggplot2)
library(cowplot)

# library(QOLfunctions)  # source of mround(); fallback definition below
# round x to the nearest multiple of base
mround <- function(x, base) base * round(x / base)

# permutations of region size and image interval, with 5 images per interval
dfOut <- expand.grid(regionXY_m = seq(10000, 15000, by = 5),
                     interval_m = seq(150, 250, by = 1),
                     imagePerinterval_m = 5)
maxImg <- 100000
minImg <- 1000

# total images = (intervals per side)^2 * images per interval
dfOut$totalImg <- with(dfOut, (ceiling(regionXY_m / interval_m)^2) * imagePerinterval_m)
# round to the nearest 5000 so the discrete fill scale stays readable
dfOut$totalImg <- mround(dfOut$totalImg, 5000)
dfOut <- dfOut %>%
  dplyr::filter(between(totalImg, minImg, maxImg))

# approx 1000 images per hour
timePerImg_s <- (60 * 60) / 1000
# approx 11 GB per 1000 images, i.e. ~12 MB per image
spacePerImg <- 12

basePlot <- ggplot(dfOut, aes(regionXY_m, interval_m)) +
  # scale_fill_distiller(palette = "Spectral") +
  xlab("Square region length (m)") +
  ylab("Image interval (m)") +
  theme_bw() +
  scale_x_continuous(guide = guide_axis(check.overlap = TRUE))

imageNo <- basePlot +
  geom_raster(aes(fill = as.factor(totalImg))) +
  labs(fill = "n#") +
  ggtitle("Number of images")

imageTime <- basePlot +
  # seconds -> days, to match the "days" legend label
  geom_raster(aes(fill = as.factor((totalImg * timePerImg_s) / 60 / 60 / 24))) +
  labs(fill = "days") +
  ggtitle("Collection time")

imageSpace <- basePlot +
  # MB -> TB
  geom_raster(aes(fill = as.factor((totalImg * spacePerImg) / 1000 / 1000))) +
  labs(fill = "TB") +
  ggtitle("Storage space required")

pltOut <- cowplot::plot_grid(imageNo, imageTime, imageSpace, ncol = 1, align = "hv")
ggsave(filename = "C:/Users/micromet/Desktop/regionStats.png",
       plot = pltOut, height = 7, width = 4, dpi = 600)
```
Tried a 200 m image interval and it is too coarse. 150 m is the recommended image interval, so long as it remains constant across the collection.
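For scale, a minimal sketch of what the two intervals imply for a 9 x 9 km collection, reusing the tiling formula and the time (3.6 s/image) and storage (12 MB/image) constants from the script above (all approximate):

```r
# images, collection time, and storage at 150 m vs 200 m intervals
# for a 9 x 9 km region with 5 images per interval
region_m <- 9000
for (interval_m in c(150, 200)) {
  n <- ceiling(region_m / interval_m)^2 * 5
  cat(interval_m, "m interval:", n, "images, ~",
      round(n * 3.6 / 86400, 2), "days, ~",
      round(n * 12 / 1e6, 2), "TB\n")
}
#> 150 m interval: 18000 images, ~ 0.75 days, ~ 0.22 TB
#> 200 m interval: 10125 images, ~ 0.42 days, ~ 0.12 TB
```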
Large-area image collection (9 x 9 km) stats.

Summary:
```r
dir <- "D:/GE/ImageCollection/regions/Central_London/googleEarthOut"
allFiles <- list.files(dir, recursive = TRUE, pattern = "movie", full.names = TRUE)
allFiles_info <- file.info(allFiles)

# total collection time: span between first and last file creation times
totalTime_s <- as.numeric(difftime(max(allFiles_info$ctime),
                                   min(allFiles_info$ctime), units = "secs"))
print(totalTime_s)
#> [1] 126902.5

# gap between consecutive images, with the median gap marked in red
plot(diff(sort(allFiles_info$atime)), ylim = c(0, 3), pch = 20, cex = 0.1)
abline(h = median(diff(sort(allFiles_info$atime))), col = "red")
```
Created on 2020-06-08 by the reprex package (v0.3.0)
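As a sanity check on the ~1000 images per hour figure assumed earlier, the same file listing gives the realised collection rate. A minimal sketch, reusing `allFiles` and `totalTime_s` from the snippet above:

```r
# realised throughput: images collected per hour over the whole run
rate_img_per_h <- length(allFiles) / (totalTime_s / 3600)
rate_img_per_h  # compare against the ~1000 images/hour assumption
```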
Which tour parameters are best? How many images, across what interval, and at what resolution? The row number (#) is the region ID; rows are the permuted parameters:
| done | # | intervalSize (m) | downwardImage | nSamplesAroundOrigin | imageRes |
|------|---|------------------|---------------|----------------------|----------|
| [x] | 1  | 100 | FALSE | 4 | 2000 |
| [x] | 2  | 150 | FALSE | 4 | 2000 |
| [x] | 3  | 100 | TRUE  | 5 | 2000 |
| [x] | 4  | 150 | TRUE  | 5 | 2000 |
| [x] | 5  | 100 | FALSE | 5 | 2000 |
| [x] | 6  | 150 | FALSE | 5 | 2000 |
| [x] | 7  | 100 | FALSE | 4 | 3800 |
| [x] | 8  | 150 | FALSE | 4 | 3800 |
| [x] | 9  | 100 | TRUE  | 5 | 3800 |
| [x] | 10 | 150 | TRUE  | 5 | 3800 |
| [x] | 11 | 100 | FALSE | 5 | 3800 |
| [x] | 12 | 150 | FALSE | 5 | 3800 |
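For reproducibility, a minimal sketch of how this parameter grid could be generated in R. Note it is not a full factorial: `downwardImage = TRUE` was only run with 5 samples, so the tested (downwardImage, nSamplesAroundOrigin) pairs are enumerated explicitly. The run IDs above do not follow this ordering exactly, so the table remains the reference:

```r
# the (downwardImage, nSamplesAroundOrigin) pairs actually tested
combos <- data.frame(downwardImage = c(FALSE, TRUE, FALSE),
                     nSamplesAroundOrigin = c(4, 5, 5))

# cross with interval size and image resolution (merge with no common
# columns returns the Cartesian product)
grid <- merge(expand.grid(intervalSize = c(100, 150),
                          imageRes = c(2000, 3800)),
              combos)
nrow(grid)  # 12 permutations, matching the table above
```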