Closed VirginiaMorera closed 4 years ago
Hi @VirginiaMorera , yes, that is pretty normal behavior right now. I was recently given a suggestion for updating the algorithm to make it go much faster, but haven't made the time to implement it yet. What's your timeline for needing these thinned data?
Hi, thanks for the answer!
I was just testing the method to compare it to other methods of thinning and subsetting data I've been using so far, but I don't have a specific timeline in mind. However, the results I got from smaller datasets do seem interesting, so I'd be happy to provide test runs with my data for the improved algorithm once you have it implemented!
For more info: my pattern was fairly clustered in some areas, so there were lots of points within the 10 km radius I had set as the thinning parameter, which I imagine increases the running time. I tried again with the same dataset but with a 1 km thinning radius, and it was still running after more than 48 hours before I gave up and stopped it.
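For heavily clustered data like this, one possible workaround is to pre-thin on a coarse spatial grid before handing the points to `spThin`, so the pairwise-distance step sees far fewer records. This is only a sketch, not part of the package: `grid_prethin` is a hypothetical helper, the column names `Longitude`/`Latitude` mirror the call in this thread, and the 0.1-degree cell size (very roughly 10 km near the equator) is an assumption to tune.

```r
# Hypothetical pre-thinning sketch (base R, not part of spThin):
# keep at most one record per grid cell to shrink the dataset
# before running the real distance-based thinning.
grid_prethin <- function(df, cell = 0.1) {
  # Bin each point into a cell by rounding its coordinates
  key <- paste(round(df$Longitude / cell), round(df$Latitude / cell))
  df[!duplicated(key), ]  # first record per occupied cell
}
```

After this step the retained points could still be passed through `thin()` with the original `thin.par`, since the grid only guarantees coarse spacing, not the exact distance criterion.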
Hello,
I am trying to run the `thin` function on a dataset with ~15,000 data points. I expected the run time to grow steeply as the number of points increased, but this call:
```r
thin(loc.data = V, lat.col = "Latitude", long.col = "Longitude",
     spec.col = "Colony", thin.par = 10, reps = 100,
     locs.thinned.list.return = TRUE, write.files = TRUE, max.files = 5,
     out.dir = "spthin_test/", out.base = "V_thinned",
     write.log.file = TRUE, log.file = "V_spThin_log_file.txt",
     verbose = TRUE)
```
where `V` is the dataset with ~15,000 positions and `Colony` contains just one level, had been running for 14 hours without finishing when I force-stopped it.
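One way to see how the run time scales before committing to the full dataset is to time `thin()` on growing random subsets. This is a sketch using the same arguments as the call above (with `reps = 1` and file writing disabled purely to probe scaling); the subset sizes are arbitrary.

```r
# Timing sketch: probe how thin() scales with the number of points.
library(spThin)

for (n in c(500, 1000, 2000, 4000)) {
  sub <- V[sample(nrow(V), n), ]  # random subset of the full dataset V
  cat("n =", n, "\n")
  print(system.time(
    thin(loc.data = sub, lat.col = "Latitude", long.col = "Longitude",
         spec.col = "Colony", thin.par = 10, reps = 1,
         locs.thinned.list.return = TRUE, write.files = FALSE,
         verbose = FALSE)
  ))
}
```

If the elapsed time roughly quadruples each time the subset doubles, that points to quadratic (or worse) scaling in the number of points, which would explain a multi-day run at 15,000 points.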
I am running this on an i7 machine with 16 GB of RAM.
Is this normal behaviour, or is something weird happening?
Thanks!