Closed simon-tarr closed 4 years ago
How much time for the python version?
Good question - I haven't timed it using the old Python version. I can try doing that now. I'd be curious if others can replicate the "slow" processing with the files attached above, however.
Any update on how much time the Python version takes?
I've noticed that only one core of my machine is utilised. Is there a way to force it to use all 16 cores simultaneously?
@simon-tarr Add the lines below to your .ini file to enable parallel processing. There has been an issue with SharedArrays (which Circuitscape's parallel processing relies on) on Windows, so be wary of that.
parallelize = True
max_parallel = 15
Note that parallelization uses more memory, but it should be fine given your 32GB of RAM and the relatively small size of your resistance grid. If memory does become a problem, try using single precision (add precision = single to your .ini file).
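Putting the suggestions above together, the relevant portion of the .ini file might look like this (a sketch only; the exact section placement depends on your existing file, and the values are the ones suggested in this thread):

```ini
; Sketch of the parallelization settings suggested above
parallelize = True
max_parallel = 15       ; leave one of the 16 cores free for the OS
precision = single      ; optional: reduces memory use if RAM becomes a problem
```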
I am tempted to close this one and reopen it if it turns out to be slower than the earlier Python code.
More than likely I'm doing something wrong, but the processing time for my Circuitscape run seems very slow. I'm running Win10 on a 16-core Xeon with 32GB RAM. It's a reasonably large raster but not excessively so (and definitely smaller than the test runs documented in the recent pre-print).
Here's a snippet out of the Julia console's output:
The time printed to the console says it takes about 15 seconds to compute the first point. However, it actually takes several times that long, more like 45 seconds, before proceeding to the next point. Either way, I have to solve 224,115 pairs, which works out to about 38 days of processing at the current rate of one point per 15 seconds or so!
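The back-of-envelope estimate above checks out, and also shows what near-linear parallel scaling would buy (a rough sketch; the 15-core figure and perfect scaling are optimistic assumptions, not measurements):

```python
# Rough runtime estimate using the numbers reported in this issue:
# 224,115 pairs at ~15 s per pair on a single core.
pairs = 224_115
seconds_per_pair = 15
total_days = pairs * seconds_per_pair / 86_400  # 86,400 seconds in a day
print(f"single core: {total_days:.1f} days")    # -> single core: 38.9 days

# Hypothetical best case with max_parallel = 15 and perfect scaling:
cores = 15
print(f"15 cores:    {total_days / cores:.1f} days")  # -> 15 cores:    2.6 days
```

Even under these optimistic assumptions the run is long, which is why getting parallelization working matters here.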
I have uploaded my .ini file, resistance map and habitat raster here, as I'm wondering if I have misconfigured my .ini file or if there is some other issue with my input files. I've noticed that only one core of my machine is utilised. Is there a way to force it to use all 16 cores simultaneously?
Thanks in advance.