frederikvand opened this issue 4 years ago
@ranjanan do we use Int16 by any chance?
@ViralBShah I can't seem to reproduce the problem on a smaller scale because it only occurs when more than 37k points are used. It is thus not caused by the values themselves, since small subsets with values of 50k, for example, work fine. The only solution at the moment is to cut all my scenarios into separate parts such that none exceeds 37k points and to feed the points in chunks. Furthermore, Circuitscape also cannot handle rasters with NA values below the Int16 range. Do you want me to link the whole scenario?
Circuitscape should be able to handle rasters with any NoData value as long as it's supported by the raster's type. What error message suggests that it's breaking? Could you post a link to a copy of the offending file so I can troubleshoot?
Dear @vlandau,
The dense matrix construction error related to the number of points can only be reproduced at the complete scale, I'm afraid; sorry about that. It takes about 25 minutes to reproduce the problem. The files can be found at the link below. This error occurs on the latest Circuitscape release and on master.
The InexactError: Int64(-3.4e38) seems to be solved on the new master (5.5.6). This update just went into effect and the error disappeared.
Thank you very much for having a look!
Yes, there was a commit on June 9th that addressed some type issues related to tif reading. So the NoData issue you mentioned is solved now?
@vlandau yup! The dense matrix construction keeps failing on the new master version though.
Okay good to hear at least one problem is solved! It looks like this other error could be caused by a bug or limitation of the SuiteSparse package. Might be best to let @ranjanan chime in here when he gets a chance.
Splitting each raster into smaller parts and using meta-parallelization seems to be a temporary workaround, but it's a lot of work for all my scenarios.
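(Editor's note: a minimal sketch of that meta-parallelization workaround, assuming one INI file per raster chunk. The chunk_*.ini file names are hypothetical; each would point at a point raster containing only that chunk's focal points.)

```julia
# Run one independent Circuitscape job per chunk, in parallel.
using Distributed
addprocs(4)                                   # one worker per concurrent chunk
@everywhere using Circuitscape

chunk_inis = ["chunk_$(i).ini" for i in 1:5]  # hypothetical per-chunk configs
pmap(Circuitscape.compute, chunk_inis)        # compute(::String) runs one INI
```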
The inexact error on master appears when TIFF point files are not written as INT4S (i.e., when they are left in the default float type).
```
ERROR: InexactError: Int64(-3.4e38)
Stacktrace:
 [1] Int64 at ./float.jl:710 [inlined]
 [2] convert(::Type{Int64}, ::Float32) at ./number.jl:7
 [3] read_raster(::String, ::Type{T} where T) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:469
 [4] _grid_reader(::Type{T} where T, ::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:93
 [5] read_polymap(::Type{T} where T, ::String, ::Circuitscape.RasterMeta; nodata_as::Int64, resample::Bool) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:139
 [6] read_polymap at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:139 [inlined]
 [7] read_point_map(::Type{T} where T, ::String, ::Circuitscape.RasterMeta) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:170
 [8] load_raster_data(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/io.jl:403
 [9] raster_pairwise(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/raster/pairwise.jl:18
 [10] _compute(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/run.jl:42
 [11] macro expansion at ./util.jl:234 [inlined]
 [12] compute(::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/Gyvjc/src/run.jl:31
```
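(Editor's note: a minimal reproduction of the conversion failure in the trace above. The ≈ -3.4e38 value is -FLT_MAX, commonly used as the NoData sentinel when float rasters are written without an explicit one.)

```julia
# Converting the Float32 NoData sentinel to Int64 throws, because
# the value lies far outside Int64's representable range.
nodata = -3.4f38      # Float32 literal, the typical float NoData value
Int64(nodata)         # ERROR: InexactError: Int64(-3.4f38)
```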
@frederikvand sorry that this issue has gone unaddressed for so long. I believe we have changed the IO methods since this issue was posted. Do you know if this issue is still occurring?
Dear @vlandau,
In the short term I won't have time to test whether the problem is resolved, but I will notify you as soon as I can. The paper using this method to predict migration routes for > 50k patches across many scenarios is currently accepted with revisions, and I think there may be more requests to model at this scale in the future. Not having to split each raster scenario into multiple parts would make the technique much more feasible, especially because access to supercomputing is improving! However, even for scenarios with fewer than 37k patches, meta-parallelization sped up the process a lot compared to the internal parallelization (36 cores). After 200 hours, the run with internal parallelization had still not finished on one scenario, compared to 9 hours with meta-parallelization. This is probably because calculation time increases exponentially, not linearly, with cell count. The same held for the number of points: meta-parallelization with 500 points (in chunks of 100) was faster than the internal parallelization.
With kind regards, Frederik Van Daele
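(Editor's note: a plausible reason for that speedup, assuming all-to-all pairwise mode: the number of pair solves grows quadratically with the number of focal points. The log further below is consistent with this: 151 points produce 11,325 solves, i.e. 151·150/2.)

```julia
# Pairwise solve counts: one big run vs. five chunks.
npairs(n) = n * (n - 1) ÷ 2
npairs(500)       # 124750 solves for a single run with 500 points
5 * npairs(100)   # 24750 solves for five chunks of 100 points
```

Note that chunking this way skips cross-chunk pairs, so it only matches a full run when those pairs aren't needed.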
Dear circuitscape associates,
This error keeps returning on random occasions, and I think it happens only when point rasters have more than 32,767 values (the Int16 maximum). I always use 64bit_indexing = true in the INI setup.
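(Editor's note: a quick sanity check of that threshold, assuming the 32,767 figure really does come from a 16-bit signed integer somewhere in the pipeline.)

```julia
typemax(Int16)    # 32767 -- the suspected ceiling
Int16(32768)      # ERROR: InexactError: trunc(Int16, 32768)
```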
```
[ Info: 2020-06-16 19:54:46 : Logs will recorded to file: log_file
[ Info: 2020-06-16 19:54:47 : Precision used: Double
[ Info: 2020-06-16 19:54:51 : Reading maps
[ Info: 2020-06-16 19:55:32 : Resistance/Conductance map has 90781250 nodes
[ Info: 2020-06-16 19:56:56 : Solver used: CHOLMOD
[ Info: 2020-06-16 19:56:56 : Graph has 90781250 nodes, 151 focal points and 1 connected components
[ Info: 2020-06-16 19:57:04 : Total number of pair solves = 11325
[ Info: 2020-06-16 20:06:01 : Time taken to construct cholesky factor = 520.0718625
[ Info: 2020-06-16 20:06:06 : Time taken to construct local nodemap = 5.278284888 seconds
[ Info: 2020-06-16 20:18:11 : Solving points 1 to 100
```
```
ERROR: LoadError: ArgumentError: dense matrix construction failed for unknown reasons. Please submit a bug report.
Stacktrace:
 [1] SuiteSparse.CHOLMOD.Dense{Float64}(::Ptr{SuiteSparse.CHOLMOD.C_Dense{Float64}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:222
 [2] Dense at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:238 [inlined]
 [3] allocate_dense(::Int64, ::Int64, ::Int64, ::Type{Float64}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:414
 [4] SuiteSparse.CHOLMOD.Dense{Float64}(::Array{Float64,2}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:810
 [5] \(::SuiteSparse.CHOLMOD.Factor{Float64}, ::Array{Float64,2}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/SuiteSparse/src/cholmod.jl:1710
 [6] _cholmod_solver_path(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool, ::Int64) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:414
 [7] single_ground_all_pairs(::Circuitscape.GraphData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}, ::Bool) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:59
 [8] single_ground_all_pairs at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/core.jl:53 [inlined]
 [9] _pt_file_no_polygons_path(::Circuitscape.RasData{Float64,Int64}, ::Circuitscape.RasterFlags, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/raster/pairwise.jl:61
 [10] raster_pairwise(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/raster/pairwise.jl:29
 [11] _compute(::Type{T} where T, ::Type{T} where T, ::Dict{String,String}) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/run.jl:42
 [12] macro expansion at ./util.jl:234 [inlined]
 [13] compute(::String) at /data/leuven/330/vsc33060/Julia/packages/Circuitscape/ZB2kf/src/run.jl:31
 [14] top-level scope at /data/leuven/330/vsc33060/input/Julia_scripts/patches_eu/03_try_400/01_current_100.jl:5
 [15] include(::Module, ::String) at ./Base.jl:377
 [16] exec_options(::Base.JLOptions) at ./client.jl:288
 [17] _start() at ./client.jl:484
in expression starting at /data/leuven/330/vsc33060/input/Julia_scripts/patches_eu/03_try_400/01_current_100.jl:5
```
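(Editor's note: a back-of-the-envelope check using the sizes in the log above, assuming, based on the stack trace, that the solver builds a dense right-hand-side block of nodes × chunk_size in Float64.)

```julia
nodes      = 90_781_250                          # "Graph has 90781250 nodes"
chunk_size = 100                                 # "Solving points 1 to 100"
gib = nodes * chunk_size * sizeof(Float64) / 2^30
println(gib, " GiB")                             # ≈ 67.6 GiB for one dense block
```

A failed allocation of that size inside allocate_dense could plausibly surface as exactly this "dense matrix construction failed" ArgumentError.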
With kind regards, Frederik Van Daele