Open krober10nd opened 3 years ago
Just to note: I tried changing the hardcoded AVAILABLE_MEMORY = 4 GB in geodata.m to 100 GB on our big-memory workstation, but I was still getting downsampling-stride messages. Maybe that was just because the DEM is much finer resolution than the minimum element size; is it supposed to work like that?
Yes. If the DEM is much finer than the minimum element size, there's little point in reading everything in. I suppose we could add some more advanced up- and down-sampling methods.
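For intuition, here is a hedged sketch (not the actual geodata.m code) of how a fixed memory budget can turn into a read stride regardless of how large you set the budget, if the DEM is big enough. The variable names and the stride formula are illustrative assumptions:

```matlab
% Illustrative only: how a memory budget might map to a downsampling stride.
nx = 40000; ny = 40000;                 % hypothetical DEM dimensions
AVAILABLE_MEMORY = 4;                   % memory budget in GB (the hardcoded value)
gb_needed = nx * ny * 8 / 1e9;          % full DEM stored as doubles, in GB
% Striding both dimensions reduces memory by stride^2, hence the sqrt:
stride = max(1, ceil(sqrt(gb_needed / AVAILABLE_MEMORY)));
% The DEM would then be read as dem(1:stride:end, 1:stride:end).
```

With a sufficiently fine DEM, even a large budget can still produce a stride greater than 1, which would explain continuing to see the downsampling messages.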
Just to note: I would strongly recommend not interpolating from the geodata object, and instead using the original NetCDF file with msh.interp. Building a mesh from a geodata object is fine, but interpolation is more critical to get right and is sensitive to downsampling. The geodata object may have been downsampled to fit in RAM, so it's probably best to interpolate directly from the DEM.
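A minimal sketch of that recommendation, assuming msh.interp accepts a DEM filename directly (the mesh and file names here are placeholders):

```matlab
% Interpolate bathymetry onto the mesh from the original full-resolution
% NetCDF DEM, bypassing the (possibly downsampled) geodata object.
m = interp(m, 'original_dem.nc');   % m is an existing msh object
plot(m, 'type', 'b');               % inspect the interpolated bathymetry
```

This way the interpolation sees the DEM at its native resolution, even if the geodata object used for meshing was read with a stride.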