Open lvalnegri opened 3 months ago
UPDATE: nope, even dissolving partially filtered subsets does not work. The column has five values: two work out, one gives the limit error, and the other two give more trouble, with this explicit error:
```
> rmapshaper::ms_dissolve(y |> collapse::fsubset(livello == 'P1'), sys = TRUE, sys_mem = 64)
Allocating 64 GB of heap memory
[i] Snapped 6478 points
[dissolve] Dissolved 166 features into 1 feature
RangeError: Invalid string length
    at stringify (<anonymous>)
    at /usr/local/lib/node_modules/mapshaper/mapshaper.js:17474:15
    at Array.reduce (<anonymous>)
    at exportLayerAsGeoJSON (/usr/local/lib/node_modules/mapshaper/mapshaper.js:17456:41)
    at /usr/local/lib/node_modules/mapshaper/mapshaper.js:17555:19
    at Array.reduce (<anonymous>)
    at exportDatasetAsGeoJSON (/usr/local/lib/node_modules/mapshaper/mapshaper.js:17554:25)
    at /usr/local/lib/node_modules/mapshaper/mapshaper.js:17430:18
    at Array.map (<anonymous>)
    at exportGeoJSON (/usr/local/lib/node_modules/mapshaper/mapshaper.js:17425:24)
Error: Invalid JSON
```
I'll try dissolving and binding feature by feature...
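The group-by-group fallback mentioned above could be sketched roughly as follows (a sketch only, not tested against this dataset; it assumes `y` is the `sf` object and `livello` the grouping column from the snippets in this thread, and reuses the same `ms_dissolve()` arguments):

```r
# Sketch: dissolve each group separately, then bind the single-feature
# results back together, instead of dissolving the whole object at once.
library(sf)
library(rmapshaper)

parts <- lapply(unique(y$livello), function(v) {
  d <- rmapshaper::ms_dissolve(y[y$livello == v, ], sys = TRUE, sys_mem = 64)
  d$livello <- v  # keep track of which group this dissolved feature came from
  d
})
out <- do.call(rbind, parts)  # one dissolved feature per group
```

This sidesteps the serialization limit only if every individual group is small enough to round-trip through GeoJSON on its own, which (per the UPDATE above) is apparently not the case for all five values here.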
Can you try directly on the source file with command-line mapshaper? e.g.

```
$ mapshaper y.geojson -dissolve livello -o output.geojson
```
The initial error is (I think) due to R's inability to serialize a single string that long. Writing ndjson might help, but rmapshaper doesn't currently support that.
The second error looks like a mapshaper limit on exporting a single very long geometry. This issue might give you some pointers
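If the export limit bites even on the command line, one possible workaround (a sketch, assuming a Unix shell and the npm-installed mapshaper used above; the filenames are hypothetical) is to split by the grouping field first, dissolve each piece on its own, and merge the results, so no single run has to export the whole dataset at once:

```shell
# Split into one file per value of 'livello', dissolve each piece
# separately, then merge the dissolved pieces into one output file.
mkdir -p split dissolved
mapshaper y.geojson -split livello -o split/
for f in split/*.json; do
  mapshaper "$f" -dissolve -o "dissolved/$(basename "$f")"
done
mapshaper -i dissolved/*.json combine-files -merge-layers -o y_dissolved.geojson
```

Note this only helps with the per-run export size; a single group whose dissolved geometry is itself too long to stringify would still fail at the final merge/export step.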
Hi, I can't actually create the JSON object from sf; I get the same error as above, even after stripping coordinates down to 6 digits. I'll have a look tomorrow at the issue you linked, but I think I need to deal with this in a different way. Thanks for the feedback, anyway.
Do you have the original source file (shp, gpkg, geojson, etc.), or only the .rds?
Hi all,
While trying to dissolve an sf polygons object, grouping by one of its columns, I received this error:
I've found similar reports about data.table, and it looks like the problem is R's limit on reading/writing text more than 2 GB in size. The object is actually 1,846,143,660 bytes when saved as rds (that's the reason I don't post/link to it), so presumably much bigger as JSON. I reckon the only quick solution ATM is rbinding partial dissolutions, but is there a more elegant fix?
Thanks, Luca