slibby opened this issue 7 years ago
At a minimum, we can (and should) use the maxAllowableOffset query parameter that our query operation exposes to keep vertex counts reasonable.
@jgravois can you clarify the maxAllowableOffset parameter for me? Assuming we are querying in 4326 and the units are degrees of latitude/longitude, would I set it to 0.000005 or so?
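For reference, the parameter just rides along in the query string, and it is interpreted in the units of the output spatial reference (so degrees here). A hypothetical request (the service URL and layer are made up for illustration) might look like:

```
https://services.arcgis.com/example/arcgis/rest/services/Parcels/FeatureServer/0/query?where=1%3D1&outSR=4326&maxAllowableOffset=0.000005&f=geojson
```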
that sounds close enough for jazz. in esri-leaflet we let folks supply their own simplification factor and we recalculate the offset dynamically as they zoom in closer.
```js
simplify: function (map, factor) {
  // width of the current view in degrees of longitude
  var mapWidth = Math.abs(map.getBounds().getWest() - map.getBounds().getEast());
  // degrees-per-pixel, scaled by the caller's factor, becomes the offset
  this.params.maxAllowableOffset = (mapWidth / map.getSize().y) * factor;
  return this;
},
```
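To sketch how that might be wired up in practice (the service URL and the 0.5 factor below are hypothetical, not from this thread):

```js
// recompute maxAllowableOffset whenever the view settles, so features are
// simplified to roughly screen resolution at the current zoom level
var query = L.esri.query({
  url: 'https://services.arcgis.com/example/arcgis/rest/services/Trails/FeatureServer/0'
});

map.on('moveend', function () {
  query.simplify(map, 0.5).where('1=1').run(function (error, featureCollection) {
    if (error) { return console.error(error); }
    // featureCollection is GeoJSON with the reduced vertex counts
  });
});
```

Note that the factor just scales the computed degrees-per-pixel value; a smaller factor means a smaller offset and therefore more retained vertices.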
I have heard of one approach to this: do a simplification with a really small tolerance, e.g. a Douglas–Peucker simplification with a parameter of 1 cm, or use the OSM accuracy value. This shouldn't change the geometry much (or at all), and I think it will remove the extra nodes. The advantage of this approach is that you can run it on every geometry you get from the WFS source: geometries that aren't overnoded will be returned unchanged, and overnoded ones will be fixed up. No special cases to think about! This is just off the top of my head; you'll have to double-check.
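As a sketch of that idea using Turf's Douglas–Peucker implementation (Turf isn't a stated dependency here, and the `denode` helper name is made up; ~1 cm works out to roughly 1e-7 degrees at the equator):

```js
var turf = require('@turf/turf');

// ~1 cm expressed in degrees (1 degree of latitude ≈ 111 km)
var ONE_CM_IN_DEGREES = 0.01 / 111000;

function denode(feature) {
  // Douglas–Peucker with a sub-centimeter tolerance: overnoded geometries get
  // cleaned up, everything else comes back effectively unchanged
  return turf.simplify(feature, { tolerance: ONE_CM_IN_DEGREES, highQuality: false });
}
```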
For Polygon and Polyline features, we should check to make sure the feature creator (on the ArcGIS side) hasn't created them with thousands of vertices where a half-dozen would do. Maybe there is functionality in D3 or the existing iD code for this, but some math may need to be designed to assess what "too detailed" actually means; see the sketch below.
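As a rough sketch of what that math could look like (the function name and the threshold are hypothetical, purely to illustrate one way of defining "too detailed"):

```js
// flag a ring or line as suspiciously dense when its average segment length
// drops below a threshold; 1e-6 degrees (~10 cm) is an arbitrary starting point
function isOvernoded(coords, minAvgSegmentDegrees) {
  minAvgSegmentDegrees = minAvgSegmentDegrees || 1e-6;
  if (coords.length < 2) return false;
  var total = 0;
  for (var i = 1; i < coords.length; i++) {
    var dx = coords[i][0] - coords[i - 1][0];
    var dy = coords[i][1] - coords[i - 1][1];
    total += Math.sqrt(dx * dx + dy * dy);
  }
  return total / (coords.length - 1) < minAvgSegmentDegrees;
}
```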
ref: https://github.com/openstreetmap/iD/issues/4164#issuecomment-316753170