geodocker / geodocker-geomesa

Containers for GeoMesa-enabled Accumulo

WFS Support #9

Closed arenger closed 6 years ago

arenger commented 6 years ago

Hello Again,

This is a helpful plugin, and I've got it working fine with WMS. However, I'm curious: was this plugin/extension meant to also work with WFS? As a test, I loaded a dataset in three different ways: into a postgis instance, into elasticsearch, and as a shapefile uploaded to GeoServer... all the same data, served via an instance of GeoServer 2.12.0 running in Docker. I then issued six different queries, all with the same bounding box:

1) WMS to the postgis layer
2) WFS to the postgis layer
3) WMS to the shapefile layer
4) WFS to the shapefile layer
5) WMS to the elasticsearch layer
6) WFS to the elasticsearch layer

The first five queries complete without error and all return the same feature count. The sixth query fails with the following error:

geoserver_1  | 21 Dec 16:38:27 ERROR [data.elasticsearch] - method [POST], host [http://es:9200], URI [/mvum/FeatureCollection/_search], status line [HTTP/1.1 500 Internal Server Error]
...
geoserver_1  | Caused by: java.io.IOException: Error executing query search
geoserver_1  |  at mil.nga.giat.data.elasticsearch.ElasticFeatureSource.getReaderInternal(ElasticFeatureSource.java:132)
geoserver_1  |  at org.geotools.data.store.ContentFeatureSource.getReader(ContentFeatureSource.java:647)
geoserver_1  |  at org.geotools.data.store.ContentFeatureCollection.features(ContentFeatureCollection.java:173)
geoserver_1  |  ... 114 more
geoserver_1  | Caused by: org.elasticsearch.client.ResponseException: method [POST], host [http://es:9200], URI [/mvum/FeatureCollection/_search], status line [HTTP/1.1 500 Internal Server Error]
geoserver_1  | {"error":{"root_cause":[{"type":"query_phase_execution_exception","reason":"Result window is too large, from + size must be less than or equal to: [10000] but was [1000000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"mvum","node":"ve0jbu0DTiK1Iv256fIY-A","reason":{"type":"query_phase_execution_exception","reason":"Result window is too large, from + size must be less than or equal to: [10000] but was [1000000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."}}]},"status":500}

I tried shrinking the bounding box so that only two features are returned... but the same error occurs. Do you have any suggestions?
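For reference, the error message itself names the relevant knob: the request used `from + size = 1000000`, but the index-level `index.max_result_window` setting defaults to 10000. One workaround (a sketch only, assuming the `es:9200` host and `mvum` index from the log above) is to raise that setting on the index:

```shell
# Raise the per-index result window so large from+size requests succeed.
# Host (es:9200) and index name (mvum) are taken from the error log;
# adjust for your deployment. Scroll/search_after is the recommended
# alternative for genuinely large result sets.
curl -XPUT 'http://es:9200/mvum/_settings' \
  -H 'Content-Type: application/json' \
  -d '{"index": {"max_result_window": 1000000}}'
```

Note this increases per-query memory pressure on the cluster; capping the WFS feature count (e.g. GeoServer's "Maximum number of features" service setting) is a gentler fix.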

arenger commented 6 years ago

Please excuse me, I posted this issue to the wrong github project (wrong tab in my browser)... that's embarrassing. Carry on, folks. Nothing to see here...