jprante opened this issue 9 years ago
There is a method in MySQL 5.7 for GeoJSON output: http://dev.mysql.com/doc/refman/5.7/en/spatial-geojson-functions.html
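For example, something along these lines should return a geometry column as GeoJSON on 5.7 (table and column names here are only placeholders):

SELECT id, ST_AsGeoJSON(coords) AS coords_geojson
FROM places;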
My question is really about array insertion in general (which includes the coordinates of geo_shapes). http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/mapping-array-type.html
I'm facing the same issue; I could not get the Kibana 3 bettermap widget working using the "geo_point" example...
I added geo shape support in this commit https://github.com/jprante/elasticsearch-jdbc/commit/2d727f7c05be2e15df7b803afa91735a497730cb
@jprante Thanks for the quick release! I just had time to test this new feature. I am not using geo_shape; I am more interested in being able to convert my data into GeoJSON format and store it in ES (Kibana 3 only supports the geo_point format...), and the new commit seems able to handle the "POINT(" data.
Here is my setup:
type_mapping:
  "coords" : {
    "type" : "geo_point",
    "lat_lon" : true,
    "geohash" : true
  }
I tried making one of my columns, "coords", return "POINT(X X)".
After a few tests I keep getting the following error in the ES server log; I also tried changing the "geo_point" mapping to "geo_shape", still no luck:
"coords":"POINT(-36.86396623 174.73605846)" org.elasticsearch.index.mapper.MapperParsingException: failed to parse at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:565) at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:493) at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:480) at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:423) at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:149) at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:515) at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:422) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Caused by: org.elasticsearch.ElasticsearchIllegalArgumentException: the character 'P' is not a valid geohash character at org.elasticsearch.common.geo.GeoHashUtils.decode(GeoHashUtils.java:288) at org.elasticsearch.common.geo.GeoHashUtils.decodeCell(GeoHashUtils.java:319) at org.elasticsearch.common.geo.GeoHashUtils.decode(GeoHashUtils.java:310) at org.elasticsearch.common.geo.GeoPoint.resetFromGeoHash(GeoPoint.java:77) at org.elasticsearch.index.mapper.geo.GeoPointFieldMapper.parsePointFromString(GeoPointFieldMapper.java:549) at org.elasticsearch.index.mapper.geo.GeoPointFieldMapper.parse(GeoPointFieldMapper.java:527) at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:706) at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:497) at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:544)
@jprante I found my issue using GeometryTest: "POINT(-36.86396623 174.73605846)" is an incorrect geo point value (the exception must be getting swallowed silently). In WKT the longitude comes first, so it should be "POINT(174.73605846 -36.86396623)"... I don't need a special type_mapping for this column any more; it creates a new "coordinates" sub type and it uses the GeoJSON format! I am able to use the map widget in Kibana 3 now! (A rough SQL sketch of the lon/lat ordering is below.)
Thanks for this wonderful plugin :)
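A minimal sketch of building the WKT with the coordinates in lon/lat order directly in SQL (the longitude/latitude column names are only placeholders):

SELECT id, CONCAT('POINT(', longitude, ' ', latitude, ')') AS coords
FROM places;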
@jprante can this feature be backported to 1.5? 1.6 is not using the river API, so I can't easily interact with it from another plugin.
Yes, I can backport this feature. It will take a few weeks, so please be patient.
I'm trying to map a geo_shape in Elasticsearch from MySQL using the JDBC river. My shape is stored as a string in MySQL and I can't manage to parse it in Elasticsearch.
Would you have any tricks to make it consider the string as an array? Thanks!
PUT _river/river_places/_meta
{
  "jdbc": {
    "index": "clicwalk",
    "type_mapping": {
      "place": {
        "properties": {
          "location": { "type": "geo_point" },
          "contour": { "type": "geo_shape", "tree": "quadtree", "precision": "1km" }
        }
      }
    },
    "url": "jdbc:mysql://localhost:3306/clicandwalk-dw",
    "user": "root",
    "sql": {
      "statement": "SELECT places.id as \"_id\", places.label, places.address, longitude_city as \"location.lon\", latitude_city as \"location.lat\", places.country, places.state, places.county, places.district, places.city, places.postal_code, geotype_city as \"contour.type\", REPLACE(geoshape_city, '\"', '') as \"contour.coordinates\" FROM places;"
    },
    "password": "poc",
    "type": "place"
  },
  "type": "jdbc"
}
[2014-12-16 09:27:03,114][DEBUG][action.bulk ] [Paladin] [clicwalk][3] failed to execute bulk item (index) index {[clicwalk][place][507], source[{"label":"CARREFOUR INEXISTANT SAINTES","address":"Cours du Maréchal Leclerc","location":{"lon":-0.6333330274,"lat":45.75},"country":"France","state":"Poitou-Charentes","county":"Charente-Maritime","district":"Arrondissement de Saintes","city":"Saintes","postal_code":"17100","contour":{"type":"polygon", "coordinates":"[[[-0.649495377401502,45.707967516585917],[-0.658706806737811,45.709895241043242],[-0.670523812123921,45.704156077835343],[-0.692796248384822,45.708800705802034],[-0.697123220623806,45.707315551734041],[-0.701962923100085,45.713106314012173],[-0.701247575676377,45.723370595647701],[-0.6856774192697,45.731137896417387],[-0.685787806272521,45.738003479921964],[-0.688014144753198,45.741076768830702],[-0.701627373519283,45.73730397101135],[-0.70409140639864,45.74026098469637],[-0.706766073801605,45.745969557078126],[-0.701830617398765,45.749945032791196],[-0.668443910686557,45.750369558110059],[-0.659025028161417,45.765829202055102],[-0.672376719231412,45.775437387778844],[-0.674406791973133,45.782132243442859],[-0.65813387213166,45.787375886797179],[-0.647472871704148,45.789055311317284],[-0.643190987946822,45.782748339993894],[-0.62588599040071,45.776049445129516],[-0.620994975215616,45.77022106697865],[-0.623852387004477,45.763711027852096],[-0.620632421223112,45.761155246723192],[-0.61580851246593,45.761634586460858],[-0.605699674400596,45.754548617747048],[-0.587941457719729,45.750373797000677],[-0.586819450993255,45.740430814536225],[-0.599227052653651,45.736041912142852],[-0.604097401093381,45.730189231493632],[-0.617149419259307,45.728031206494094],[-0.634126473308095,45.713548601569165],[-0.649495377401502,45.707967516585917]]]" }}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [contour]
    at org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper.parse(GeoShapeFieldMapper.java:249)
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeObject(ObjectMapper.java:549)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:491)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:534)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:483)
    at org.elasticsearch.index.shard.service.InternalIndexShard.prepareIndex(InternalIndexShard.java:397)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:421)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:158)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:527)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:426)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.common.jackson.core.JsonParseException: Current token (END_OBJECT) not numeric, can not use numeric value accessors
    at [Source: [B@6d660704; line: 1, column: 1677]
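From the "Current token (END_OBJECT) not numeric" part, my understanding is that the geo_shape mapper expects the coordinates as nested JSON arrays of numbers, while the REPLACE(...) column delivers them as one big string. Roughly, the indexed document would need to contain something like this (coordinate list shortened here for illustration):

"contour" : {
  "type" : "polygon",
  "coordinates" : [ [ [ -0.649495377401502, 45.707967516585917 ], [ -0.658706806737811, 45.709895241043242 ] ] ]
}

rather than

"contour" : { "type" : "polygon", "coordinates" : "[[[-0.649495377401502,45.707967516585917],...]]" }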