ypasmk closed this issue 8 years ago
You can override any Elasticsearch setting used by Graylog with a custom Elasticsearch configuration file. The relevant setting in graylog2.conf is elasticsearch_config_file, which should point to the Elasticsearch configuration (YAML) file Graylog should use.
https://github.com/Graylog2/graylog2-server/blob/1.0.0-beta.2/misc/graylog2.conf#L82-84
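As a minimal sketch, the relevant line in graylog2.conf would look like the following (the path is an example; adjust it to your installation):

```ini
# Path to a custom Elasticsearch configuration (YAML) file.
# Example path -- point this at your own file.
elasticsearch_config_file = /etc/graylog2-elasticsearch.yml
```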
After clarification on IRC: the index mapping is hard-coded in https://github.com/Graylog2/graylog2-server/blob/1.0/graylog2-server/src/main/java/org/graylog2/indexer/Mapping.java and cannot be overridden via elasticsearch_config_file. Other settings, though, can be overridden within the custom Elasticsearch configuration.
Would it be possible to add a converter that lets you choose which mapping type you want, i.e. convert a field to double/float/integer/etc.?
+1 This sounds quite useful to me, and we actually have a use case for it as well. In addition, it would make sense to be able to "pre-initialise" an Elasticsearch index with columns of a specific type via some plugin every time an index is cycled. For example, I would like to initialise each index with a column "cpu" of type "number" to make sure no one accidentally adds a "cpu" column of type text. (I hope the example is clear; otherwise please tell me.)
+1 I too would like this ability. I use Graylog in conjunction with Kibana for visualizations (GeoIP/trends/etc), and it would be nice to set up custom mappings. I know that it is possible to use the Number converter to generate fields of type "double", but it would be great if we could generate other types as well.
+1
:+1:
One of my colleagues found a way to initialise an index directly via Elasticsearch. This is just a mitigation of the actual problem, but it is currently the best option we have found. In Elasticsearch you can create an index mapping template that matches graylog_*, and in it you can define custom mappings. See Index Templates for further information.
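A minimal sketch of such a template for ES 1.x, picking up the "cpu" example from above (the template name graylog-custom and the mapped field are illustrative assumptions, not part of Graylog):

```shell
# Install an index template that applies to every graylog_* index.
# Template name and field mapping below are examples only.
curl -XPUT 'http://localhost:9200/_template/graylog-custom' -d '
{
  "template": "graylog_*",
  "mappings": {
    "message": {
      "properties": {
        "cpu": { "type": "double" }
      }
    }
  }
}'
```

The template is applied to each new index at creation time, so it takes effect the next time Graylog cycles the index.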
With ES 1.5 and Graylog 1.0.2-1, I want to reopen this because Graylog2 seems to index the geo_point value as the literal string "[105.0,35.0]", so ES cannot convert "[105.0" to a number.
A possible workaround with Logstash is to rebuild the field with a mutate filter:

```
mutate {
  replace => [ "geoip_location", "%{[geoip][latitude]},%{[geoip][longitude]}" ]
}
```
According to the ES documentation, this field type accepts the following values: https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-geo-point-type.html#_mapping_options
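For reference, a geo_point field accepts, among other formats, a "lat,lon" string, a [lon, lat] array (note the reversed, GeoJSON-style order), and a lat/lon object. Illustrative document snippets:

```json
{ "geoip_location": "4.6492,-74.0628" }
{ "geoip_location": [-74.0628, 4.6492] }
{ "geoip_location": { "lat": 4.6492, "lon": -74.0628 } }
```

The failing value above is neither of these: it is the array syntax serialised as a plain string, which ES then tries to parse in the "lat,lon" string format.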
Error from the graylog2 log:

```
index [graylog2_0], type [message], id [eae998b2-0dd9-11e5-b586-005056a37480], message [MapperParsingException[failed to parse]; nested: NumberFormatException[For input string: "[-74.0628"]; ]
```

Error from the ES log:

```
[2015-06-08 09:55:05,724][DEBUG][action.bulk ] [es-silog-back1] [graylog2_0][2] failed to execute bulk item (index) index {[graylog2_0][message][6ffeaa61-0dc4-11e5-b5f8-005056a37806], source[{"geoip_ip":"186.115.218.34","gl2_source_node":"333be6fb-0df6-45a0-ab48-28642e89e92f","direction":"o","geoip_continent_code":"SA","geoip_region_name":"34","geoip_location":"[-74.0628,4.6492000000000075]","type":"syslog","version":"1.0","remote_host":"186.115.218.34","transfert_type":"b","timestamp":"2015-06-04 13:44:03.000","username":"login","level":6,"_id":"6ffeaa61-0dc4-11e5-b5f8-005056a37806","facility":"gelf-rb","gl2_source_input":"55561048e4b0d196d5d223a8","geoip_city_name":"Bogotá","geoip_latitude":4.6492000000000075,"authenticated_id":"*","geoip_timezone":"America/Bogota","tags":"silog-front1, silog-mid2, ftpsedrXferlog","full_message":"186.115.218.34: /XXXXXXXXxs/msla/all-sat-merged/h/1993/dt_global_allsat_msla_h_19930102_20140106.nc.gz","transfer_time":"18","completion_status":"c","access_mode":"r","message":"186.115.218.34: /doXXXXXXXXXXsla/all-sat-mergeXXXXXXXXXsat_msla_h_19930102_20140106.nc.gz","geoip_longitude":-74.0628,"filesize":"2015106","source":"majsedrPBchroot","geoip_real_region_name":"Distrito Especial","filename":"/donneeXXXXXXXXXXsla/all-sat-mergXXXXXXXXXX106.nc.gz","streams":[],"geoip_country_name":"Colombia","geoip_country_code3":"COL","geoip_country_code2":"CO"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:565)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:493)
    at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:480)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:423)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:149)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:515)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:422)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NumberFormatException: For input string: "[-74.0628"
    at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1250)
    at java.lang.Double.parseDouble(Double.java:540)
    at org.elasticsearch.common.geo.GeoPoint.resetFromString(GeoPoint.java:68)
    at org.elasticsearch.index.mapper.geo.GeoPointFieldMapper.parsePointFromString(GeoPointFieldMapper.java:551)
    at org.elasticsearch.index.mapper.geo.GeoPointFieldMapper.parse(GeoPointFieldMapper.java:527)
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:706)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:497)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:544)
    ... 9 more
```
Maybe related to the feature ticket I created, #1469.
You can already use a copy extractor with a numeric converter for a string representation of a number, but this does not cover all cases, such as the string "null" or an empty string.
Since Graylog 2.0, the server installs a regular index template with a low priority into Elasticsearch. You can override any setting by adding a custom index template with a higher priority into Elasticsearch.
http://docs.graylog.org/en/2.1/pages/configuration/elasticsearch.html#custom-index-mappings
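Following that documentation, a minimal sketch of a custom template that takes precedence over Graylog's built-in one (the template name, the order value, and the mapped field are illustrative assumptions; check the linked docs for the exact priority required by your version):

```json
{
  "template": "graylog_*",
  "order": 0,
  "mappings": {
    "message": {
      "properties": {
        "geoip_location": { "type": "geo_point" }
      }
    }
  }
}
```

Install it with a PUT to the _template endpoint; it is applied to new indices, so the custom mapping becomes active after the next index rotation.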
Closing this now.
Currently the default settings and mapping of the ES index are hardcoded in the application. It would be nice to be able to override this behaviour (e.g. by having a folder whose settings are applied to each newly created index).