Also, I can't manage to see any river-mongodb output in the Elasticsearch logs; my logging.yml is as follows:
# you can override this by setting a system property, for example -Des.logger.level=DEBUG
es.logger.level: INFO
rootLogger: ${es.logger.level}, console, file
logger:
  # log action execution errors for easier debugging
  action: DEBUG

  # reduce the logging for aws, too much is logged under the default INFO
  com.amazonaws: WARN
  # gateway
  gateway: DEBUG
  index.gateway: DEBUG

  # peer shard recovery
  indices.recovery: DEBUG

  # discovery
  discovery: TRACE
  rest.action: TRACE

  index.search.slowlog: TRACE, index_search_slow_log_file
  index.indexing.slowlog: TRACE, index_indexing_slow_log_file

  river.mongodb: TRACE
  org.elasticsearch.river.mongodb: TRACE

  index.search.slowlog.threshold.query.warn: "10s"
  index.search.slowlog.threshold.fetch.debug: "500ms"
  index.indexing.slowlog.threshold.index.info: "5s"

additivity:
  index.search.slowlog: true
  index.indexing.slowlog: true

appender:
  console:
    type: console
    layout:
      type: consolePattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  index_search_slow_log_file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}_index_search_slowlog.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  index_indexing_slow_log_file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"
I don't know whether I need to add anything else to this file in order to get those log entries.
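In case it matters: would raising the river logger level at runtime help? My understanding is that on Elasticsearch 1.x logger levels can also be changed through the cluster update settings API, so (assuming my version supports it, and assuming "river.mongodb" is the right logger name) I could try something like:

curl -XPUT 'localhost:9200/_cluster/settings' -d '{
  "transient": {
    "logger.river.mongodb": "TRACE"
  }
}'

but I'm not sure that actually reaches the river's loggers, which is why I'm asking about logging.yml.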
I have tried to create the index in Elasticsearch using:
curl -XPUT localhost:9200/mongo/newperson/_meta -d '{
  "type": "mongodb",
  "mongodb": {
    "servers": [
      { "host": "pc-4372", "port": 27017 }
    ],
    "db": "newPerson",
    "collection": "Person",
    "options": { "secondary_read_preference": true },
    "gridfs": false
  },
  "index": {
    "name": "mongoIndex",
    "type": "Person"
  }
}'
but although the collection contains more than 1000 documents, the index returns nothing except a single document that is completely unrelated to the collection data (I check with a plain search, shown below).
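By that I mean a plain search along these lines (the index and type names are the ones from the river definition above; adjust if yours differ):

curl -XGET 'localhost:9200/mongoIndex/Person/_search?pretty'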
I have tried several times to upgrade and downgrade the components according to the compatibility matrix, but without any progress.
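For the record, the Elasticsearch version I compare against the matrix is the one reported by the node itself (the root endpoint includes version.number in its response):

curl -XGET 'localhost:9200/?pretty'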
Despite many tries, I can't manage to get data replicated into Elasticsearch or to get river-mongodb working at all; I'm using a local machine running Windows 7 64-bit.
Any suggestions?
Thanks in advance