fluent / fluent-plugin-mongo

MongoDB input and output plugin for Fluentd
https://docs.fluentd.org/output/mongo

Input plugin not picking up collection data #127

Open ChristiaanWestgeest opened 5 years ago

ChristiaanWestgeest commented 5 years ago

I'm running a combination of MongoDB, Elasticsearch, and Fluentd on an Azure Kubernetes cluster. The intention is to sync data from MongoDB to Elasticsearch so it is available for quick search and analysis.

I've got things running up to the point where the plugin makes a connection to MongoDB, but it does not pick up any records from the database.

I can confirm that if I change the username, password, or database name in my ConfigMap, the Fluentd pod log shows an 'unauthorized for database X' message. With the correct connection string, the logs show nothing, which makes me think the initial connection is established properly.

I've verified that the Elasticsearch output plugin works properly by using a simple @type tail input <source>. I can also confirm that the collection I'm using exists and contains data in the given MongoDB database.

So I seem to have a working connection, but nothing happens, and that's where I hit a wall. I suspect the plugin either can't find the collection or can't read from it, but without any log output I'm not sure what's happening.

I'm not suggesting this is an issue with the plugin itself, but if anyone can suggest additional steps to diagnose the problem, or can spot a fault in my ConfigMap (below), I'd be most grateful.
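One thing I still plan to rule out: the fluent-plugin-mongo README describes mongo_tail as tailing a capped collection (it uses a tailable cursor, which MongoDB only supports on capped collections), so if my collection is a regular one that might explain why nothing is picked up without any error. Something like this in mongosh should confirm it ({collectionName} is the placeholder from my config below):

```javascript
// In mongosh, connected to the same database as in the Fluentd config.
// collStats output includes a boolean "capped" field.
const stats = db.getCollection("{collectionName}").stats();
print("capped:", stats.capped); // mongo_tail's tailable cursor needs capped: true

// If it is not capped, MongoDB can convert it in place (size in bytes):
// db.runCommand({ convertToCapped: "{collectionName}", size: 104857600 });
```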

    <source>
      @type mongo_tail
      url mongodb://{user}:{password}@{clusterIp}:{port}/{databasename}
      collection {collectionName}
      tag app.mongo_esreader
      wait_time 5
      object_id_keys ["id_key"]
      @log_level trace
    </source>
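Since @log_level trace on the <source> alone produced nothing, I also tried raising the global log level via the standard Fluentd <system> directive, in case the input plugin logs cursor activity there:

```
<system>
  log_level trace
</system>
```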

    <match app.**>
      @id elasticsearch
      @type elasticsearch
      @log_level debug
      index_name mongo
      host elasticsearch-logging
      port 9200
      <buffer>
        flush_thread_count 8
        flush_interval 5s
        chunk_limit_size 2M
        queue_limit_length 32
        retry_max_interval 30
      </buffer>
    </match>
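For anyone trying to reproduce this: to rule out the Elasticsearch side entirely, the input can be pointed at a stdout match instead of the Elasticsearch output (both are standard Fluentd plugins, nothing specific to this setup):

```
# Temporary match to isolate the mongo_tail input: if records show up
# in the Fluentd pod log, the input works and the problem is downstream
# in the Elasticsearch output.
<match app.mongo_esreader>
  @type stdout
</match>
```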