phutchins / logstash-input-mongodb

MongoDB input plugin for Logstash

Input from Mongodb to Elasticsearch #69

Closed manojsb closed 7 years ago

manojsb commented 7 years ago

I am trying to get MongoDB data into Elasticsearch using Logstash with the logstash-input-mongodb plugin. I tried updating the plugin to support ES version 5.

The terminal shows plugin version 0.4.1, but the error log reports version 0.1.x.

Also, it's not connecting to MongoDB to get data into ES.

Can someone help with this?

Error log

[2017-07-10T12:50:31,447][INFO ][logstash.inputs.mongodb  ] Using version 0.1.x input plugin 'mongodb'. This plugin isn't well supported by the community and likely has no maintainer.
[2017-07-10T12:50:32,061][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2017-07-10T12:50:32,062][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2017-07-10T12:50:32,225][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x4e62230d>}
[2017-07-10T12:50:32,227][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-07-10T12:50:32,289][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-07-10T12:50:32,299][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x38dfd049>]}
[2017-07-10T12:50:32,302][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-07-10T12:50:32,644][INFO ][logstash.inputs.mongodb  ] Registering MongoDB input
[2017-07-10T12:50:32,867][ERROR][logstash.pipeline        ] Error registering plugin {:plugin=>"<LogStash::Inputs::MongoDB uri=>\"mongodb://127.0.0.1:27017/jtrade\", placeholder_db_dir=>\"/home/jtrade/logstash\", placeholder_db_name=>\"logstash_sqlite.db\", collection=>\"activities\", batch_size=>5000, id=>\"961874444def3510514cefaccdbeb8ecd95670de-1\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_03e61419-953b-4267-803b-f8550e160cf7\", enable_metric=>true, charset=>\"UTF-8\">, since_table=>\"logstash_since\", since_column=>\"_id\", since_type=>\"id\", parse_method=>\"flatten\", isodate=>false, retry_delay=>3, generateId=>false, unpack_mongo_id=>false, message=>\"Default message...\", interval=>1>", :error=>"Java::JavaSql::SQLException: path to '/home/jtrade/logstash/logstash_sqlite.db': '/home/jtrade' does not exist"}
[2017-07-10T12:50:33,239][ERROR][logstash.agent           ] Pipeline aborted due to error {:exception=>#<Sequel::DatabaseConnectionError: Java::JavaSql::SQLException: path to '/home/jtrade/logstash/logstash_sqlite.db': '/home/jtrade' does not exist>, :backtrace=>["org.sqlite.core.CoreConnection.open(org/sqlite/core/CoreConnection.java:190)", "org.sqlite.core.CoreConnection.<init>(org/sqlite/core/CoreConnection.java:74)", "org.sqlite.jdbc3.JDBC3Connection.<init>(org/sqlite/jdbc3/JDBC3Connection.java:24)", "org.sqlite.jdbc4.JDBC4Connection.<init>(org/sqlite/jdbc4/JDBC4Connection.java:23)", "org.sqlite.SQLiteConnection.<init>(org/sqlite/SQLiteConnection.java:45)", "org.sqlite.JDBC.createConnection(org/sqlite/JDBC.java:114)", "org.sqlite.JDBC.connect(org/sqlite/JDBC.java:88)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "RUBY.connect(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/adapters/jdbc.rb:226)", "RUBY.make_new(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool.rb:116)", "RUBY.make_new(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:228)", "RUBY.available(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:201)", "RUBY._acquire(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:137)", "RUBY.acquire(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:151)", "RUBY.sync(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:282)", "org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:149)", "RUBY.sync(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:282)", 
"RUBY.acquire(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:150)", "RUBY.hold(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/connection_pool/threaded.rb:106)", "RUBY.synchronize(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/database/connecting.rb:306)", "RUBY.execute(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/adapters/jdbc.rb:251)", "RUBY.execute(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:1081)", "RUBY.fetch_rows(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/adapters/jdbc.rb:763)", "RUBY.with_sql_each(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:942)", "RUBY.with_sql_first(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:950)", "RUBY.single_record!(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:748)", "RUBY.first(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:248)", "RUBY.[](/usr/share/logstash/vendor/bundle/jruby/1.9/gems/sequel-4.48.0/lib/sequel/dataset/actions.rb:39)", "RUBY.get_placeholder(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:112)", "RUBY.update_watched_collections(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:160)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.update_watched_collections(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:158)", "RUBY.register(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-mongodb-0.4.1/lib/logstash/inputs/mongodb.rb:182)", 
"RUBY.register_plugin(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:281)", "RUBY.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:292)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:292)", "RUBY.start_inputs(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:442)", "RUBY.start_workers(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:336)", "RUBY.run(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:226)", "RUBY.start_pipeline(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:398)", "java.lang.Thread.run(java/lang/Thread.java:748)"]}
[2017-07-10T12:50:33,304][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-07-10T12:50:36,267][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}
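For context, the registration error in the log points at the since-tracking SQLite database, not at MongoDB itself: the configured `placeholder_db_dir` (`/home/jtrade/logstash`) does not exist, and SQLite will not create missing parent directories when opening a database file. A minimal Python sketch of that behavior (the paths here are illustrative, not from the plugin):

```python
import os
import sqlite3
import tempfile

# SQLite can create a database *file*, but not its parent *directory*.
# This mirrors the plugin's failure: placeholder_db_dir must already exist.
missing_dir = os.path.join(tempfile.mkdtemp(), "does-not-exist")
db_path = os.path.join(missing_dir, "logstash_sqlite.db")

try:
    sqlite3.connect(db_path)
except sqlite3.OperationalError as e:
    # Raised because the parent directory is missing.
    print("fails as expected:", e)

# Creating the directory first lets the same connection succeed.
os.makedirs(missing_dir, exist_ok=True)
conn = sqlite3.connect(db_path)
conn.close()
print("ok")
```

So creating the directory up front (e.g. `mkdir -p /home/jtrade/logstash`, with write permission for the Logstash user) or pointing `placeholder_db_dir` at an existing writable path should get past this particular error.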

Thank you so much.