elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

logstash-7.3.2 and elasticsearch-7.3.2 and mysql 8.0.13 please #11147

Open nikolamakin opened 5 years ago

nikolamakin commented 5 years ago

Environment: logstash-7.3.2, elasticsearch-7.3.2, mysql 8.0.13

./bin/logstash-plugin install logstash-input-jdbc

./bin/logstash -f ./config/logstash.conf

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/home/logstash-7.3.2/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.FileDescriptor.fd
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to /home/logstash-7.3.2/logs which is now configured via log4j2.properties
[2019-09-18T04:26:23,393][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-09-18T04:26:23,411][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.2"}
[2019-09-18T04:26:25,959][INFO ][org.reflections.Reflections] Reflections took 60 ms to scan 1 urls, producing 19 keys and 39 values
[2019-09-18T04:26:27,327][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2019-09-18T04:26:27,564][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2019-09-18T04:26:27,634][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-09-18T04:26:27,638][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-09-18T04:26:27,667][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://127.0.0.1:9200"]}
[2019-09-18T04:26:27,789][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-09-18T04:26:27,800][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-09-18T04:26:27,812][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x71b9d016 run>"}
[2019-09-18T04:26:27,914][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-09-18T04:26:28,121][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-09-18T04:26:28,249][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-09-18T04:26:28,880][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
/home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/cronline.rb:77: warning: constant ::Fixnum is deprecated
[2019-09-18T04:27:00,735][ERROR][logstash.inputs.jdbc ] Failed to load /home/logstash-7.3.2/config/mysql-connector-java-8.0.17.jar {:exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>}
{ 2012 rufus-scheduler intercepted an error:
  2012   job:
  2012     Rufus::Scheduler::CronJob "* * * * *" {}
  2012   error:
  2012     2012
  2012     LogStash::ConfigurationError
  2012     com.mysql.cj.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:258:in `block in run'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:234:in `do_call'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:258:in `do_trigger'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:300:in `block in start_work_thread'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:299:in `block in start_work_thread'
  2012       org/jruby/RubyKernel.java:1425:in `loop'
  2012       /home/logstash-7.3.2/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:289:in `block in start_work_thread'

vim config/logstash.conf:

input {
  jdbc {
    jdbc_driver_library => "/home/logstash-7.3.2/config/mysql-connector-java-8.0.17.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/database"
    jdbc_user => "root"
    jdbc_password => "root"
    statement => "select id,title from data where id > :sql_last_time"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    schedule => "* * * * *"
  }
}
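Separate from the driver-loading failure, note that the jdbc input's built-in tracking parameter is `:sql_last_value`; `:sql_last_time` is not a recognized parameter, so this query would also misbehave once the driver loads. A minimal sketch of the corrected statement, keeping the same hypothetical `data` table and assuming numeric `id` tracking:

```
statement => "select id,title from data where id > :sql_last_value"
use_column_value => true
tracking_column => "id"
```

With `use_column_value => true` and `tracking_column => "id"`, Logstash persists the last seen `id` between runs instead of the last run timestamp.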

I tried these driver jars: mysql-connector-java-5.1.48-bin.jar, mysql-connector-java-5.1.48.jar, mysql-connector-java-8.0.17.jar, mysql-connector-java-8.0.13.jar.

openjdk version "12.0.2" 2019-07-16
OpenJDK Runtime Environment (build 12.0.2+10)
OpenJDK 64-Bit Server VM (build 12.0.2+10, mixed mode, sharing)

No success at all. Please help me, thanks! Sorry, guys, my English is not good.
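The `failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader` error happens because this plugin version assumes the Java 8 classloader model, which changed in Java 9+. A commonly reported workaround (a sketch, assuming a standard tarball install at `/home/logstash-7.3.2`) is to put the driver jar on Logstash's own classpath so the JVM loads it directly, instead of pointing `jdbc_driver_library` at it:

```sh
# Copy the MySQL driver jar into Logstash's bundled jars directory,
# where it is picked up by the default classloader at startup
cp /home/logstash-7.3.2/config/mysql-connector-java-8.0.17.jar \
   /home/logstash-7.3.2/logstash-core/lib/jars/

# Then remove (or leave empty) jdbc_driver_library in logstash.conf:
#   jdbc_driver_library => ""
```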

giammarcoacn commented 4 years ago

I have the same problem(as a lot of users on forums, without replies). mySqlConnector is compatible up to JDK 8. I don't know the reason to update openJDK with the latest version available...

marianoarga commented 4 years ago

Same problem here

robbavey commented 4 years ago

This issue should be fixed in the latest release of the JDBC input plugin: upgrading logstash-input-jdbc to 4.3.18 or later should resolve it.
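The upgrade can be done with the bundled plugin manager, run from the Logstash home directory (a sketch; requires network access to rubygems.org):

```sh
# Show the currently installed version of the plugin
bin/logstash-plugin list --verbose logstash-input-jdbc

# Update just this plugin to the latest published release
bin/logstash-plugin update logstash-input-jdbc
```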

ChengHoHang commented 4 years ago

@robbavey Thanks for your reply! I'm a beginner and ran into the same problem today. Could you give more details on how to upgrade the plugin? My Logstash is version 7.3.0, running in Docker.
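For a Docker deployment, the usual pattern is to bake the plugin update into a custom image at build time rather than modify a running container. A minimal sketch, assuming the official 7.3.0 image:

```dockerfile
FROM docker.elastic.co/logstash/logstash:7.3.0
# Pull the fixed jdbc input plugin into the image at build time
RUN bin/logstash-plugin update logstash-input-jdbc
```

Build it (e.g. `docker build -t logstash-jdbc-fix .`) and run it in place of the stock image.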