from-nibly opened this issue 2 years ago (status: Open)
Hey, there's nothing special about MSSQL; it's generally expected to work.

One thing also worth trying is to double-check the permissions on the added .jar in /usr/share/logstash/logstash-core/lib/jars: are you sure the copying was done as the logstash user? Maybe provide us the output of `ls -l /usr/share/logstash/logstash-core/lib/jars`.
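If the jar was added at image-build time, ownership is a common culprit. A minimal Dockerfile sketch of that step (the image tag, jar name, and path are assumed from this thread; adjust to your setup):

```dockerfile
# Hypothetical fragment: copy the MSSQL JDBC driver into the jars
# directory with ownership the logstash user can read.
FROM docker.elastic.co/logstash/logstash:8.0.0
COPY --chown=logstash:logstash mssql-jdbc-9.4.0.jre8.jar \
     /usr/share/logstash/logstash-core/lib/jars/
```

Inside the running container, `ls -l /usr/share/logstash/logstash-core/lib/jars` should then show the jar owned by (or at least readable by) the logstash user.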
> Is the adapter different from the driver?

Yes. The driver is the Java library that implements the JDBC standard. Sequel, the underlying library, uses the adapter terminology: different databases have different adapters that the library uses to adapt (and provide database-specific) behavior, e.g. a database-specific way to set limit/offset for a query.
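As an illustration only (this is a simplified sketch, not Sequel's actual code; the method names here are hypothetical), the adapter lookup behaves like a registry keyed by adapter name, which is why an unregistered name raises the error quoted in this issue:

```ruby
# Hypothetical sketch of an adapter registry in the spirit of Sequel's
# ADAPTER_MAP. Not the real implementation.
ADAPTER_MAP = {}

# Adapters register themselves when their file is loaded.
def register_adapter(name, adapter_class)
  ADAPTER_MAP[name.to_sym] = adapter_class
end

# Lookup fails loudly when nothing is registered under the requested name.
def load_adapter(name)
  ADAPTER_MAP.fetch(name.to_sym) do
    raise "Could not load #{name} adapter: adapter class not registered in ADAPTER_MAP"
  end
end

register_adapter(:mssql, :MssqlAdapter)
load_adapter(:mssql)  # => :MssqlAdapter
```

The point of the sketch: the error message is about a missing registry entry, not about the .jar file itself.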
> Is there a way to debug what is in the ADAPTER_MAP?

The Sequel library attempting to load `jdbc/mssql` should only happen as a fallback when the driver class is not available.
> Am I installing the jdbc driver correctly?

If LS has the proper permissions to read the .jar in /usr/share/logstash/logstash-core/lib/jars, then yes.
> Is there any documentation specifically for mssql/sqlserver and its particulars?

Nothing specific, but we know of some users running the plugin with SQL Server and they are doing fine.
Hello @from-nibly

I am not sure if the issue got resolved or not, but here is my take.

I configured mssql-server on CentOS 7 and populated a database with a sample table and rows/columns. With the Logstash jdbc input configuration below I am able to see my events. If you look at the sample code provided by Microsoft in the archive, the driver string is mentioned there; it should be "com.microsoft.sqlserver.jdbc.SQLServerDataSource".
```
input {
  jdbc {
    jdbc_driver_library => "/home/docker/sqljdbc_9.4/enu/mssql-jdbc-9.4.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDataSource"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=TestDB"
    jdbc_user => "sa"
    jdbc_password => "Mysql@2022"
    schedule => "*/1 * * * *"
    statement => "SELECT * from dbo.Inventory where quantity > 152"
  }
}
```
The event looks like below.

```
{
    "quantity" => 154,
  "@timestamp" => 2022-07-14T10:40:02.955Z,
        "name" => "orange",
    "@version" => "1",
          "id" => 2
}
```

where name/quantity is my table data.
Best Regards.
Apologies for the late response; we ended up abandoning this shortly after running into these issues. I won't have time to try the solutions and see if they work.
Logstash information:

- Logstash version: 8.0.0
- Installation source: docker
- How is Logstash being run: docker
- How was the plugin installed: docker

Dockerfile:

JVM: openjdk 11.0.13 2021-10-19

- JVM installation source: docker
- Value of the JAVA_HOME environment variable if set: N/A

OS version:

Linux nixos-rip 5.10.99 #1-NixOS SMP Tue Feb 8 17:30:41 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Description of the problem including expected versus actual behavior:

When I try to run a jdbc input plugin with the mssql/sqlserver jdbc driver, it throws the following error:

```
Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP
```
Things I've Tried

- Removing the `Java::` prefix from the class property.
- Copying the driver .jar into the /usr/share/logstash/logstash-core/lib/jars folder.
- I've looked for documentation on this issue, and am coming up with dead ends on Stack Overflow et al.
- I'm also having trouble finding information on what the ADAPTER_MAP is and how it gets populated.
Questions

- Is the adapter different from the driver?
- Is there a way to debug what is in the ADAPTER_MAP?
- Am I installing the jdbc driver correctly?
- Is there any documentation specifically for mssql/sqlserver and its particulars?
Expectations
I would expect it to work with the configuration provided, assuming I'm not doing something obviously dumb here.
Steps to reproduce:
Dockerfile is above.
pipeline file mounted to
/usr/share/logstash/pipeline/
docker command
Log with error
```
[2022-02-23T17:09:00,477][ERROR][logstash.pluginmixins.jdbc.scheduler][main][3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769] Scheduler intercepted an error: {
  :exception=>Sequel::AdapterNotFound,
  :message=>"Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP",
  :backtrace=>[
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:97:in `load_adapter'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/adapters/jdbc.rb:378:in `adapter_initialize'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/misc.rb:156:in `initialize'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:57:in `connect'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/core.rb:124:in `connect'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:117:in `block in jdbc_connect'",
    "org/jruby/RubyKernel.java:1442:in `loop'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in `jdbc_connect'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:157:in `open_jdbc_connection'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:214:in `execute_statement'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:345:in `execute_query'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:308:in `block in run'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:234:in `do_call'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:258:in `do_trigger'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:300:in `block in start_work_thread'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:299:in `block in start_work_thread'",
    "org/jruby/RubyKernel.java:1442:in `loop'",
    "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:289:in `block in start_work_thread'"
  ],
  :now=>"2022-02-23T17:09:00.476",
  :last_time=>"2022-02-23T17:09:00.473",
  :next_time=>"2022-02-23T17:10:00.000",
  :job=>#<Rufus::Scheduler::CronJob:0x3f33c406 ...>,
  :opts=>{:max_work_threads=>1, :thread_name=>"[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler", :frequency=>1.0},
  :started_at=>2022-02-23 17:06:32 +0000,
  :thread=>"#<Thread:0x7b819e3a@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]",
  :jobs_size=>1, :work_threads_size=>1, :work_queue_size=>0
}
```