Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP #104
Hey, there's nothing special about MSSQL and it's usually expected to work. Maybe provide us the output of
The driver is the Java library that implements the JDBC standard.
Sequel library attempting to load the
If LS has the proper permission to read the .jar in
Nothing specific, we know some users run the plugin with SQLServer and are doing fine.
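For reference, a jdbc input for SQL Server usually has roughly the shape below. This is only a sketch, not the reporter's actual pipeline: the host, port, database name, credentials, table, and the path to the Microsoft driver .jar are all placeholders.

input {
  jdbc {
    # Placeholder path to the Microsoft JDBC driver .jar inside the container;
    # the Logstash user must be able to read this file.
    jdbc_driver_library => "/usr/share/logstash/drivers/mssql-jdbc.jar"
    # Driver class shipped in Microsoft's mssql-jdbc .jar
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # SQL Server JDBC URL; host, port and database name are placeholders
    jdbc_connection_string => "jdbc:sqlserver://mssql-host:1433;databaseName=testdb"
    jdbc_user => "sa"
    jdbc_password => "changeme"
    statement => "SELECT * FROM some_table"
    schedule => "* * * * *"
  }
}
output {
  stdout { codec => rubydebug }
}

The jdbc:sqlserver:// prefix is the URL scheme Microsoft's own driver expects, and the .jar pointed to by jdbc_driver_library has to be readable by Logstash.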
Hello @from-nibly, I am not sure if the issue got resolved or not, but here are my takes. Once I configured mssql-server on CentOS 7 and populated the database with a sample table and rows/columns, the event looks like below.
Best Regards.
Apologies for the late response; we ended up abandoning this shortly after running into these issues. I won't have time to try the solutions and see if they work.
Logstash information:
Logstash version: 8.0.0
Logstash installation source: docker
How Logstash is being run: docker
How the Logstash plugin was installed: docker

Dockerfile

JVM: openjdk 11.0.13 2021-10-19
JVM installation source: docker
JAVA_HOME environment variable if set: N/A

OS version:
Linux nixos-rip 5.10.99 #1-NixOS SMP Tue Feb 8 17:30:41 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Description of the problem including expected versus actual behavior:
When I try to run a jdbc input plugin with the mssql/sqlserver jdbc driver, it throws the following error:
Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP
Things I've Tried
The Java:: prefix on the class property.
Loading the driver .jar from the /usr/share/logstash/logstash-core/lib/jars folder.
I've looked for documentation on this issue, and am coming up with dead ends on Stack Overflow et al.
I'm also having trouble finding information on what the ADAPTER_MAP is and how it gets populated.
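For what it's worth, ADAPTER_MAP belongs to the Sequel gem, not to Logstash itself. As far as I can tell, each adapter file under sequel/adapters/ defines a Database subclass that calls set_adapter_scheme, which adds it to that map, and this particular AdapterNotFound message means the adapter file for the requested scheme was loaded but nothing ended up registered under the expected key. A rough sketch for inspecting the registry from a JRuby console inside the Logstash container (requiring the jdbc adapter only works under JRuby):

# Rough sketch, not Logstash code: look at Sequel's adapter registry.
require "sequel"
require "sequel/adapters/jdbc"   # generic JDBC adapter; registers itself via set_adapter_scheme
p Sequel::ADAPTER_MAP.keys       # expect something like [:jdbc] once registration has happened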
Questions
Expectations
I would expect it to work with the configuration provided, assuming I'm not doing something obviously dumb here.
Steps to reproduce:
Dockerfile is above.
pipeline file mounted to /usr/share/logstash/pipeline/
docker command
Log with error
[2022-02-23T17:09:00,477][ERROR][logstash.pluginmixins.jdbc.scheduler][main][3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769] Scheduler intercepted an error: {:exception=>Sequel::AdapterNotFound, :message=>"Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP", :backtrace=>[
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:97:in `load_adapter'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/adapters/jdbc.rb:378:in `adapter_initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/misc.rb:156:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:57:in `connect'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/core.rb:124:in `connect'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:117:in `block in jdbc_connect'",
"org/jruby/RubyKernel.java:1442:in `loop'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in `jdbc_connect'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:157:in `open_jdbc_connection'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:214:in `execute_statement'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:345:in `execute_query'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:308:in `block in run'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:234:in `do_call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:258:in `do_trigger'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:300:in `block in start_work_thread'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:299:in `block in start_work_thread'",
"org/jruby/RubyKernel.java:1442:in `loop'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:289:in `block in start_work_thread'"],
:now=>"2022-02-23T17:09:00.476", :last_time=>"2022-02-23T17:09:00.473", :next_time=>"2022-02-23T17:10:00.000", :job=>#<Rufus::Scheduler::CronJob:0x3f33c406 @last_at=nil, @tags=[], @scheduled_at=2022-02-23 17:06:32 +0000, @cron_line=#<Rufus::Scheduler::CronLine:0x381e0c59 @timezone=nil, @weekdays=nil, @days=nil, @seconds=[0], @minutes=nil, @hours=nil, @months=nil, @monthdays=nil, @original="* * * * *">, @last_time=2022-02-23 17:09:00 +0000, @times=nil, @locals={}, @unscheduled_at=nil, @callable=#<Proc:0x42862784@/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:307>, @next_time=2022-02-23 17:10:00 +0000, @local_mutex=#<Thread::Mutex:0x3c105f42>, @mean_work_time=0.05812200000000001, @count=3, @last_work_time=0.006668, @scheduler=#<LogStash::PluginMixins::Jdbc::Scheduler:0x74a4e05d @jobs=#<Rufus::Scheduler::JobArray:0x7dd41e87 @mutex=#<Thread::Mutex:0xe9dbcbe>, @array=[#<Rufus::Scheduler::CronJob:0x3f33c406 ...>]>, @scheduler_lock=#<Rufus::Scheduler::NullLock:0x63bca02c>, @started_at=2022-02-23 17:06:32 +0000, @thread=#<Thread:0x7b819e3a@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler sleep>, @mutexes={}, @work_queue=#<Thread::Queue:0x3ec1aafa>, @frequency=1.0, @_work_threads=[#<Thread:0x25a89003@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler_worker-00 run>], @paused=false, @trigger_lock=#<Rufus::Scheduler::NullLock:0x2393e95a>, @opts={:max_work_threads=>1, :thread_name=>"[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler", :frequency=>1.0}, @thread_key="rufus_scheduler_2054", @max_work_threads=1, @stderr=#<IO:>>, @paused_at=nil, @first_at=2022-02-23 17:06:32 +0000, @opts={}, @id="cron_1645635992.289568_4810128455882558930", @handler=#<Proc:0x42862784@/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:307>, @original="* * * * *">, :opts=>{:max_work_threads=>1, :thread_name=>"[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler", :frequency=>1.0}, :started_at=>2022-02-23 17:06:32 +0000, :thread=>"#<Thread:0x7b819e3a@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler sleep>", :jobs_size=>1, :work_threads_size=>1, :work_queue_size=>0}