elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

Logstash can't connect to Elasticsearch on 127.0.0.1 #4131

Closed · setswei closed this 2 years ago

setswei commented 8 years ago

I am doing a brand-new clean install of Logstash and Elasticsearch (version 2.0), and I am having issues getting Logstash to connect to the Elasticsearch server on the same host. There is no firewall installed, and I can return Elasticsearch server information using curl. Any help/guidance would be greatly appreciated.
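For reference, the curl check mentioned above looks roughly like this (a sketch; 127.0.0.1:9200 is the default bind address assumed throughout this thread):

```shell
# Ask the node for its banner; a running Elasticsearch returns a JSON
# document with the cluster name and version.
curl -s http://127.0.0.1:9200/

# The cluster health endpoint is another quick sanity check; a reachable
# cluster reports a status of green or yellow.
curl -s 'http://127.0.0.1:9200/_cluster/health?pretty'
```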

These are the errors I get:

{:timestamp=>"2015-11-03T13:20:18.499000+1000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://127.0.0.1:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :client_config=>{:hosts=>["http://127.0.0.1:9200/"], :ssl=>nil, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"Failed to load class 'org.jruby.RubyObject$Access4JacksonDeserializer4c401575': com.fasterxml.jackson.module.afterburner.ser.BeanPropertyAccessor", :error_class=>"JrJackson::ParseError", :backtrace=>["com/jrjackson/JrJacksonBase.java:83:in `generate'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.6/lib/jrjackson/jrjackson.rb:59:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapter.rb:25:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json.rb:136:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "org/jruby/RubyArray.java:2414:in `map'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/actions/bulk.rb:82:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:56:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:353:in `submit'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:350:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:382:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:error}

and this as well:

{:timestamp=>"2015-11-03T13:20:18.512000+1000", :message=>"Failed to flush outgoing items", :outgoing_count=>72, :exception=>"JrJackson::ParseError", :backtrace=>["com/jrjackson/JrJacksonBase.java:83:in `generate'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.6/lib/jrjackson/jrjackson.rb:59:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapter.rb:25:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json.rb:136:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "org/jruby/RubyArray.java:2414:in `map'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/actions/bulk.rb:82:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:56:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:353:in `submit'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:350:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:382:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:warn}

Here is a copy of my input and output configuration.

02-input-logcourier.conf

input {
    courier {
        port            => 4400
        transport       => "tcp"
    }

    tcp {
        port => 4401
        type => "proxy"
    }
}

99-output.conf

output {
    if "_jsonparsefailure" in [tags] or "_grokparsefailure" in [tags] {
        file {
            path => "/var/log/logstash/failures.log"
        }
    } else if "_ignore" in [tags] {
        # Do nothing

    } else {
        if [type] == "proxy" {
           elasticsearch {
                hosts              => ["127.0.0.1:9200"]
                index              => "proxy-%{+YYYY.MM.dd}"
                template           => "/etc/logstash/es-templates/template-nginx-proxy.json"
                template_name      => "proxy"
                template_overwrite => true
            }
        } else {
            # not a type failure, but type still not supported
            file {
                path => "/var/log/logstash/failures.log"
            }
        }
    }
} 
purbon commented 8 years ago

@setswei @McStork thanks a lot for your report. I wonder if you can provide an example log line that would help with reproducing this.

purbon commented 8 years ago

Could you please share more info about your environment? Can you gather some example log lines? I would like to reproduce this.

On Tue, Nov 3, 2015 at 10:22 AM McStork notifications@github.com wrote:

I have two Logstash servers in loadbalance feeding ES and these errors happen only one of the two LS.

— Reply to this email directly or view it on GitHub https://github.com/elastic/logstash/issues/4131#issuecomment-153294993.

McStork commented 8 years ago

Sorry, I deleted my last post because the 'message' I have is the same but the exception is different from @setswei's. I will try to find the reason, and I might post another topic if I can't resolve it.

setswei commented 8 years ago

Hey @purbon,

I have built a brand-new server for this one, so I am not affecting production.

It's an Ubuntu 14.04.3 64-bit server.

I am using Log Courier to ship the logs to this machine, which has Logstash, Kibana, and Elasticsearch installed on it (I have installed the log-courier plugin on the Logstash indexer).

Here is an example of some of the logs I am trying to push to Elasticsearch:

[04/Nov/2015:07:49:06 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 103.251.169.64] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:49:09 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:49:30 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:49:50 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:49:56 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 103.251.169.64] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:50:10 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:50:30 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:50:46 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 103.251.169.64] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:50:49 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
[04/Nov/2015:07:51:10 +1000] - 502 502 - POST https app1.domain.com "/api/a481e0a7c0bac36e95a66d5fe4b44bb4" [Client 192.168.0.160] [Length 568] [Gzip -] [Session ] [Sent-to 192.168.0.150] "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36" "-"
suyograo commented 8 years ago

/cc @driskell

See https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/294#issuecomment-153504881

purbon commented 8 years ago

Thanks @suyograo & @setswei for the intel. I agree with @suyograo, this looks like the bug shared from the elasticsearch output plugin.

/cc @guyboertje can you take a look there? Looks like a jrjackson thing.

driskell commented 8 years ago

Reproduced issue with pure JrJackson here: guyboertje/jrjackson#46

driskell commented 8 years ago

I'm looking at releasing an update to the Courier plugin soon to work around this issue.

setswei commented 8 years ago

thanks @driskell

jc21 commented 8 years ago

Hi @driskell, do you have an ETA for the next plugin release?

riguy724 commented 8 years ago

This is really preventing our entire ELK solution from working right now. Any update on the status of this issue?

setswei commented 8 years ago

I have moved back to logstash-forwarder for now until this is resolved.

On Monday, 21 December 2015, Christopher Riley notifications@github.com wrote:

This is really preventing our entire ELK solution from working right now. Any update on the status of this issue?


Thanks Kyle Hartigan

erikanderson commented 8 years ago

Also experiencing this intermittently on our deployment:

{:timestamp=>"2016-01-22T11:24:08.129000-0700", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://127.0.0.1:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :client_config=>{:hosts=>["http://127.0.0.1:9200/"], :ssl=>nil, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"127.0.0.1:9200 failed to respond", :error_class=>"Manticore::ClientProtocolException", :backtrace=>[
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:35:in `initialize'",
            "org/jruby/RubyProc.java:281:in `call'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:70:in `call'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:245:in `call_once'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:148:in `code'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:71:in `perform_request'",
            "org/jruby/RubyProc.java:281:in `call'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:201:in `perform_request'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/client.rb:125:in `perform_request'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/actions/bulk.rb:87:in `bulk'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:66:in `non_threadsafe_bulk'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
            "org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:158:in `safe_bulk'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:100:in `submit'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:56:in `setup_buffer_and_handler'",
            "org/jruby/RubyProc.java:281:in `call'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:109:in `flush_unsafe'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:93:in `interval_flush'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:82:in `spawn_interval_flusher'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:63:in `synchronize'",
            "org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:63:in `synchronize'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:82:in `spawn_interval_flusher'",
            "org/jruby/RubyKernel.java:1479:in `loop'",
            "/srv/logstash/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/buffer.rb:79:in `spawn_interval_flusher'"
], :level=>:error}
guyboertje commented 8 years ago

@erikanderson - Your problem seems different from that of the original poster.

What is your LS config?

robbydaigle commented 8 years ago

Did this ever get resolved? Running into something similar.

GM12Tick commented 8 years ago

Same here, +1. I can't connect my Logstash output to an Elasticsearch that runs on a port other than 9200.
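For the non-default-port case, the port is specified directly in the hosts entry of the elasticsearch output. A minimal sketch, assuming a node listening on port 9201 (an example value, not from this thread):

```
output {
  elasticsearch {
    # 9201 is an assumed example; use whatever port your node listens on.
    hosts => ["127.0.0.1:9201"]
  }
}
```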

untergeek commented 8 years ago

How did we get stuck discussing this here instead of at https://github.com/logstash-plugins/logstash-output-elasticsearch ?

hitxueliang commented 8 years ago

I get the same error. My Elasticsearch is 2.3.3 and Logstash is 2.3.2; each works correctly on its own, but together they do not.

aaratn commented 8 years ago

Hi,

I am getting the same errors with my ELK installation. Here's the message that I am seeing; I am wondering if there's a workaround for this error.

{:timestamp=>"2016-06-02T09:17:20.561000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :error_message=>"URI does not specify a valid host name: http://::1/_bulk", :error_class=>"Manticore::ClientProtocolException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.5-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.5-java/lib/manticore/response.rb:79:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.5-java/lib/manticore/response.rb:256:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.5.5-java/lib/manticore/response.rb:153:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:71:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:201:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/client.rb:125:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/actions/bulk.rb:87:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:163:in `safe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.5-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:293:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:293:in `output_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:224:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.4-java/lib/logstash/pipeline.rb:193:in `start_workers'"], :client_config=>{:hosts=>["http://localhost:9200/"], :ssl=>nil, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :level=>:error}
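The error_message in that log ("URI does not specify a valid host name: http://::1/_bulk") suggests that localhost is resolving to the IPv6 loopback address ::1, which the HTTP client then fails to turn into a valid URI. A possible workaround, assuming the node is bound to the IPv4 loopback, is to put the literal address in the output config instead of the hostname:

```
output {
  elasticsearch {
    # "localhost" may resolve to the IPv6 address ::1 on some systems;
    # the literal IPv4 loopback avoids that resolution entirely.
    hosts => ["127.0.0.1:9200"]
  }
}
```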

jsvd commented 2 years ago

Many things have changed since the issue was created, and this issue itself contains multiple reports that may relate to different underlying problems. Closing now; please open a new issue if necessary.