openchung opened 5 years ago
Update on my issue: I upgraded the gem to version 0.8.0 and still get the error. The stack trace follows:
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-mixin-plaintextformatter' version '0.2.6'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-elasticsearch' version '1.17.2'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-genhashvalue' version '0.04'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-kafka' version '0.8.0'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-mongo' version '0.8.1'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '1.5.6'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-s3' version '0.8.5'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-scribe' version '0.10.14'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-td' version '0.10.29'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-td-monitoring' version '0.2.3'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-webhdfs' version '0.7.1'
2018-10-21 12:32:38 +0800 [info]: gem 'fluentd' version '0.12.40'
2018-10-21 12:32:38 +0800 [info]: adding match pattern="audit.*" type="kafka_buffered"
2018-10-21 12:32:38 +0800 [trace]: registered output plugin 'kafka_buffered'
2018-10-21 12:32:38 +0800 [info]: brokers has been set directly: ["192.168.5.129"]
2018-10-21 12:32:38 +0800 [info]: adding match pattern="**" type="stdout"
2018-10-21 12:32:38 +0800 [info]: adding source type="tail"
2018-10-21 12:32:38 +0800 [info]: using configuration file:
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 12:33:38 +0800 [info]: initialized kafka producer: kafka
2018-10-21 12:33:38 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2018-10-21 12:33:39 +0800 error_class="Kafka::ConnectionError" error="Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached" plugin_id="object:3f8168cb0bc8"
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Sending sasl_handshake API request 1 to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Waiting for response 1 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Received response 1 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"GSSAPI: Initializing context with 192.168.5.129:9092, principal kafka/cipkafka1t.testesunbank.com.tw@KAFKA.COM"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Sending topic_metadata API request 2 to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Waiting for response 2 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.error: {"message":"Failed to fetch metadata from kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 12:33:38 +0800 fluent.warn: {"message":"Send exception occurred: Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 12:33:38 +0800 fluent.warn: {"message":"Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'"}
2018-10-21 12:33:38 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-21 12:33:38 +0800 fluent.warn: {"next_retry":"2018-10-21 12:33:39 +0800","error_class":"Kafka::ConnectionError","error":"Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached","plugin_id":"object:3f8168cb0bc8","message":"temporarily failed to flush the buffer. next_retry=2018-10-21 12:33:39 +0800 error_class=\"Kafka::ConnectionError\" error=\"Could not connect to any of the seed brokers:\\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached\" plugin_id=\"object:3f8168cb0bc8\""}
Following are my local gems (LOCAL GEMS):
actionmailer (4.2.8) actionpack (4.2.8) actionview (4.2.8) activejob (4.2.8) activemodel (4.2.8) activerecord (4.2.8) activesupport (4.2.8) addressable (2.5.2, 2.5.1) arel (6.0.4) aws-sdk (2.10.45) aws-sdk-core (2.10.45) aws-sdk-resources (2.10.45) aws-sigv4 (1.0.2) base91 (0.0.1) bigdecimal (default: 1.2.4) bson (4.1.1) builder (3.2.3) bundler (1.14.5) celluloid (0.15.2) cool.io (1.5.1) diff-lcs (1.3) digest-crc (0.4.1) draper (1.4.0) elasticsearch (5.0.5) elasticsearch-api (5.0.5) elasticsearch-transport (5.0.5) erubis (2.7.0) excon (0.62.0) faraday (0.13.1) ffi (1.9.25) fluent-logger (0.7.1) fluent-mixin-plaintextformatter (0.2.6) fluent-plugin-elasticsearch (1.17.2) fluent-plugin-genhashvalue (0.04) fluent-plugin-kafka (0.8.0) fluent-plugin-mongo (0.8.1) fluent-plugin-rewrite-tag-filter (1.5.6) fluent-plugin-s3 (0.8.5) fluent-plugin-scribe (0.10.14) fluent-plugin-td (0.10.29) fluent-plugin-td-monitoring (0.2.3) fluent-plugin-webhdfs (0.7.1) fluentd (0.12.40) fluentd-ui (0.4.4) font-awesome-rails (4.7.0.1) globalid (0.4.0) gssapi (1.2.0) haml (4.0.7) haml-rails (0.5.3) hike (1.2.3) hirb (0.7.3) http_parser.rb (0.6.0) httpclient (2.8.2.4) i18n (0.8.1) io-console (default: 0.4.3) ipaddress (0.8.3) jbuilder (2.6.3) jmespath (1.3.1) jquery-rails (3.1.4) json (default: 1.8.1) kramdown (1.13.2) kramdown-haml (0.0.3) loofah (2.0.3) ltsv (0.1.0) mail (2.6.4) mime-types (3.1) mime-types-data (3.2016.0521) mini_portile2 (2.3.0, 2.1.0) minitest (5.10.1, default: 4.7.5) mixlib-cli (1.7.0) mixlib-config (2.2.4) mixlib-log (1.7.1) mixlib-shellout (2.2.7) mongo (2.2.7) msgpack (1.1.0) multi_json (1.12.1) multipart-post (2.0.0) nokogiri (1.8.1) ohai (6.20.0) oj (2.18.1) parallel (1.8.0) psych (default: 2.0.5) public_suffix (3.0.0, 2.0.5) puma (3.8.2) rack (1.6.5) rack-test (0.6.3) rails (4.2.8) rails-deprecated_sanitizer (1.0.3) rails-dom-testing (1.0.8) rails-html-sanitizer (1.0.3) railties (4.2.8) rake (default: 10.1.0) rdoc (default: 4.1.0) request_store (1.3.2) ruby-kafka 
(0.7.3) ruby-progressbar (1.8.3) rubyzip (1.2.1, 1.1.7) sass (3.2.19) sass-rails (4.0.5) settingslogic (2.0.9) sigdump (0.2.4) sprockets (2.12.4) sprockets-rails (2.3.3) string-scrub (0.0.5) sucker_punch (1.0.5) systemu (2.5.2) td (0.15.2) td-client (0.8.85) td-logger (0.3.27) test-unit (default: 2.1.10.0) thor (0.19.4) thread_safe (0.3.6) thrift (0.8.0) tilt (1.4.1) timers (1.1.0) tzinfo (1.2.3) tzinfo-data (1.2017.2) uuidtools (2.1.5) webhdfs (0.8.0) yajl-ruby (1.3.0) zip-zip (0.3)
Please help me. Thanks a lot.
Does anyone know how to fix this problem? The Kerberos feature was contributed by a user and depends on ruby-kafka's Kerberos support. So Kerberos support should work on a basic setup, but I'm not sure what is wrong.
I have asked the question in ruby-kafka's issues (#670). We have an urgent need for help.
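For reference, a GSSAPI-enabled output section usually looks something like the sketch below. This is only an illustration: the topic name, principal, and keytab path are placeholders, and the exact parameter names for Kerberos should be checked against the fluent-plugin-kafka README for the installed version.

```
<match audit.*>
  @type kafka_buffered
  brokers 192.168.5.129:9092
  default_topic audit

  # Kerberos/GSSAPI settings (verify names against the plugin README;
  # principal and keytab path below are placeholders)
  principal kafka/broker.example.com@KAFKA.COM
  keytab /etc/td-agent/kafka.keytab
</match>
```

These values are handed through to ruby-kafka's SASL GSSAPI support, so any problem visible with this config usually reproduces with a bare ruby-kafka client as well.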
@openchung I am facing the same issue for Kerberos with GSSAPI. Were you/anyone able to fix it?
Exact same issue in my environment. Is there any success story of working GSSAPI?
Same issue in my environment with a keytab and principal but no krb5.conf.
I think that config is incomplete; you won't find the KDC server address without krb5.conf. Any success story?
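To illustrate that point: without /etc/krb5.conf the GSSAPI library has no way to locate the KDC for the realm. A minimal sketch is below; the realm KAFKA.COM comes from the principal shown in the logs above, while the KDC hostname and domain mappings are placeholders to adapt.

```ini
[libdefaults]
  default_realm = KAFKA.COM

[realms]
  KAFKA.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }

[domain_realm]
  .example.com = KAFKA.COM
```

If `kinit` works on the same host, the krb5.conf is usually already adequate, and the problem is more likely in how the client library initializes the GSSAPI context.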
I have the same issue. I've confirmed the keytab/principal with kinit. Has anyone gotten this working?
Solved?
This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove stale label or comment or this issue will be closed in 30 days
I have the same problem. I've provided the principal and keytab, and I can validate with kinit that the keytab and principal are valid; it returns a Kerberos ticket. Any news or success story?
@ashie could you please update us on whether this issue is resolved? If any workaround is available for the above issue, that would be helpful.
As long as the issue in the Ruby library is not solved (https://github.com/zendesk/ruby-kafka/issues/670), this one can't be solved either :/
I'd suggest using rdkafka instead. It works just fine with Fluentd Kerberos authN.
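For anyone following that suggestion, fluent-plugin-kafka also ships rdkafka-based outputs that let you pass librdkafka properties directly, so Kerberos is handled by librdkafka rather than the ruby-kafka GSSAPI code. A sketch is below; the broker, principal, and keytab path are placeholders, and the output type name and `rdkafka_options` plumbing should be verified against the plugin README for your version.

```
<match audit.*>
  @type rdkafka2
  brokers broker.example.com:9092
  default_topic audit

  # librdkafka properties passed straight through
  # (see librdkafka CONFIGURATION.md for the canonical names)
  rdkafka_options {
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "GSSAPI",
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.principal": "fluentd@KAFKA.COM",
    "sasl.kerberos.keytab": "/etc/td-agent/fluentd.keytab"
  }
</match>
```

Note that the rdkafka outputs require the rdkafka gem (and the librdkafka C library) to be installed alongside td-agent.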
I use fluentd to send JSON logs to a SASL_SSL Cloudera Kafka cluster, but I hit the following warning, which causes sends to fail. I have verified my keytab and principal using kinit.
2018-10-17 07:45:10 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2018-10-17 07:47:30 +0800 error_class="GSSAPI::GssApiError" error="gss_init_sec_context did not return GSS_S_COMPLETE" plugin_id="object:3f84a1db517c"
2018-10-17 07:45:10 +0800 [warn]: suppressed same stacktrace
2018-10-17 07:45:10 +0800 fluent.warn: {"message":"Send exception occurred: gss_init_sec_context did not return GSS_S_COMPLETE"}
2018-10-17 07:45:10 +0800 fluent.warn: {"message":"Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/gssapi-1.2.0/lib/gssapi/simple.rb:95:in `init_context'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:72:in `initialize_gssapi_context'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:25:in `authenticate!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl_authenticator.rb:51:in `authenticate!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/connection_builder.rb:27:in `build_connection'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:184:in `connection'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:170:in `send_request'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:44:in `fetch_metadata'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:386:in `block in fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `each'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'"}
2018-10-17 07:45:10 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-17 07:45:10 +0800 fluent.warn: {"next_retry":"2018-10-17 07:47:30 +0800","error_class":"GSSAPI::GssApiError","error":"gss_init_sec_context did not return GSS_S_COMPLETE","plugin_id":"object:3f84a1db517c","message":"temporarily failed to flush the buffer. next_retry=2018-10-17 07:47:30 +0800 error_class=\"GSSAPI::GssApiError\" error=\"gss_init_sec_context did not return GSS_S_COMPLETE\" plugin_id=\"object:3f84a1db517c\""}
Following is my td-agent.conf
Following is my dependency actionmailer (4.2.8) actionpack (4.2.8) actionview (4.2.8) activejob (4.2.8) activemodel (4.2.8) activerecord (4.2.8) activesupport (4.2.8) addressable (2.5.2, 2.5.1) arel (6.0.4) aws-sdk (2.10.45) aws-sdk-core (2.10.45) aws-sdk-resources (2.10.45) aws-sigv4 (1.0.2) base91 (0.0.1) bigdecimal (default: 1.2.4) bson (4.1.1) builder (3.2.3) bundler (1.14.5) celluloid (0.15.2) cool.io (1.5.1) diff-lcs (1.3) draper (1.4.0) elasticsearch (5.0.5) elasticsearch-api (5.0.5) elasticsearch-transport (5.0.5) erubis (2.7.0) excon (0.62.0) faraday (0.13.1) ffi (1.9.25) fluent-logger (0.7.1) fluent-mixin-plaintextformatter (0.2.6) fluent-plugin-elasticsearch (1.17.2) fluent-plugin-genhashvalue (0.04) fluent-plugin-kafka (0.7.9, 0.6.1) fluent-plugin-mongo (0.8.1) fluent-plugin-rewrite-tag-filter (1.5.6) fluent-plugin-s3 (0.8.5) fluent-plugin-scribe (0.10.14) fluent-plugin-td (0.10.29) fluent-plugin-td-monitoring (0.2.3) fluent-plugin-webhdfs (0.7.1) fluentd (0.12.40) fluentd-ui (0.4.4) font-awesome-rails (4.7.0.1) globalid (0.4.0) gssapi (1.2.0) haml (4.0.7) haml-rails (0.5.3) hike (1.2.3) hirb (0.7.3) http_parser.rb (0.6.0) httpclient (2.8.2.4) i18n (0.8.1) io-console (default: 0.4.3) ipaddress (0.8.3) jbuilder (2.6.3) jmespath (1.3.1) jquery-rails (3.1.4) json (default: 1.8.1) kramdown (1.13.2) kramdown-haml (0.0.3) loofah (2.0.3) ltsv (0.1.0) mail (2.6.4) mime-types (3.1) mime-types-data (3.2016.0521) mini_portile2 (2.3.0, 2.1.0) minitest (5.10.1, default: 4.7.5) mixlib-cli (1.7.0) mixlib-config (2.2.4) mixlib-log (1.7.1) mixlib-shellout (2.2.7) mongo (2.2.7) msgpack (1.1.0) multi_json (1.12.1) multipart-post (2.0.0) nokogiri (1.8.1) ohai (6.20.0) oj (2.18.1) parallel (1.8.0) psych (default: 2.0.5) public_suffix (3.0.0, 2.0.5) puma (3.8.2) rack (1.6.5) rack-test (0.6.3) rails (4.2.8) rails-deprecated_sanitizer (1.0.3) rails-dom-testing (1.0.8) rails-html-sanitizer (1.0.3) railties (4.2.8) rake (default: 10.1.0) rdoc (default: 4.1.0) request_store 
(1.3.2) ruby-kafka (0.6.8) ruby-progressbar (1.8.3) rubyzip (1.2.1, 1.1.7) sass (3.2.19) sass-rails (4.0.5) settingslogic (2.0.9) sigdump (0.2.4) sprockets (2.12.4) sprockets-rails (2.3.3) string-scrub (0.0.5) sucker_punch (1.0.5) systemu (2.5.2) td (0.15.2) td-client (0.8.85) td-logger (0.3.27) test-unit (default: 2.1.10.0) thor (0.19.4) thread_safe (0.3.6) thrift (0.8.0) tilt (1.4.1) timers (1.1.0) tzinfo (1.2.3) tzinfo-data (1.2017.2) uuidtools (2.1.5) webhdfs (0.8.0) yajl-ruby (1.3.0) zip-zip (0.3)
Please help me.