Closed · tinder-marktsujihara closed this issue 5 years ago
The Gemfile after plugin installation (output of cat Gemfile):
source "https://rubygems.org"

gem "logstash-core", :path => "./logstash-core"
gem "logstash-core-plugin-api", :path => "./logstash-core-plugin-api"
gem "paquet", "~> 0.2.0"
gem "ruby-progressbar", "~> 1.8.1"
gem "builder", "~> 3.2.2"
gem "ci_reporter_rspec", "1.0.0", :group => :development
gem "tins", "1.6", :group => :development
gem "rspec", "~> 3.5", :group => :development
gem "logstash-devutils", "= 1.3.5", :group => :development
gem "benchmark-ips", :group => :development
gem "octokit", "3.8.0", :group => :build
gem "stud", "~> 0.0.22", :group => :build
gem "fpm", "~> 1.3.3", :group => :build
gem "rubyzip", "~> 1.2.1", :group => :build
gem "gems", "~> 0.8.3", :group => :build
gem "rack-test", :require => "rack/test", :group => :development
gem "flores", "~> 0.0.6", :group => :development
gem "term-ansicolor", "~> 1.3.2", :group => :development
gem "json-schema", "~> 2.6", :group => :development
gem "belzebuth", :group => :development
gem "pleaserun", "~>0.0.28"
gem "webrick", "~> 1.3.1"
gem "atomic", "<= 1.1.99"
gem "rake", "~> 12.2.1", :group => :build
gem "logstash-codec-cef"
gem "logstash-codec-collectd"
gem "logstash-codec-dots"
gem "logstash-codec-edn"
gem "logstash-codec-edn_lines"
gem "logstash-codec-es_bulk"
gem "logstash-codec-fluent"
gem "logstash-codec-graphite"
gem "logstash-codec-json"
gem "logstash-codec-json_lines"
gem "logstash-codec-line"
gem "logstash-codec-msgpack"
gem "logstash-codec-multiline"
gem "logstash-codec-netflow"
gem "logstash-codec-plain"
gem "logstash-codec-rubydebug"
gem "logstash-filter-aggregate"
gem "logstash-filter-anonymize"
gem "logstash-filter-cidr"
gem "logstash-filter-clone"
gem "logstash-filter-csv"
gem "logstash-filter-date"
gem "logstash-filter-de_dot"
gem "logstash-filter-dissect"
gem "logstash-filter-dns"
gem "logstash-filter-drop"
gem "logstash-filter-elasticsearch"
gem "logstash-filter-fingerprint"
gem "logstash-filter-geoip"
gem "logstash-filter-grok"
gem "logstash-filter-jdbc_static"
gem "logstash-filter-jdbc_streaming"
gem "logstash-filter-json"
gem "logstash-filter-kv"
gem "logstash-filter-metrics"
gem "logstash-filter-mutate"
gem "logstash-filter-ruby"
gem "logstash-filter-sleep"
gem "logstash-filter-split"
gem "logstash-filter-syslog_pri"
gem "logstash-filter-throttle"
gem "logstash-filter-translate"
gem "logstash-filter-truncate"
gem "logstash-filter-urldecode"
gem "logstash-filter-useragent"
gem "logstash-filter-xml"
gem "logstash-input-beats"
gem "logstash-input-azure_event_hubs"
gem "logstash-input-dead_letter_queue"
gem "logstash-input-elasticsearch"
gem "logstash-input-exec"
gem "logstash-input-file"
gem "logstash-input-ganglia"
gem "logstash-input-gelf"
gem "logstash-input-generator"
gem "logstash-input-graphite"
gem "logstash-input-heartbeat"
gem "logstash-input-http"
gem "logstash-input-http_poller"
gem "logstash-input-imap"
gem "logstash-input-jdbc"
gem "logstash-input-kafka"
gem "logstash-input-pipe"
gem "logstash-input-rabbitmq"
gem "logstash-input-redis"
gem "logstash-input-s3"
gem "logstash-input-snmp"
gem "logstash-input-snmptrap"
gem "logstash-input-sqs"
gem "logstash-input-stdin"
gem "logstash-input-syslog"
gem "logstash-input-tcp"
gem "logstash-input-twitter"
gem "logstash-input-udp"
gem "logstash-input-unix"
gem "logstash-output-elastic_app_search"
gem "logstash-output-cloudwatch"
gem "logstash-output-csv"
gem "logstash-output-elasticsearch"
gem "logstash-output-email"
gem "logstash-output-file"
gem "logstash-output-graphite"
gem "logstash-output-http"
gem "logstash-output-kafka"
gem "logstash-output-lumberjack"
gem "logstash-output-nagios"
gem "logstash-output-null"
gem "logstash-output-pagerduty"
gem "logstash-output-pipe"
gem "logstash-output-rabbitmq"
gem "logstash-output-redis"
gem "logstash-output-s3", ">=4.0.9", "<5.0.0"
gem "logstash-output-sns"
gem "logstash-output-sqs"
gem "logstash-output-stdout"
gem "logstash-output-tcp"
gem "logstash-output-udp"
gem "logstash-output-webhdfs"
gem "logstash-input-okta_system_log", "0.9.0", :path => "vendor/local_gems/896ec473/logstash-input-okta_system_log-0.9.0"
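For anyone hitting the install step itself: bin/logstash-plugin only accepts a gem whose gemspec declares the logstash_plugin metadata. A minimal sketch of the fields involved (illustrative values, not the plugin's actual gemspec):

```ruby
# Illustrative minimal gemspec for a Logstash input plugin -- the field
# values here are assumptions, not the real okta_system_log gemspec.
spec = Gem::Specification.new do |s|
  s.name    = "logstash-input-okta_system_log"
  s.version = "0.9.0"
  s.summary = "Logstash input for the Okta System Log API"
  s.authors = ["example"]
  # The registry resolves input plugin `foo` to lib/logstash/inputs/foo.rb,
  # so the gem has to ship that file.
  s.files   = ["lib/logstash/inputs/okta_system_log.rb"]
  # Without this metadata, `bin/logstash-plugin install` rejects the gem
  # as not being a Logstash plugin.
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
end

puts spec.metadata["logstash_plugin"]
# => true
```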
Never mind, there was an error with my build process, got it working, thanks!
Glad it worked out!
I am building a logstash Docker container with the Okta plugin installed. The Dockerfile builds fine, and the container starts, but says it cannot find the okta_system_log plugin. Is there a specific version of Logstash recommended?
Dockerfile:

FROM docker.elastic.co/logstash/logstash:6.5.4
ADD ./logstash-input-okta_system_log-master /usr/share/logstash/logstash-input-okta_system_log-master
USER root
RUN yum install ruby ruby-bundler -y
USER logstash
RUN gem build /usr/share/logstash/logstash-input-okta_system_log-master/logstash-input-okta_system_log.gemspec
RUN bin/logstash-plugin install /usr/share/logstash/logstash-input-okta_system_log-0.9.0.gem
ENTRYPOINT [ "/usr/local/bin/docker-entrypoint" ]
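One thing worth noting about the build step: gem build writes the .gem file into the current working directory, so where the artifact ends up depends on the WORKDIR at that point. Since the thread resolved as a local build-process error, the following is only a hedged sketch of a variant that pins the build directory explicitly -- the paths and the --no-verify choice are assumptions, not a confirmed fix:

```dockerfile
# Sketch only -- an assumed-correct variant, not a confirmed fix.
FROM docker.elastic.co/logstash/logstash:6.5.4
ADD ./logstash-input-okta_system_log-master /usr/share/logstash/logstash-input-okta_system_log-master
USER root
RUN yum install -y ruby ruby-bundler
USER logstash
# Build from inside the plugin directory so the .gem artifact lands in a
# known location (gem build writes to the current working directory).
WORKDIR /usr/share/logstash/logstash-input-okta_system_log-master
RUN gem build logstash-input-okta_system_log.gemspec
WORKDIR /usr/share/logstash
# --no-verify skips dependency verification; drop it to keep the check.
RUN bin/logstash-plugin install --no-verify \
    logstash-input-okta_system_log-master/logstash-input-okta_system_log-0.9.0.gem
ENTRYPOINT [ "/usr/local/bin/docker-entrypoint" ]
```

After building, running bin/logstash-plugin list inside the container and looking for okta_system_log is a quick way to confirm the plugin actually registered before starting a pipeline.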
Stdout Logs:

Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2019-01-30 20:32:59.003 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2019-01-30 20:32:59.010 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[WARN ] 2019-01-30 20:32:59.390 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-01-30 20:32:59.399 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.5.4"}
[INFO ] 2019-01-30 20:32:59.419 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"939bfca3-59d8-404b-8687-13a7a7b785ac", :path=>"/usr/share/logstash/data/uuid"}
[WARN ] 2019-01-30 20:32:59.858 [LogStash::Runner] pipelineregisterhook - xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set xpack.monitoring.enabled: true in logstash.yml
[INFO ] 2019-01-30 20:33:00.662 [LogStash::Runner] licensereader - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://redacted:9200/]}}
[WARN ] 2019-01-30 20:33:00.842 [LogStash::Runner] licensereader - Restored connection to ES instance {:url=>"http://redacted:9200/"}
[INFO ] 2019-01-30 20:33:01.008 [LogStash::Runner] licensereader - ES Output version determined {:es_version=>6}
[WARN ] 2019-01-30 20:33:01.010 [LogStash::Runner] licensereader - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2019-01-30 20:33:01.085 [LogStash::Runner] internalpipelinesource - Monitoring License OK
[INFO ] 2019-01-30 20:33:01.087 [LogStash::Runner] internalpipelinesource - Validated license for monitoring. Enabling monitoring pipeline.
[ERROR] 2019-01-30 20:33:02.496 [Converge PipelineAction::Create<main>] registry - Tried to load a plugin's code, but failed. {:exception=>#<LoadError: no such file to load -- logstash/inputs/okta_system_log>, :path=>"logstash/inputs/okta_system_log", :type=>"input", :name=>"okta_system_log"}
[ERROR] 2019-01-30 20:33:02.516 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::PluginLoadingError", :message=>"Couldn't find any input plugin named 'okta_system_log'. Are you sure this is correct? Trying to load the okta_system_log input plugin resulted in this error: no such file to load -- logstash/inputs/okta_system_log", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:211:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:137:in `lookup'", "org/logstash/plugins/PluginFactoryExt.java:210:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:166:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"]}
[WARN ] 2019-01-30 20:33:02.975 [Converge PipelineAction::Create<.monitoring-logstash>] elasticsearch - You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", hosts=>[http://redacted:9200], sniffing=>false, manage_template=>false, id=>"e9bc7b42971fffebfe2dc463e3c7e70cdd73ca6715ae92794955b773b9bfbbfb", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_c5389b87-afdc-4920-be26-f06f2561725c", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[INFO ] 2019-01-30 20:33:03.039 [Converge PipelineAction::Create<.monitoring-logstash>] pipeline - Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
[INFO ] 2019-01-30 20:33:03.132 [[.monitoring-logstash]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://redacted:9200/]}}
[WARN ] 2019-01-30 20:33:03.194 [[.monitoring-logstash]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://redacted:9200/"}
[INFO ] 2019-01-30 20:33:03.201 [[.monitoring-logstash]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2019-01-30 20:33:03.202 [[.monitoring-logstash]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[ERROR] 2019-01-30 20:33:03.209 [Converge PipelineAction::Create<main>] registry - Tried to load a plugin's code, but failed. {:exception=>#<LoadError: no such file to load -- logstash/inputs/okta_system_log>, :path=>"logstash/inputs/okta_system_log", :type=>"input", :name=>"okta_system_log"}
[ERROR] 2019-01-30 20:33:03.210 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::PluginLoadingError", :message=>"Couldn't find any input plugin named 'okta_system_log'. Are you sure this is correct? Trying to load the okta_system_log input plugin resulted in this error: no such file to load -- logstash/inputs/okta_system_log", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:211:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:137:in `lookup'", "org/logstash/plugins/PluginFactoryExt.java:210:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:166:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"]}
[INFO ] 2019-01-30 20:33:03.215 [[.monitoring-logstash]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://redacted:9200"]}
[INFO ] 2019-01-30 20:33:03.355 [Converge PipelineAction::Create<.monitoring-logstash>] pipeline - Pipeline started successfully {:pipeline_id=>".monitoring-logstash", :thread=>"#
[INFO ] 2019-01-30 20:33:03.634 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

logstash.yml: |
  http.host: "0.0.0.0"
  path.config: /usr/share/logstash/pipeline
  xpack.monitoring.elasticsearch.url: http://redacted:9200
  xpack.monitoring.enabled: true
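The LoadError above is mechanical: when the pipeline config names an input plugin, Logstash's registry requires a file derived from the plugin type and name, and the error means no installed gem provides that file -- i.e. the gem never made it into Logstash's own bundle. A rough illustration of the mapping (simplified, not the registry's actual code):

```ruby
# Simplified illustration of how a plugin reference in the pipeline config
# maps to the file the registry tries to require -- not Logstash's real code.
def plugin_require_path(type, name)
  "logstash/#{type}s/#{name}"
end

# input { okta_system_log { ... } }  =>  require "logstash/inputs/okta_system_log"
puts plugin_require_path("input", "okta_system_log")
# => logstash/inputs/okta_system_log
```

So the gem must ship lib/logstash/inputs/okta_system_log.rb and be installed via bin/logstash-plugin into Logstash's bundled JRuby environment -- a gem installed into the system Ruby (the one yum provides) is invisible to Logstash.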
logstash.conf: |
  input {
    okta_system_log {
      schedule => { every => "30s" }
      limit => 1000
      auth_token_key => "${OKTA_API_KEY}"
      hostname => "redacted"
    }
  }