telekom-security / tpotce

🍯 T-Pot - The All In One Multi Honeypot Platform 🐝

logstash error for fresh installs #1336

Closed · cpyix closed this 1 year ago

cpyix commented 1 year ago

Logstash is not starting up after a fresh install. This seems to be related to yesterday's version upgrade in commit 1a2d34c.

Error from logstash container: [ERROR] 2023-05-31 12:24:39.310 [[logstash]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"logstash", :exception=>#<LogStash::Filters::Dictionary::DictionaryFileError: Translate: The incoming YAML document exceeds the limit: 3145728 code points. when loading dictionary file at /etc/listbot/iprep.yaml>, :backtrace=>["org.yaml.snakeyaml.scanner.ScannerImpl.fetchMoreTokens(ScannerImpl.java:342)", "org.yaml.snakeyaml.scanner.ScannerImpl.checkToken(ScannerImpl.java:263)", "org.yaml.snakeyaml.parser.ParserImpl$ParseBlockMappingKey.produce(ParserImpl.java:662)", "org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:185)", "org.yaml.snakeyaml.parser.ParserImpl.getEvent(ParserImpl.java:195)", "org.jruby.ext.psych.PsychParser.parse(PsychParser.java:210)", "org.jruby.ext.psych.PsychParser$INVOKER$i$parse.call(PsychParser$INVOKER$i$parse.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:204)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:325)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:86)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:201)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:188)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:218)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:173)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:316)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:142)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.RubyMethod.call(RubyMethod.java:116)", "org.jruby.RubyMethod$INVOKER$i$call.call(RubyMethod$INVOKER$i$call.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroOrNBlock.call(JavaMethod.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:142)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:86)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:201)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:188)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:218)", 
"org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:173)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:316)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:85)", "org.jruby.RubyClass.newInstance(RubyClass.java:911)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:549)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:85)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:549)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.RubyClass.finvoke(RubyClass.java:572)", "org.jruby.runtime.Helpers.invoke(Helpers.java:649)", "org.jruby.RubyBasicObject.callMethod(RubyBasicObject.java:348)", "org.logstash.config.ir.compiler.FilterDelegatorExt.doRegister(FilterDelegatorExt.java:88)", "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.register(AbstractFilterDelegatorExt.java:75)", "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt$INVOKER$i$0$0$register.call(AbstractFilterDelegatorExt$INVOKER$i$0$0$register.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:142)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.doYield(IRBlockBody.java:170)", "org.jruby.runtime.BlockBody.yield(BlockBody.java:108)", "org.jruby.runtime.Block.yield(Block.java:188)", "org.jruby.RubyArray.each(RubyArray.java:1865)", "org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:560)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:85)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:94)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:546)", 
"org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:86)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:201)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:188)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:218)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:372)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:175)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:316)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:351)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:144)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:351)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:144)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:164)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:151)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:351)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:144)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:309)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)", "java.base/java.lang.Thread.run(Thread.java:833)"], "pipeline.sources"=>["/etc/logstash/logstash.conf"], 
:thread=>"#<Thread:0xb35cb4c@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"} [INFO ] 2023-05-31 12:24:39.311 [[logstash]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>"logstash"} [ERROR] 2023-05-31 12:24:39.318 [Converge PipelineAction::Create] agent - Failed to execute action {:id=>:logstash, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

No data is shown in Kibana. Kibana reports: No matching indices found: No indices match "logstash-*"

reneinfrateam commented 1 year ago

I also hit this today. Converting iprep.yaml to JSON and changing the translate filter to load that file instead helps.

May not be the right solution, but it works :-)
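
The workaround amounts to something like the following minimal sketch (the file paths and the PyYAML dependency are assumptions, not the project's official layout):

```python
#!/usr/bin/env python3
# Minimal sketch of the YAML-to-JSON workaround described above.
# Assumptions: the dictionary lives at /etc/listbot/iprep.yaml and
# PyYAML is installed (pip install pyyaml).
import json

import yaml

SRC = "/etc/listbot/iprep.yaml"   # assumed source dictionary
DST = "/etc/listbot/iprep.json"   # target file for the translate filter

with open(SRC, encoding="utf-8") as f:
    data = yaml.safe_load(f)      # the flat key/value translate dictionary

with open(DST, "w", encoding="utf-8") as f:
    json.dump(data, f, ensure_ascii=False)

print(f"wrote {DST}")
```

The translate filter's `dictionary_path` option would then point at the .json file, which bypasses the YAML parser and its size limit entirely.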

t3chn0m4g3 commented 1 year ago

Thank you, I will look into it. Tests ran fine and did not pick up any exceptions though.

t3chn0m4g3 commented 1 year ago

Great, the error is described here. During testing the file of course did not exceed the limit.
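
The 3145728 figure in the error above is the YAML parser's default code point limit; a quick way to check whether the downloaded dictionary exceeds it (the file path is an assumption):

```python
# Quick check against the 3,145,728 code point limit reported in the
# Logstash error above. The path is an assumption.
PATH = "/etc/listbot/iprep.yaml"
LIMIT = 3_145_728

with open(PATH, encoding="utf-8") as f:
    n = len(f.read())  # a Python str's length is its code point count

print(f"{n} code points; exceeds limit: {n > LIMIT}")
```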

t3chn0m4g3 commented 1 year ago

@cpyix, @reneinfrateam I re-created the logstash Docker image with the updated logstash-filter-translate plugin as a quick fix, to avoid upgrading to 8.7.x for now. Please let me know if this fixes it on your end as well.

2GT-Rich commented 1 year ago

Can confirm: after running update.sh the errors are gone and the indices appear in Elasticvue.

t3chn0m4g3 commented 1 year ago

@2GT-Rich Thanks for the swift feedback!

2GT-Rich commented 1 year ago

> @2GT-Rich Thanks for the swift feedback!

Thank you for the quick resolution!

cpyix commented 1 year ago

Works fine now. Thanks!

vordenken commented 1 year ago

I don't know if this is the same problem my installation has, but my logstash also doesn't work anymore. CPU usage is nearly 100% all the time. I ran the update script, but it didn't fix it.

Here are the logs from the logstash container:

Connection to Listbot looks good, now downloading latest translation maps.

06/01 07:35:25 [NOTICE] Downloading 1 item(s)

06/01 07:35:26 [NOTICE] Download complete: /etc/listbot/cve.yaml.bz2

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
c5fa0c|OK  |   1.0MiB/s|/etc/listbot/cve.yaml.bz2

Status Legend:
(OK):download completed.

06/01 07:35:26 [NOTICE] Downloading 1 item(s)

06/01 07:35:26 [NOTICE] Download complete: /etc/listbot/iprep.yaml.bz2

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
fc06e1|OK  |   5.4MiB/s|/etc/listbot/iprep.yaml.bz2

Status Legend:
(OK):download completed.
T-Pot ILM already configured or ES not available.

Using bundled JDK: /usr/share/logstash/jdk
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2023-06-01 07:36:37.665 [main] runner - Starting Logstash {"logstash.version"=>"8.6.2", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.6+10 on 17.0.6+10 +indy +jit [x86_64-linux]"}
[INFO ] 2023-06-01 07:36:37.679 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Xms1024m, -Xmx1024m, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[INFO ] 2023-06-01 07:36:37.712 [main] settings - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2023-06-01 07:36:37.715 [main] settings - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[INFO ] 2023-06-01 07:36:38.495 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"657e1cef-7348-4b18-b15a-b5322b5aca92", :path=>"/usr/share/logstash/data/uuid"}
[INFO ] 2023-06-01 07:36:40.911 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2023-06-01 07:36:41.640 [Converge PipelineAction::Create<http_input>] Reflections - Reflections took 473 ms to scan 1 urls, producing 127 keys and 444 values
[INFO ] 2023-06-01 07:36:43.316 [Converge PipelineAction::Create<http_input>] javapipeline - Pipeline `http_input` is configured with `pipeline.ecs_compatibility: disabled` setting. All plugins in this pipeline will default to `ecs_compatibility => disabled` unless explicitly configured otherwise.
[INFO ] 2023-06-01 07:36:43.338 [[http_input]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
[INFO ] 2023-06-01 07:36:43.738 [[http_input]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[WARN ] 2023-06-01 07:36:44.124 [[http_input]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[INFO ] 2023-06-01 07:36:44.169 [[http_input]-pipeline-manager] elasticsearch - Elasticsearch version determined (8.6.2) {:es_version=>8}
[WARN ] 2023-06-01 07:36:44.169 [[http_input]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[INFO ] 2023-06-01 07:36:44.206 [[http_input]-pipeline-manager] elasticsearch - Not eligible for data streams because ecs_compatibility is not enabled. Elasticsearch data streams require that events adhere to the Elastic Common Schema. While `ecs_compatibility` can be set for this individual Elasticsearch output plugin, doing so will not fix schema conflicts caused by upstream plugins in your pipeline. To avoid mapping conflicts, you will need to use ECS-compatible field names and datatypes throughout your pipeline. Many plugins support an `ecs_compatibility` mode, and the `pipeline.ecs_compatibility` setting can be used to opt-in for all plugins in a pipeline.
[INFO ] 2023-06-01 07:36:44.206 [[http_input]-pipeline-manager] elasticsearch - Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ] 2023-06-01 07:36:44.257 [Ruby-0-Thread-11: /usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.12.4-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:161] elasticsearch - Using mapping template from {:path=>"/etc/logstash/tpot-template.json"}
[INFO ] 2023-06-01 07:36:44.298 [[http_input]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"http_input", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/http_input.conf"], :thread=>"#<Thread:0x23b1011f@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[INFO ] 2023-06-01 07:36:44.301 [Ruby-0-Thread-11: /usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.12.4-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:161] elasticsearch - Installing Elasticsearch template {:name=>"logstash"}
[INFO ] 2023-06-01 07:36:46.742 [[http_input]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>2.44}
[INFO ] 2023-06-01 07:36:47.444 [[http_input]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"http_input"}
[INFO ] 2023-06-01 07:36:47.448 [[http_input]<http] http - Starting http input listener {:address=>"0.0.0.0:64305", :ssl=>"false"}
[INFO ] 2023-06-01 07:36:55.370 [Converge PipelineAction::Create<logstash>] javapipeline - Pipeline `logstash` is configured with `pipeline.ecs_compatibility: disabled` setting. All plugins in this pipeline will default to `ecs_compatibility => disabled` unless explicitly configured otherwise.
[INFO ] 2023-06-01 07:36:55.378 [[logstash]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
[INFO ] 2023-06-01 07:36:55.386 [[logstash]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[WARN ] 2023-06-01 07:36:55.413 [[logstash]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[INFO ] 2023-06-01 07:36:55.422 [[logstash]-pipeline-manager] elasticsearch - Elasticsearch version determined (8.6.2) {:es_version=>8}
[WARN ] 2023-06-01 07:36:55.422 [[logstash]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[INFO ] 2023-06-01 07:36:55.438 [[logstash]-pipeline-manager] elasticsearch - Not eligible for data streams because ecs_compatibility is not enabled. Elasticsearch data streams require that events adhere to the Elastic Common Schema. While `ecs_compatibility` can be set for this individual Elasticsearch output plugin, doing so will not fix schema conflicts caused by upstream plugins in your pipeline. To avoid mapping conflicts, you will need to use ECS-compatible field names and datatypes throughout your pipeline. Many plugins support an `ecs_compatibility` mode, and the `pipeline.ecs_compatibility` setting can be used to opt-in for all plugins in a pipeline.
[INFO ] 2023-06-01 07:36:55.438 [[logstash]-pipeline-manager] elasticsearch - Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ] 2023-06-01 07:36:55.478 [Ruby-0-Thread-28: /usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.12.4-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:161] elasticsearch - Using mapping template from {:path=>"/etc/logstash/tpot-template.json"}
[INFO ] 2023-06-01 07:36:55.492 [Ruby-0-Thread-28: /usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.12.4-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:161] elasticsearch - Installing Elasticsearch template {:name=>"logstash"}
[INFO ] 2023-06-01 07:37:03.200 [[logstash]-pipeline-manager] downloadmanager - new database version detected? true
[INFO ] 2023-06-01 07:37:11.029 [[logstash]-pipeline-manager] databasemanager - By not manually configuring a database path with `database =>`, you accepted and agreed MaxMind EULA. For more details please visit https://www.maxmind.com/en/geolite2/eula
[INFO ] 2023-06-01 07:37:11.030 [[logstash]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/data/plugins/filters/geoip/1685605021/GeoLite2-City.mmdb"}
[INFO ] 2023-06-01 07:37:17.868 [[logstash]-pipeline-manager] databasemanager - By not manually configuring a database path with `database =>`, you accepted and agreed MaxMind EULA. For more details please visit https://www.maxmind.com/en/geolite2/eula
[INFO ] 2023-06-01 07:37:17.869 [[logstash]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/data/plugins/filters/geoip/1685605021/GeoLite2-City.mmdb"}
[INFO ] 2023-06-01 07:37:18.154 [[logstash]-pipeline-manager] databasemanager - By not manually configuring a database path with `database =>`, you accepted and agreed MaxMind EULA. For more details please visit https://www.maxmind.com/en/geolite2/eula
[INFO ] 2023-06-01 07:37:18.154 [[logstash]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/data/plugins/filters/geoip/1685605021/GeoLite2-City.mmdb"}
[INFO ] 2023-06-01 07:37:18.339 [[logstash]-pipeline-manager] databasemanager - By not manually configuring a database path with `database =>`, you accepted and agreed MaxMind EULA. For more details please visit https://www.maxmind.com/en/geolite2/eula
[INFO ] 2023-06-01 07:37:18.339 [[logstash]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/data/plugins/filters/geoip/1685605021/GeoLite2-City.mmdb"}
[INFO ] 2023-06-01 07:37:18.431 [[logstash]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"logstash", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/logstash.conf"], :thread=>"#<Thread:0x56ca1180@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[INFO ] 2023-06-01 07:37:21.939 [[logstash]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>3.51}
[INFO ] 2023-06-01 07:37:21.960 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_d7f6db101f39dd3bc7af0869e32d632e", :path=>["/data/suricata/log/eve.json"]}
[INFO ] 2023-06-01 07:37:21.964 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_1026fa2fde81810a4e5056140cc7568d", :path=>["/data/p0f/log/p0f.json"]}
[INFO ] 2023-06-01 07:37:21.966 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_d33b774f24f61be8d919a2012cbe9abd", :path=>["/data/log4pot/log/log4pot.log"]}
[INFO ] 2023-06-01 07:37:21.969 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_557975a3dcea1b8623e0586baddbccb1", :path=>["/data/conpot/log/*.json"]}
[INFO ] 2023-06-01 07:37:21.971 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_995e5329a4b9fe42dd94767ef9823d8f", :path=>["/data/redishoneypot/log/redishoneypot.log"]}
[INFO ] 2023-06-01 07:37:21.976 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_986ec081f66fb23ed69e7fa55ee8f61f", :path=>["/data/elasticpot/log/elasticpot.json"]}
[INFO ] 2023-06-01 07:37:21.979 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_bc3783e780f8e9480857e9d04717ef7f", :path=>["/data/honeypots/log/*.log"]}
[INFO ] 2023-06-01 07:37:21.986 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_9b0b8f934f6f0765fe575fa0d2057b19", :path=>["/data/ddospot/log/*.log"]}
[INFO ] 2023-06-01 07:37:22.003 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_f479bb3e1164e1f7606aba5fc0416289", :path=>["/data/mailoney/log/commands.log"]}
[INFO ] 2023-06-01 07:37:22.009 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_68c271f21bee8a287cb907502c9137e6", :path=>["/data/ipphoney/log/ipphoney.json"]}
[INFO ] 2023-06-01 07:37:22.012 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_9cabca542263e0829385b1a1f64c2cdc", :path=>["/data/tanner/log/tanner_report.json"]}
[INFO ] 2023-06-01 07:37:22.022 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_d53ab0da823a69e98a460497444fada0", :path=>["/data/adbhoney/log/adbhoney.json"]}
[INFO ] 2023-06-01 07:37:22.034 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_74242023f3037eec395f8e8f6740c1c1", :path=>["/data/medpot/log/medpot.log"]}
[INFO ] 2023-06-01 07:37:22.036 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_8fd3005346b9f643cbbcaa7aac8943c5", :path=>["/data/dicompot/log/dicompot.log"]}
[INFO ] 2023-06-01 07:37:22.039 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_6b32952db7d4ce2293515b24646e9124", :path=>["/data/nginx/log/access.log"]}
[INFO ] 2023-06-01 07:37:22.041 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_7d325a88d98de788a92d4e863363ac5b", :path=>["/data/endlessh/log/endlessh.log"]}
[INFO ] 2023-06-01 07:37:22.044 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_86714a3f24339ae572d6101984ae14df", :path=>["/data/cowrie/log/cowrie.json"]}
[INFO ] 2023-06-01 07:37:22.046 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_9c9181627777bac1d8dedabee7e99a19", :path=>["/data/honeytrap/log/attackers.json"]}
[INFO ] 2023-06-01 07:37:22.054 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_0c047838183ad567bc5704e2e098d4e3", :path=>["/data/ciscoasa/log/ciscoasa.log"]}
[INFO ] 2023-06-01 07:37:22.058 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_cc5e603411a2cb87cf3f58493e010676", :path=>["/data/fatt/log/fatt.log"]}
[INFO ] 2023-06-01 07:37:22.061 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_aed504a38d7c2fc586f841c2690983f5", :path=>["/data/glutton/log/glutton.log"]}
[INFO ] 2023-06-01 07:37:22.068 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_cb0a7058a1c046ef5aa0e7a86c033ab1", :path=>["/data/citrixhoneypot/logs/server.log"]}
[INFO ] 2023-06-01 07:37:22.073 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_0be0aff8c4d7ed0edd704305e069a3f1", :path=>["/data/hellpot/log/hellpot.log"]}
[INFO ] 2023-06-01 07:37:22.077 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_ffebddcd4a749032e3c9bdabe6f463a7", :path=>["/data/sentrypeer/log/sentrypeer.json"]}
[INFO ] 2023-06-01 07:37:22.080 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_42bde8dd174eb2012ccaedcd320d8b1f", :path=>["/data/dionaea/log/dionaea.json"]}
[INFO ] 2023-06-01 07:37:22.088 [[logstash]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_eb16d0b601e0d213c6e754929e511031", :path=>["/data/heralding/log/auth.csv"]}
[INFO ] 2023-06-01 07:37:22.095 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.109 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.111 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.126 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.129 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.125 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.132 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.135 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.145 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.141 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.160 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.170 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.176 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.178 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.191 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.202 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.205 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.211 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.219 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.239 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.255 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.261 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.273 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.271 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.281 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.288 [[logstash]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"logstash"}
[INFO ] 2023-06-01 07:37:22.290 [[logstash]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2023-06-01 07:37:22.311 [Agent thread] agent - Pipelines running {:count=>2, :running_pipelines=>[:http_input, :logstash], :non_running_pipelines=>[]}
[ERROR] 2023-06-01 07:39:15.681 [[logstash]<file] json - JSON parse error, original data now in message field {:message=>"Unrecognized token 'Flushing': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (String)\"Flushing information for 1 IP(s) to database...\"; line: 1, column: 9]", :exception=>LogStash::Json::ParserError, :data=>"Flushing information for 1 IP(s) to database..."}
t3chn0m4g3 commented 1 year ago

@vordenken This is unrelated. Startup is fine; the error is related to a non-parseable field.
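
To illustrate: the tailed log mixes JSON lines with plain status lines, the json parser rejects the latter, and Logstash then keeps the raw text in the message field, exactly as the [ERROR] above shows. A small reproduction (the JSON sample line is invented for illustration):

```python
import json

lines = [
    '{"event": "connection", "src_ip": "203.0.113.7"}',  # invented sample; parses fine
    "Flushing information for 1 IP(s) to database...",   # the line from the error above
]

for line in lines:
    try:
        print("parsed:", json.loads(line))
    except json.JSONDecodeError as err:
        print(f"parse error, raw line kept in message field: {err}")
```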

vordenken commented 1 year ago

Oh okay... Any idea how to fix this? It started a few days ago and T-Pot is unusable now. CPU runs at 100% all the time, it sometimes runs out of memory, and restarting sometimes helps, but only for a few minutes at most.

t3chn0m4g3 commented 1 year ago

If it is really running out of memory, then you should adjust tpot.yml and give it more RAM. Please review the issues and discussions; this has been discussed quite often.

vordenken commented 1 year ago

I have already searched the issues but didn't find anything but this thread. It seems not to be related, but it is an issue nonetheless. I gave the server more RAM (16 to 24 GB), but it's already near OOM again:

[Screenshot: 2023-06-01 at 11:45:02]
t3chn0m4g3 commented 1 year ago

Check out the screenshot. Ddospot seems to take up lots of system resources and as such is probably writing tons of logs. Stop ddospot or remove it from tpot.yml to check if things improve. But it will take some time for logstash to ingest all the existing logs.
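
For a quick test without editing tpot.yml, something like the following could stop the container (a sketch assuming the Docker SDK for Python is installed and the container is actually named "ddospot"; both are assumptions):

```python
# Hedged sketch: stop the ddospot container to see whether CPU/RAM
# pressure drops. Assumes `pip install docker` and a container named
# "ddospot" -- neither name is project-verified.
import docker

client = docker.from_env()
client.containers.get("ddospot").stop()
print("ddospot stopped; watch system load to see if things improve")
```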