Azure / Azure-Sentinel

Cloud-native SIEM for intelligent security analytics for your entire enterprise.
https://azure.microsoft.com/en-us/services/azure-sentinel/
MIT License
4.53k stars 2.97k forks

microsoft-sentinel-log-analytics-logstash-output-plugin isn't generating sample_file on logstash 8.11.1 #9559

Closed raudiez closed 9 months ago

raudiez commented 10 months ago

**Describe the bug**
The sample file isn't generated by the microsoft-sentinel-log-analytics-logstash-output-plugin output plugin on dockerized Logstash version 8.11.1.

**To Reproduce**
Steps to reproduce the behavior:

  1. Install Logstash container from docker.elastic.co/logstash/logstash:8.11.1-amd64.
  2. I have a multi-pipeline configuration, but to simplify things you only need the following pipeline configuration:

      input {
        syslog {
          port => 10001
        }
      }

      filter {
        kv {
          allow_duplicate_values => false
        }

        mutate {
          replace => { "tmp_date" => "%{date} %{time}" }
          replace => { "category" => "%{type}_%{subtype}" }
          rename => { "devname" => "dev_name" }
          rename => { "msg" => "event_name" }
          rename => { "srcip" => "src_ip" }
          rename => { "srcport" => "src_port" }
          rename => { "dstip" => "dst_ip" }
          rename => { "dstport" => "dst_port" }
          rename => { "service" => "protocol" }
          rename => { "http_host" => "hostname" }
          rename => { "http_agent" => "user_agent" }
          rename => { "user" => "src_user" }
          rename => { "severity_level" => "severity" }
          rename => { "level" => "severity" }
          remove_field => ["message","event","host","log"]
        }

        date {
          match => ["tmp_date", "UNIX", "yyyy-MM-dd HH:mm:ss"]
          target => "@timestamp"
          remove_field => ["tmp_date","time","date"]
        }
      }

      output {
        microsoft-sentinel-log-analytics-logstash-output-plugin {
          create_sample_file => true
          sample_file_path => "/tmp/logstash-samples"
        }

        # file output just to check that the pipeline is working and receiving events
        file {
          path => "/tmp/logstash_fortigate.log"
        }
      }

3. Also, `pipeline.ecs_compatibility: disabled` is set in the general Logstash configuration.
4. Start Logstash container.
5. Send a log from the host machine with logger (e.g. `logger -n 127.0.0.1 -d -P 10001 -f fortinet.log`).
6. Then you can see that `/tmp/logstash-samples/sample<timestamp>.json` isn't generated, while `/tmp/logstash_fortigate.log` does contain the log.

**Expected behavior**
The plugin should load without errors and write the sample log to `sample_file_path`.

**Screenshots**

    logstash@logstash:~$ ls -l /tmp/logstash-samples/
    total 0
    logstash@logstash:~$ cat /tmp/logstash_fortigate.log
    {"protocol":"HTTPS","policyid":"0","dstintf":"vdom1","subtype":"local","type":"traffic","category":"traffic_local","logid":"0001000014","sentbyte":"1247","src_port":"62024","duration":"5","app":"Web Management(HTTPS)","sessionid":"107478","@version":"1","priority":13,"logsource":"vm-SOC-collector-dev-westeu-001","src_ip":"172.16.200.254","trandisp":"noop","srcintf":"port11","rcvdbyte":"1719","dstcountry":"Reserved","timestamp":"Dec 7 14:45:37","action":"server-rst","policytype":"local-in-policy","facility":1,"severity_label":"Notice","facility_label":"user-level","srccountry":"Reserved","appcat":"unscanned","severity":"notice","dst_ip":"172.16.200.2","rcvdpkt":"6","@timestamp":"2019-05-10T11:50:48.000Z","vd":"vdom1","eventtime":"1557514248379911176","srcintfrole":"undefined","dstintfrole":"undefined","dst_port":"443","proto":"6","sentpkt":"5"}


You can also find the container logs below:

    Using bundled JDK: /usr/share/logstash/jdk
    Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
    [2023-12-07T14:45:38,940][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
    [2023-12-07T14:45:38,946][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.11.1", "jruby.version"=>"jruby 9.4.2.0 (3.1.0) 2023-03-08 90d2913fda OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-linux]"}
    [2023-12-07T14:45:38,949][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
    [2023-12-07T14:45:38,960][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
    [2023-12-07T14:45:38,961][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
    [2023-12-07T14:45:39,163][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"3f2def61-4782-403b-9cad-f904eec255f3", :path=>"/usr/share/logstash/data/uuid"}
    [2023-12-07T14:45:39,849][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
    [2023-12-07T14:45:40,463][INFO ][org.reflections.Reflections] Reflections took 362 ms to scan 1 urls, producing 132 keys and 464 values
    [2023-12-07T14:45:40,908][INFO ][logstash.javapipeline ] Pipeline pipeline_default is configured with pipeline.ecs_compatibility: disabled setting. All plugins in this pipeline will default to ecs_compatibility => disabled unless explicitly configured otherwise.
    /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/microsoft-sentinel-log-analytics-logstash-output-plugin-1.1.0/lib/logstash/sentinel_la/logsSender.rb:4: warning: parentheses after method name is interpreted as an argument list, not a decomposed argument
    /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/microsoft-sentinel-log-analytics-logstash-output-plugin-1.1.0/lib/logstash/sentinel_la/logStashEventsBatcher.rb:3: warning: parentheses after method name is interpreted as an argument list, not a decomposed argument
    [2023-12-07T14:45:40,967][INFO ][logstash.javapipeline ][pipeline_default] Starting pipeline {:pipeline_id=>"pipeline_default", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/default.conf"], :thread=>"#<Thread:0x62b7784f /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
    [2023-12-07T14:45:41,621][INFO ][logstash.javapipeline ][pipeline_default] Pipeline Java execution initialization time {"seconds"=>0.65}
    [2023-12-07T14:45:41,651][INFO ][logstash.javapipeline ][pipeline_default] Pipeline started {"pipeline.id"=>"pipeline_default"}
    /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/microsoft-sentinel-log-analytics-logstash-output-plugin-1.1.0/lib/logstash/sentinel_la/logAnalyticsClient.rb:14: warning: parentheses after method name is interpreted as an argument list, not a decomposed argument
    /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/microsoft-sentinel-log-analytics-logstash-output-plugin-1.1.0/lib/logstash/sentinel_la/logAnalyticsClient.rb:14: warning: parentheses after method name is interpreted as an argument list, not a decomposed argument
    /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/microsoft-sentinel-log-analytics-logstash-output-plugin-1.1.0/lib/logstash/sentinel_la/logsSender.rb:5: warning: parentheses after method name is interpreted as an argument list, not a decomposed argument
    [2023-12-07T14:45:42,824][INFO ][logstash.javapipeline ] Pipeline pipeline_fortigate is configured with pipeline.ecs_compatibility: disabled setting. All plugins in this pipeline will default to ecs_compatibility => disabled unless explicitly configured otherwise.
    [2023-12-07T14:45:42,855][INFO ][logstash.javapipeline ] Pipeline input_ports is configured with pipeline.ecs_compatibility: disabled setting. All plugins in this pipeline will default to ecs_compatibility => disabled unless explicitly configured otherwise.
    [2023-12-07T14:45:42,864][INFO ][logstash.outputs.microsoftsentineloutput][pipeline_fortigate] Azure Loganalytics configuration was found valid.
    [2023-12-07T14:45:42,872][INFO ][logstash.javapipeline ][pipeline_fortigate] Starting pipeline {:pipeline_id=>"pipeline_fortigate", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/fortigate.conf"], :thread=>"#<Thread:0xae97613 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
    [2023-12-07T14:45:42,875][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][input_ports] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
    [2023-12-07T14:45:42,877][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][input_ports] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
    [2023-12-07T14:45:42,903][INFO ][logstash.javapipeline ][input_ports] Starting pipeline {:pipeline_id=>"input_ports", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/config/input.conf"], :thread=>"#<Thread:0x40b81bad /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
    [2023-12-07T14:45:43,169][INFO ][logstash.javapipeline ][pipeline_fortigate] Pipeline Java execution initialization time {"seconds"=>0.29}
    [2023-12-07T14:45:43,191][INFO ][logstash.javapipeline ][pipeline_fortigate] Pipeline started {"pipeline.id"=>"pipeline_fortigate"}
    [2023-12-07T14:45:43,481][INFO ][logstash.javapipeline ][input_ports] Pipeline Java execution initialization time {"seconds"=>0.57}
    [2023-12-07T14:45:43,700][INFO ][logstash.javapipeline ][input_ports] Pipeline started {"pipeline.id"=>"input_ports"}
    [2023-12-07T14:45:43,715][INFO ][logstash.agent ] Pipelines running {:count=>3, :running_pipelines=>[:pipeline_default, :pipeline_fortigate, :input_ports], :non_running_pipelines=>[]}
    [2023-12-07T14:45:43,726][INFO ][logstash.inputs.syslog ][input_ports][8db5c352f0a99be5b381bba1c6a2887411c0fa134e57f1ffc588bf3d72d84d80] Starting syslog udp listener {:address=>"0.0.0.0:10000"}
    [2023-12-07T14:45:43,738][INFO ][logstash.inputs.syslog ][input_ports][33870e6f483a4f49bc86569b68000de88110e62997d33faf2985e973f95e76c1] Starting syslog udp listener {:address=>"0.0.0.0:10001"}
    [2023-12-07T14:45:43,748][INFO ][logstash.inputs.syslog ][input_ports][8db5c352f0a99be5b381bba1c6a2887411c0fa134e57f1ffc588bf3d72d84d80] Starting syslog tcp listener {:address=>"0.0.0.0:10000"}
    [2023-12-07T14:45:43,759][INFO ][logstash.inputs.syslog ][input_ports][33870e6f483a4f49bc86569b68000de88110e62997d33faf2985e973f95e76c1] Starting syslog tcp listener {:address=>"0.0.0.0:10001"}
    [2023-12-07T14:46:07,231][INFO ][logstash.inputs.syslog ][input_ports][33870e6f483a4f49bc86569b68000de88110e62997d33faf2985e973f95e76c1] new connection {:client=>"172.31.0.3:51950"}
    [2023-12-07T14:46:07,533][INFO ][logstash.outputs.file ][pipeline_fortigate][81d2c34aeaf6419f512be8b58e3b1414170c86af36674aa6133a7bc79c70999b] Opening file {:path=>"/tmp/logstash_fortigate.log"}
    [2023-12-07T14:46:28,218][INFO ][logstash.outputs.file ][pipeline_fortigate][81d2c34aeaf6419f512be8b58e3b1414170c86af36674aa6133a7bc79c70999b] Closing file /tmp/logstash_fortigate.log



**Desktop (please complete the following information):**
 - Logstash version: 8.11.1
 - Docker base image: `docker.elastic.co/logstash/logstash:8.11.1-amd64`

 **Additional context**
I've also tried keeping all fields (not removing any in the pipeline's filter step), but I don't think that should affect sample file generation.

github-actions[bot] commented 10 months ago

Thank you for submitting an Issue to the Azure Sentinel GitHub repo! You should expect an initial response to your Issue from the team within 5 business days. Note that this response may be delayed during holiday periods. For urgent, production-affecting issues please raise a support ticket via the Azure Portal.

v-sudkharat commented 10 months ago

Hi @raudiez, thanks for flagging this issue. We are reaching out to the concerned team, and we will update you once we receive a response. Thanks!

v-sudkharat commented 10 months ago

Hi @raudiez, we received the information from the concerned team. The sample file is created once at least 10 events have been passed to the plugin. If fewer than 10 events were passed, the file is created on graceful shutdown of the Logstash service\process. So, could you please check that you are sending at least 10 events, or shut down the Logstash process gracefully?
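For example, one quick way to send more than 10 events from the host, assuming the syslog input on port 10001 from the pipeline above (this loop around `logger` is just an illustration, not the only way):

```shell
# Send 12 events so the plugin's 10-event sample buffer is flushed.
# Assumes Logstash is listening on 127.0.0.1:10001 as in the pipeline above.
for i in $(seq 1 12); do
  logger -n 127.0.0.1 -P 10001 "sample event $i"
done
```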

Thanks!

raudiez commented 10 months ago

Hi @v-sudkharat, I can confirm that the sample file is generated once at least 10 events are sent. Thanks for the information. Could you please document this requirement in your procedure, so that people don't get stuck at the same point I did? Thanks!
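For future readers, the behavior explained above can be sketched roughly like this (a hypothetical Python simplification for illustration only, not the plugin's actual code; only the 10-event threshold and the flush on graceful shutdown come from the thread):

```python
import json
import time


class SampleFileWriter:
    """Illustrative sketch: buffer events and write the sample file once
    at least 10 events have arrived, or on graceful shutdown.
    (Hypothetical code; not the plugin's actual implementation.)"""

    FLUSH_THRESHOLD = 10  # documented minimum number of events

    def __init__(self, sample_file_path):
        self.sample_file_path = sample_file_path
        self.buffer = []
        self.written = False

    def receive(self, event):
        # buffer each event; flush once the threshold is reached
        self.buffer.append(event)
        if len(self.buffer) >= self.FLUSH_THRESHOLD:
            self.flush()

    def flush(self):
        # write the buffered events once, as a JSON sample file
        if self.buffer and not self.written:
            path = f"{self.sample_file_path}/sample{int(time.time())}.json"
            with open(path, "w") as f:
                json.dump(self.buffer, f)
            self.written = True

    def close(self):
        # a graceful shutdown also flushes, even below the threshold
        self.flush()
```

This makes it clear why sending a single event via `logger` never produced a file: the buffer never reached 10 events, and the container was not shut down gracefully.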

v-sudkharat commented 9 months ago

Hi @raudiez, thanks for your confirmation. We have shared your feedback with the concerned team. Could you please let us know if your issue has been resolved, and whether we can close this issue on GitHub? Thanks!

v-sudkharat commented 9 months ago

Hi @raudiez, we have received a response from the concerned team, and they will update the plugin README for more clarity. We are waiting for your confirmation to close this GitHub issue.
Thanks!

raudiez commented 9 months ago

Hi @v-sudkharat, yes, you can close it.

Thanks!

v-sudkharat commented 9 months ago

@raudiez, thank you for your confirmation.