Yamato-Security / hayabusa

Hayabusa (隼) is a sigma-based threat hunting and fast forensics timeline generator for Windows event logs.
GNU Affero General Public License v3.0

Sending logs remotely via filebeats #951

Open YamatoSecurity opened 1 year ago

YamatoSecurity commented 1 year ago

At first I was thinking of creating a new alert-elastic command to natively send logs to a central Elastic Stack SIEM, but the elastic-rs Rust client does not seem to be maintained and is still in alpha, so we should probably rely on Filebeat to send the logs instead. Filebeat will give us better compatibility and stability with Logstash and allow for throttling, etc.

For the first phase, we can just focus on sending the logs one time, but after that it would be nice to be able to do periodic threat hunting scans and only send the alerts that differ.

Here are the notes I wrote up about how we might implement this in the future:

If the alerts are sent successfully, the following information is written to the sent-alerts.csv file (if the file already exists, it is updated):

- Timestamp (original ISO-8601 timestamp)
- Rule ID
- Rule Title

Before Hayabusa sends the alerts, it checks the Timestamp and Rule ID in this file. If the timestamp and rule ID match, we assume the alert was previously sent and do not send it again. The Rule Title is not needed for the check since the title may change over time, but I want to save it to help out when debugging.
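
A minimal Rust sketch of this duplicate check, assuming the plain three-column CSV layout described above (the function names and line parsing are my own illustration, not merged code):

```rust
use std::collections::HashSet;
use std::fs::{File, OpenOptions};
use std::io::{BufRead, BufReader, Write};
use std::path::Path;

/// Load the (Timestamp, Rule ID) pairs that were already sent.
/// Assumed layout per line: "<ISO-8601 timestamp>,<rule ID>,<rule title>".
fn load_sent_alerts(path: &Path) -> std::io::Result<HashSet<(String, String)>> {
    let mut sent = HashSet::new();
    if path.exists() {
        for line in BufReader::new(File::open(path)?).lines() {
            let line = line?;
            let mut cols = line.splitn(3, ',');
            if let (Some(ts), Some(id)) = (cols.next(), cols.next()) {
                sent.insert((ts.to_string(), id.to_string()));
            }
        }
    }
    Ok(sent)
}

/// An alert is sent only if its (Timestamp, Rule ID) pair is not in the file yet.
fn should_send(sent: &HashSet<(String, String)>, ts: &str, rule_id: &str) -> bool {
    !sent.contains(&(ts.to_string(), rule_id.to_string()))
}

/// After a successful send, append the alert so future scans skip it.
fn record_sent_alert(path: &Path, ts: &str, rule_id: &str, rule_title: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    writeln!(file, "{ts},{rule_id},{rule_title}")
}
```

A periodic scan would call load_sent_alerts() once, filter each alert through should_send(), and call record_sent_alert() only after the alert has actually been delivered.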

hitenkoku commented 1 year ago

@YamatoSecurity Thank you for raising the issue. I don't see any problem with the basic approach, but I am concerned that the CSV file will keep getting bigger every time this runs.

For example, how about writing only the latest timestamp to the file and treating anything before that time as already sent?

YamatoSecurity commented 1 year ago

@hitenkoku Thank you for your comment! Indeed, the CSV file will get bigger and bigger over time, but it should not double the size of the logs. What about keeping the file compressed? That would reduce the file by 10x or more. Maybe we can use something like rust-brotli or brotlic?

The problem with just writing the latest timestamp and only scanning recent logs is that Hayabusa would no longer be able to find past incidents (perform threat hunting), which is the main goal. It would just become an inferior host IDS, which I couldn't really recommend using.
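
On the compression idea, here is a rough sketch using the rust-brotli crate (published on crates.io as brotli); the file names, quality 11, and the 22-bit window are just assumptions for illustration, not merged code:

```rust
use std::fs::File;
use std::io;

/// Compress sent-alerts.csv into sent-alerts.csv.br with the brotli crate.
/// Quality 11 with a 22-bit window favors maximum compression over speed,
/// which is fine for a small bookkeeping file.
fn compress_sent_alerts(src: &str, dst: &str) -> io::Result<u64> {
    let mut input = File::open(src)?;
    let output = File::create(dst)?;
    // (writer, internal buffer size, quality 0-11, lg_window_size)
    let mut writer = brotli::CompressorWriter::new(output, 4096, 11, 22);
    io::copy(&mut input, &mut writer)
    // remaining compressed data is flushed when `writer` is dropped here
}

/// Decompress it back before checking which alerts were already sent.
fn decompress_sent_alerts(src: &str, dst: &str) -> io::Result<u64> {
    let input = File::open(src)?;
    let mut output = File::create(dst)?;
    let mut reader = brotli::Decompressor::new(input, 4096);
    io::copy(&mut reader, &mut output)
}
```

Keeping only the .br file on disk and decompressing it at scan time would keep the on-disk footprint small.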

YamatoSecurity commented 1 year ago

How about we also leave the Rule Title out of the CSV file, which will save space? I thought it might be useful for debugging, but now that I think about it, we probably do not need it. If it is just timestamps and rule IDs, the file should not get too big, and if we compress it, it should become pretty small.

hitenkoku commented 1 year ago

Thanks for the comment.

I think Timestamp and Rule ID are sufficient for the contents of the CSV file.

I will try to implement it.

hitenkoku commented 1 year ago

@YamatoSecurity Would it be OK to sort the following options in alphabetical order by their long option names?

Elastic Settings:
      --strict               strict mode: do not only warn, but abort if an error occurs
  -i, --index <INDEX_NAME>   name of the elasticsearch index
  -H, --host <HOST>          server name or IP address of elasticsearch server
  -P, --port <PORT>          API port number of elasticsearch server [default: 9200]
      --proto <PROTOCOL>     protocol to be used to connect to elasticsearch [default: https] [possible values: http, https]
  -k, --insecure             omit certificate validation
  -U, --username <USERNAME>  username for elasticsearch server [default: elastic]
  -W, --password <PASSWORD>  password for authenticating at elasticsearch

YamatoSecurity commented 1 year ago

@hitenkoku Yes, let's organize this in alphabetical order of the long options:

Elastic Settings:
  -H, --host <HOST>          server name or IP address of elasticsearch server
  -i, --index <INDEX_NAME>   name of the elasticsearch index
  -k, --insecure             omit certificate validation
  -W, --password <PASSWORD>  password for authenticating at elasticsearch 
  -P, --port <PORT>          API port number of elasticsearch server [default: 9200]
      --proto <PROTOCOL>     protocol to be used to connect to elasticsearch [default: https] [possible values: http, https]
      --strict               strict mode: do not only warn, but abort if an error occurs
  -U, --username <USERNAME>  username for elasticsearch server [default: elastic]
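
For illustration, these options could be declared roughly like this with clap's derive API; the struct name and exact attributes below are a hypothetical sketch (option names, defaults, and help strings are copied from the listing above), not merged code:

```rust
use clap::Parser;

/// Hypothetical sketch of the proposed "Elastic Settings" group.
#[derive(Parser, Debug)]
#[command(next_help_heading = "Elastic Settings")]
struct ElasticSettings {
    /// server name or IP address of elasticsearch server
    #[arg(short = 'H', long = "host")]
    host: String,

    /// name of the elasticsearch index
    #[arg(short = 'i', long = "index", value_name = "INDEX_NAME")]
    index: String,

    /// omit certificate validation
    #[arg(short = 'k', long = "insecure")]
    insecure: bool,

    /// password for authenticating at elasticsearch
    #[arg(short = 'W', long = "password")]
    password: Option<String>,

    /// API port number of elasticsearch server
    #[arg(short = 'P', long = "port", default_value_t = 9200)]
    port: u16,

    /// protocol to be used to connect to elasticsearch
    #[arg(long = "proto", default_value = "https", value_parser = ["http", "https"])]
    proto: String,

    /// strict mode: do not only warn, but abort if an error occurs
    #[arg(long = "strict")]
    strict: bool,

    /// username for elasticsearch server
    #[arg(short = 'U', long = "username", default_value = "elastic")]
    username: String,
}
```

Since clap prints arguments in declaration order by default, declaring the fields alphabetically by long option also produces the sorted help output shown above.
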
YamatoSecurity commented 1 year ago

This is going to take a while to test, so I changed the milestone to 2.4.0. Maybe we can release next month, in April?

fukusuket commented 2 months ago

Filebeats

fukusuket commented 2 months ago

Logstash

fukusuket commented 1 month ago

Install Elasticsearch (Ubuntu 22.04 LTS)

  1. https://www.elastic.co/guide/en/elasticsearch/reference/current/targz.html#install-linux
  2. https://www.elastic.co/guide/en/elasticsearch/reference/current/targz.html#targz-running
  3. https://qiita.com/nobuhikosekiya/items/7441186795b3da998e2f