telekom-security / tpotce

🍯 T-Pot - The All In One Multi Honeypot Platform 🐝
GNU General Public License v3.0

Log export for SIEM integration #79

Closed SunstriderX closed 6 years ago

SunstriderX commented 7 years ago

Hi t3chn0m4g3,

First of all, a very big congratulations on such an awesome work with the T-Pot. I'm looking into it as a solution to be integrated with a SIEM in the future, but as of now, I just want to be able to export all the logs it generates outside the ELK stack.

I've read issues 41, 50 and 55, which are all related to this, but I haven't been able to get a solution working for my scenario. I'm using version 16.10 and can't find anything that helps me send the logs to a third-party server/listener (hpfeeds, syslog, etc.). Is this possible with the current setup? Maybe even as a feature in the future?

Edit: I believe what I'm looking for is the syslog output plugin for Logstash, but even after editing the logstash.conf file, logs do not appear to be sent (even after rebooting), so maybe the plugin is not installed? The Logstash documentation says it does not come installed by default.

Thanks a lot for your time!

Sunstrider.

t3chn0m4g3 commented 7 years ago

@SunstriderX Thank you :bowtie: You can always map your own configuration file into the Docker container. The issues you mentioned will give you a first hint, and you will find some examples of how to achieve this in the Docker documentation on Docker volumes.
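For anyone unfamiliar with volume mapping, a minimal sketch of the idea might look like the following. The container name, image name, and paths here are assumptions based on this thread, not guaranteed for every T-Pot version:

```shell
# Hypothetical sketch: start the ELK container with a customized
# logstash.conf kept on the host mapped over the copy inside the
# container. "elk" and "<elk-image>" are placeholders.
docker run -d --name elk \
  -v /data/elk/logstash/conf/logstash.conf:/etc/logstash/conf.d/logstash.conf \
  <elk-image>
```

With the file mapped in, edits made on the host survive container restarts, since Docker overlays the host file on top of the image's own copy.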

SunstriderX commented 7 years ago

@t3chn0m4g3 Thanks for the reply! To be honest, I'm not experienced with Docker, but does this mean I should add a container to the volume to manage shipping the logs outside the local ELK? I will dive into the documentation if this is a possible way of doing it. However, I was expecting to be able to tell Logstash to forward the logs to a syslog server I have listening on the same network. I've tried many different configurations in the output section of the logstash.conf file without success, though. Correct me if I'm getting any concept wrong here; any hints will be appreciated too :)

Again, thanks for your time!

mattmac1-zz commented 7 years ago

I just used rsyslog to read the logs in /data/ and send them that way.

mattmac1-zz commented 7 years ago

Or filebeat

t3chn0m4g3 commented 7 years ago

@SunstriderX You are on the right track. The easiest way is to modify the logstash.conf, put it in /data/elk/logstash/conf/ on the host, and restart the container with systemctl stop elk && systemctl start elk. If you want to know whether the changes were applied, just compare with ...

docker exec -it elk bash
cat /etc/logstash/conf.d/logstash.conf

Please dig through the logstash output plugin documentation for more information.

@mattmac1: Great choice!

Or filebeat

mattmac1-zz commented 7 years ago

Remember: because the Docker containers get auto-updated, any changes you make will get overwritten, which is why I did it separately.


t3chn0m4g3 commented 7 years ago

@mattmac1 Yes and no, the mentioned logstash.conf resides outside the container and will be reloaded into container every time it restarts.

mattmac1-zz commented 7 years ago

Ah, I didn't notice that. You may want to separate your production ELK instance anyway, so a direct feed from the honeypot is a bad idea. Feed it to somewhere in a DMZ or the like, then from there via Kafka etc. to your ELK instance.


t3chn0m4g3 commented 7 years ago

I'd recommend feeding it to a hardened logstash receiver (dmz) with enabled authentication. From there to the desired destination.
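For illustration, a receiver-side sketch of what such a hardened Logstash input could look like. The port, certificate paths, and option names are assumptions (TLS option names have changed between versions of the logstash-input-tcp plugin), so treat this as a starting point rather than a working configuration:

```conf
# Sketch of a DMZ-side Logstash receiver that only accepts
# TLS connections from clients presenting a trusted certificate.
# Paths and port are illustrative placeholders.
input {
  tcp {
    port       => 5044
    ssl_enable => true
    ssl_cert   => "/etc/logstash/certs/receiver.crt"
    ssl_key    => "/etc/logstash/certs/receiver.key"
    ssl_verify => true
  }
}
```

From there, an output section on the receiver can forward events to the final destination (Elasticsearch, a SIEM, etc.) without the honeypot ever talking to it directly.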

SunstriderX commented 7 years ago

@t3chn0m4g3 @mattmac1 Thanks for your inputs here!

I'm trying to play with the logstash.conf file, but haven't been able to get it working. I've restarted the container, the machine, and the receiver, made sure it can receive logs, etc. My idea is to forward the same data that goes into the local Logstash to an rsyslog server I've got listening on the same network.

The flow would look like this: Honeypots -> Logstash -> Elasticsearch + rsyslog

My output section in logstash.conf looks like this:

# Output section
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  syslog {
    host => "X.X.X.X"
    port => 514
  }
}

Or, let's say I just want to send the Glastopf logs, how about:

# Output section
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  if [type] == "Glastopf" {
    syslog {
      host => "X.X.X.X"
      port => 514
    }
  }
}

Is there something I'm overlooking? Maybe incorrect syntax?

I'm not sure if I should remove the elasticsearch output, but I would rather not, so I can keep an eye on Kibana with all the info and also "enjoy" the logs on my rsyslog server :)

For some reason, I have seen many articles on how to feed Logstash with syslog, but not the other way around. Correct me if I'm wrong, but it should be possible to do it in the other direction, right? (Logstash -> rsyslog)

On the other hand, I'm not planning to deploy this on a public dmz, and I would look into hardening the system if that was the case, thanks a lot for the heads up!

Thanks a lot for your time again, once I get this working, I'll craft some documentation on it for more people to use it if interested.

SunstriderX commented 7 years ago

Alright, update on what I'm doing in case you are interested. I decided to take the approach suggested by @mattmac1 and query the logs I'm interested in directly from /data and send them to my syslog server. I wasn't receiving the info I was looking for from Logstash (I assume I lack some config and filters there), and rsyslog is doing the trick just fine for my purpose.

Sn00zr commented 7 years ago

Hey @SunstriderX or @mattmac1, any chance you could share your rsyslog.conf configuration to be used on the T-Pot host?

Much appreciated!

SunstriderX commented 7 years ago

@Sn00zr Sure thing, I'll simplify it with the lines you need to add.

$ModLoad imfile

$InputFileName /data/suricata/log/fast.log
$InputFileTag SuricataLogs
$InputFileFacility local5
$InputRunFileMonitor
local5.* @XX.XX.XX.XX:YYY 

Where XX.XX.XX.XX is the IP address of your syslog listener and YYY is the destination port (514 by default). Let's say you want to push another log source to your syslog server; you will need to repeat all the details for it. Example with 3 Suricata logs sent to your syslog listener:

$ModLoad imfile

$InputFileName /data/suricata/log/fast.log
$InputFileTag SuricataLogs
$InputFileFacility local5
$InputRunFileMonitor
local5.* @XX.XX.XX.XX:YYY 

$InputFileName /data/suricata/log/dns.log
$InputFileTag SuricataDNS
$InputFileFacility local6
$InputRunFileMonitor
local6.* @XX.XX.XX.XX:YYY 

$InputFileName /data/suricata/log/http.log
$InputFileTag SuricataHTTP
$InputFileFacility local7
$InputRunFileMonitor
local7.* @XX.XX.XX.XX:YYY 

You can configure it like this for any log file you can query from the host. In addition, I'm assuming you have properly configured the syslog listener :)
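As a side note, the directives above use rsyslog's legacy syntax. On newer rsyslog versions, the same first stanza could be written in RainerScript style roughly like this (parameter names per the imfile module documentation; treat it as a sketch and check against your rsyslog version):

```conf
# Modern-syntax equivalent of the legacy $InputFile* stanza above.
module(load="imfile")

input(type="imfile"
      File="/data/suricata/log/fast.log"
      Tag="SuricataLogs"
      Facility="local5")

local5.* @XX.XX.XX.XX:YYY
```

Either style works as long as you don't mix load statements for the same module; additional log files become additional input(...) blocks, each with its own tag and facility.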

Hope this helped!

Sn00zr commented 7 years ago

Perfect! Thank you!

Sn00zr commented 7 years ago

Just wanted to follow up on this @SunstriderX...

Are you only forwarding those 3 log sources? How about the other HoneyPots? If so, can you paste out each log? Your help is appreciated.

SunstriderX commented 7 years ago

@Sn00zr I did not configure any other log source for my testing with this environment, but you can send whatever syslogs you have. If you are using rsyslog, you can follow the same pattern I mentioned in the .conf file.

From how you are phrasing the question, just a clarification: Suricata is not a honeypot but an IDS/IPS (configured as an IDS by default in T-Pot). Suricata generates its logs in a format my syslog server can consume, so that's something you need to take into consideration as well (check the honeypots' logs to see if they suit your listener).

Herah commented 7 years ago

Hi all, related question: do you know where I can get the debug info from the Logstash stdout output, i.e. { stdout { codec => rubydebug } }? I want to see it to troubleshoot the syslog plugin setup, but I can't find where it's writing to.

ichintu commented 6 years ago

I am using 17.10. Where is the Logstash config file? How can I modify Logstash so that I can log to a central Elasticsearch cluster? Please and thank you.

t3chn0m4g3 commented 6 years ago

@ichintu It resides in the logstash container: docker exec into the container, copy the Logstash config to the host, make your modifications, and map the file back into the logstash container as a volume.
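A rough sketch of those steps, with the container name, image, and config path assumed from earlier in this thread (verify them against your T-Pot version):

```shell
# Copy the config out of the running container to the host.
# "logstash" is an assumed container name; adjust to match `docker ps`.
docker cp logstash:/etc/logstash/conf.d/logstash.conf \
          /data/elk/logstash/conf/logstash.conf

# Edit the host copy, then (re)start the container with the
# host file mapped over the in-container copy.
docker run -d --name logstash \
  -v /data/elk/logstash/conf/logstash.conf:/etc/logstash/conf.d/logstash.conf \
  <logstash-image>
```

Because the host file shadows the one baked into the image, the modification survives container restarts and image updates.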

camaro23 commented 6 years ago

Does anyone have experience with sending the Suricata logs to a Wazuh manager?

dariomad commented 1 year ago

Hi, I am trying to send logs from Logstash with the syslog output.

I get an error because the syslog output plugin is not installed. How can I install the syslog output plugin in the Logstash Docker container?
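For what it's worth, Logstash ships a plugin manager that can install the syslog output plugin (logstash-output-syslog). A sketch of running it inside the container, with the container name assumed:

```shell
# Install the syslog output plugin inside a running Logstash container.
# "logstash" is a placeholder container name; official Logstash images
# set the working directory so bin/logstash-plugin resolves.
docker exec -it logstash bin/logstash-plugin install logstash-output-syslog
```

Note the caveat raised earlier in this thread: changes made inside a container are lost when the container is rebuilt or updated, so a plugin installed this way would need to be reinstalled (or baked into a custom image) after updates.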