telekom-security / tpotce

🍯 T-Pot - The All In One Multi Honeypot Platform 🐝
GNU General Public License v3.0

Distributed Honeypot Collector #41

Closed: wh1t3-n01s3 closed this issue 8 years ago

wh1t3-n01s3 commented 8 years ago

Being able to deploy multiple T-Pots and have them send logs to a central collector or aggregator for visualization would be nice. HPFeeds does not appear to retain the same level of integrity as the events visible in the ELK stack. One option would be an optional Splunk Docker container, so a Splunk Forwarder can send events to a Splunk server, or some other forwarding method.

t3chn0m4g3 commented 8 years ago

Thanks for your feedback. Currently, development is focused around ELK. As far as I know, Splunk is not open source and is therefore not a candidate for a publicly available Docker image. Conpot, Honeytrap and Dionaea will get native JSON support, allowing for even better integration with ELK.

wh1t3-n01s3 commented 8 years ago

Would it break anything if we set the Docker containers to use Docker's native Splunk logging driver? For example, with an nginx container you can run:

docker run --publish 80:80 --log-driver=splunk --log-opt splunk-token=99E16DCD-E064-4D74-BBDA-E88CE902F600 --log-opt splunk-url=https://192.168.1.123:8088 --log-opt splunk-insecureskipverify=true nginx

and the container's logs are forwarded to the Splunk HTTP Event Collector endpoint specified when nginx is launched. If it works the way I think it does, I would simply need to modify the startup scripts for each Docker container, which would then forward logs into both Splunk and ELK.

t3chn0m4g3 commented 8 years ago

This should probably work and not break anything. Just modify the startup scripts accordingly. Is log-driver=splunk suitable for ELK, too? The ELK / Splunk receiver should listen on a different host, though, to avoid any port conflicts. Also check out the 16.10 branch, which is currently in development, since we will be switching to systemd / Ubuntu 16.04. I will keep the issue open for the next few weeks so you can post about your experience; I guess others might be interested as well.

wh1t3-n01s3 commented 8 years ago

For ELK (Logstash) you would want to use the log-driver=gelf per the Docker documentation here: https://docs.docker.com/engine/admin/logging/overview/
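
For reference, a minimal sketch of what that could look like; the UDP port 12201, the address, and the Logstash gelf input shown here are illustrative assumptions, not taken from the T-Pot configs:

docker run --publish 80:80 --log-driver=gelf --log-opt gelf-address=udp://192.168.1.123:12201 nginx

with a matching gelf input on the Logstash side:

input {
  gelf {
    port => 12201
  }
}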

t3chn0m4g3 commented 8 years ago

I do not think that you will receive the output you want, since all relevant logging information is stored within the container by supervisord. The only information you will receive is the start / stop information regarding the supervised programs within the container.

Redirecting all outputs to supervisord is mandatory, as is enabling debug mode. Here is an example for ELK:

[supervisord]
nodaemon=true
loglevel=debug

[program:elasticsearch]
; redirect_stderr merges the program's stderr into its stdout log
; (supervisord has no redirect_stdout option; stdout is captured by default)
redirect_stderr=true
command=/usr/share/elasticsearch/bin/elasticsearch
user=tpot
autorestart=true

[program:kibana]
redirect_stderr=true
command=/opt/kibana/bin/kibana
autorestart=true

[program:logstash]
redirect_stderr=true
command=/opt/logstash/bin/logstash agent -f /etc/logstash/conf.d/logstash.conf
autorestart=true
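
For completeness, a hedged sketch, not part of the config above: if a Docker log driver should also see a supervised program's output, that program's stdout log could be pointed at the container's stdout, e.g.:

[program:logstash]
redirect_stderr=true
; write the program's output to the container's stdout so a Docker log driver can pick it up
stdout_logfile=/dev/fd/1
stdout_logfile_maxbytes=0
command=/opt/logstash/bin/logstash agent -f /etc/logstash/conf.d/logstash.conf
autorestart=true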

wh1t3-n01s3 commented 8 years ago

You are correct. The best solution I've concocted so far is to use the Splunk container at https://hub.docker.com/r/outcoldman/splunk/, mount the /data folder into it, and monitor the files specified in the ELK config at https://github.com/dtag-dev-sec/elk/blob/master/logstash.conf. So far I've only experimented with the JSON output files, but Splunk indexes them perfectly, so I believe this is the most streamlined method that neither compromises community submission nor breaks the ELK stack.
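
For anyone trying the same setup, a minimal sketch; the run options, the index name, and the /data/*/log/*.json glob are assumptions about the local layout, not taken from the T-Pot or outcoldman/splunk documentation:

# mount the T-Pot /data folder read-only into the Splunk container
# (run options assumed; check the image docs for any required environment variables)
docker run -d --name splunk -p 8000:8000 -v /data:/data:ro outcoldman/splunk

together with a Splunk inputs.conf monitor stanza inside the container along these lines:

# watch the honeypot JSON logs and index them as JSON events
[monitor:///data/*/log/*.json]
sourcetype = _json
index = tpot
disabled = false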

t3chn0m4g3 commented 8 years ago

For 16.10 I will implement a solution that should fit your needs as well as others'. I will split conf.d/logstash.conf into its different services (cowrie.conf, dionaea.conf, ...), and on start supervisord will check whether dedicated config files exist in /data/elk/logstash/conf and copy them into the container. You would only need to place your Logstash configs in that folder and could then decide, per service, what Logstash should do with them.
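
As a rough illustration only (paths assumed from this thread, not the actual 16.10 implementation), such a startup check could look like this in the container's start script:

# if the host provides per-service Logstash configs, copy them over the bundled ones
if ls /data/elk/logstash/conf/*.conf >/dev/null 2>&1; then
  cp /data/elk/logstash/conf/*.conf /etc/logstash/conf.d/
fi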

t3chn0m4g3 commented 8 years ago

Merged to ELK 16.10