Closed erwanlr closed 4 years ago
hpfeeds-logger is the package that normalizes the honeypot logs. That package will need to be updated if you want extra data.
@erwanlr so by default, remote machines won't be able to connect to Kibana because it binds to localhost, not its IP or 0.0.0.0
@erwanlr if you can change the kibana config so that it binds to 0.0.0.0, that'd be great. Everything else looks good and I'll merge once this change is made.
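For reference, the binding is controlled by `server.host` in `kibana.yml` (the path below is the Debian/RPM default and may differ in this setup):

```yaml
# /etc/kibana/kibana.yml
# Default is "localhost", which only accepts local connections.
# Binding to 0.0.0.0 listens on all interfaces (see the security caveat below).
server.host: "0.0.0.0"
```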
Yes, by default all ELK runs on localhost, and services have to be exposed if needed.
I would not bind Kibana to 0.0.0.0, as there is no login form. It seems authentication can be configured (https://www.elastic.co/guide/en/kibana/current/kibana-authentication.html), but I haven't tried that.
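One common way to reach Kibana remotely while keeping it bound to localhost is an SSH tunnel (hostname, user, and port 5601 below are illustrative assumptions):

```shell
# Forward local port 5601 to Kibana on the MHN server's loopback interface.
# -N: no remote command, just the tunnel.
ssh -N -L 5601:127.0.0.1:5601 user@mhn-server.example.com
# Then browse to http://localhost:5601 on your workstation.
```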
Good point. Hmmm. We need to either add an option to bind to 0.0.0.0, or a descriptive notice for users.
Notice added
Beautiful.
Script updated and working.
A few things:

- Logstash warns: `Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}`. Not sure where this type is taken from, maybe from the `mhn-json.log` log which is then parsed by Logstash and fed to ES.
- `[org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.`
- For instance, even though the below was sent via hp_feeds, the data in `mhn-json.log` was missing the request and user_agent data:

So if you have any idea how to be able to get the 'custom' data received by HPFeeds written in the log, please share.
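The missing fields are consistent with hpfeeds-logger normalizing events against a fixed set of known fields, so anything extra in the payload is dropped before it reaches `mhn-json.log`. A minimal sketch of that behavior (the field names and whitelist here are hypothetical, not hpfeeds-logger's actual schema):

```python
import json

# Hypothetical whitelist of fields the normalizer knows about.
KNOWN_FIELDS = {"src_ip", "dst_port", "protocol", "timestamp"}

def normalize(raw_event: dict) -> dict:
    """Keep only whitelisted fields; custom fields are silently discarded."""
    return {k: v for k, v in raw_event.items() if k in KNOWN_FIELDS}

raw = {
    "src_ip": "203.0.113.5",
    "dst_port": 80,
    "protocol": "http",
    "request": "GET /admin",      # "custom" field, dropped on normalization
    "user_agent": "curl/7.68.0",  # "custom" field, dropped on normalization
}
print(json.dumps(normalize(raw), sort_keys=True))
# → {"dst_port": 80, "protocol": "http", "src_ip": "203.0.113.5"}
```

If that matches what hpfeeds-logger does, getting the custom data into the log would mean extending its field mapping for the relevant channel, as suggested earlier in the thread.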