Closed: erobertt3 closed this issue 4 years ago
Hi @erobertt3,
If I understood you correctly, you set up the Wazuh and EFK containers and managed to connect an agent to them, but Kibana is not showing results.
Let me detail the flow in order to clarify this. If you run our `docker-compose.yml`, it will set up several containers. The Wazuh manager container bundles the Wazuh manager, the API, and Filebeat. Filebeat is responsible for reading the alerts and forwarding them to Elasticsearch (ES); ES then indexes them, and you will be able to visualize them in Kibana.
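If you want to verify each hop of that flow, something like the following should work (a minimal sketch: the container name is a placeholder, it assumes Elasticsearch's port 9200 is mapped to the host as in the default compose file, and the alerts path may sit under `/var/ossec/data/` depending on the image):

```shell
# Confirm the manager is writing alerts (path inside the Wazuh container)
docker exec -it <wazuh_container> tail -n 5 /var/ossec/logs/alerts/alerts.json

# Confirm Elasticsearch is indexing them into the wazuh-alerts-* indices
curl 'http://localhost:9200/_cat/indices/wazuh-alerts-*?v'
```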
In order to deploy a quick and simple setup, simply clone our wazuh-docker repo and run `docker-compose up`.
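For reference, the whole thing is roughly this (the repo URL is our GitHub one; `-d` just runs it detached):

```shell
git clone https://github.com/wazuh/wazuh-docker.git
cd wazuh-docker
docker-compose up -d
```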
Then you will be able to access Kibana at `<your_private_IP>:5601` with the credentials `foo:bar`.
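As a quick sanity check (only if the Kibana port is actually published to the host; the credentials are the defaults mentioned above):

```shell
# Kibana exposes a status endpoint; an HTTP 200 response means it is up
curl -u foo:bar 'http://<your_private_IP>:5601/api/status'
```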
Hope it helps
Best regards,
Jose
Thanks for responding, @jm404.
So, I did actually use the wazuh-docker docker-compose setup, and it set up all of the containers you mentioned. I can access Kibana by going to https://localhost and then using foo:bar for authentication. The Kibana port 5601 isn't exposed by default; I did try exposing it and going to it that way, but it didn't change anything that I noticed. I am able to access Kibana fine and see my client as connected, and can even get some system information by going to the Inventory Data tab, but I can't see the logs from it, even though I can see those logs populating in the alerts.json file within the Wazuh container. I was thinking it could be something to do with Filebeat, but I'm not sure. Are there any changes that have to be made to the Filebeat configuration?
Also, I was wondering if Filebeat completely replaces Logstash in the ELK stack when dealing with wazuh-docker, or if Logstash needs to be installed separately for it to work? Thanks.
This is my filebeat.yml in case it helps; it could well be the culprit, but this is the default file.
```yaml
# Wazuh - Filebeat configuration file
filebeat.modules:
  - module: wazuh
    alerts:
      enabled: true
    archives:
      enabled: false

setup.template.json.enabled: true
setup.template.json.path: '/etc/filebeat/wazuh-template.json'
setup.template.json.name: 'wazuh'
setup.template.overwrite: true
setup.ilm.enabled: false

output.elasticsearch.hosts: ['http://elasticsearch:9200']
```
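One way to sanity-check this file from inside the Wazuh container would be something like the following (the container name is a placeholder; the `filebeat test` subcommands validate the config file and the connection to Elasticsearch, respectively):

```shell
# Validate the syntax of the configuration file
docker exec -it <wazuh_container> filebeat test config -c /etc/filebeat/filebeat.yml

# Check that Filebeat can actually reach and talk to Elasticsearch
docker exec -it <wazuh_container> filebeat test output -c /etc/filebeat/filebeat.yml
```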
Hi @erobertt3,
For a deployment with wazuh-docker, Logstash is not required; everything should be set up simply by running `docker-compose up`.
I suspect it's related to the `vm.max_map_count` kernel setting, which limits the number of memory map areas a process may have; Elasticsearch needs it raised to at least 262144.
Please check your containers with `docker ps -a`. If you see that the Elasticsearch container has restarted many times, it is almost certainly related to the setting mentioned above.
In order to fix that, you can execute `sudo sysctl -w vm.max_map_count=262144` on your host and then restart the containers with `docker-compose stop` and `docker-compose up`.
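Putting it together, the sequence would be roughly this (persisting the setting via /etc/sysctl.conf is optional but avoids losing it on reboot):

```shell
# Raise the memory-map limit Elasticsearch needs (takes effect immediately)
sudo sysctl -w vm.max_map_count=262144

# Optionally make the setting survive reboots
echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf

# Restart the stack
docker-compose stop
docker-compose up -d
```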
If this doesn't solve the issue, let me ask you for some additional information about your environment:
Best regards
Jose
Hi, I am fairly new to Wazuh and the ELK stack, so forgive me if this is a basic question. I wanted to get a node set up quickly to be able to mess around with it and figure out the capabilities of Wazuh, so I used the base docker-compose repo to get it running, and everything seems to be running fine. But when I connect an agent to the Wazuh manager, I can't see the data from it anywhere in the GUI. I have checked, and inside of the Wazuh container I can see real-time logs in the file /var/ossec/data/logs/alerts/alerts.json.

The main thing I am trying to view at the moment is the logs from /var/log/messages on my RHEL 7 machine. With the default configuration on the agent it seems this is being logged, but I'm wondering how to get it from that alerts.json into Elasticsearch and Kibana to be processed and visualized. Thanks in advance for any help with this.