lukelee1987 closed this issue 5 years ago.
Has anyone faced this issue before? It's a KOIOSSIAN plugin for Kibana that displays dashboards.
This error message also pops up: "Cannot read property 2 undefined."
It looks like you might not have any data in Elasticsearch yet.
Hi @robcowart, are you familiar with this? How can I check whether there is any data, and how can I trigger the import of data?
Have you set up Filebeat on the system where the Suricata logs are being written?
Filebeat must run on the system where the Suricata logs are. An example config is provided.
Filebeat will send the data to Logstash, which will process it and send it to Elasticsearch.
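A minimal filebeat.yml along those lines (a sketch only; the eve.json path and port 5044 match what is used elsewhere in this thread, so adjust for your environment) looks roughly like this:

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/suricata/eve.json   # Suricata EVE JSON output

# Ship the raw lines to Logstash; the synlite_suricata pipeline does the parsing.
output.logstash:
  hosts: ["127.0.0.1:5044"]

# Do not enable output.elasticsearch at the same time,
# or events will bypass the Logstash pipeline entirely.
```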
@robcowart @lukelee1987 Hello, I have the same issue. I followed Rob's guide to configure Filebeat and Suricata, but there is still no data in Kibana. It seems Filebeat cannot send data to Logstash, and I don't know whether some configuration is missing or wrong. @robcowart Rob, could you please help check this issue? Thanks very much.
@robcowart My ELK version is 6.6.2, suricata version is 4.1.3
Hi @robcowart, I installed Elastic Stack 7.1.1 with Filebeat and tried installing synesis_lite_suricata using your latest files (v1.1.0) and instructions, but now I am getting the same errors as the people above when I go to the Dashboards and open Suricata: XXX.
"The request for this panel failed. -- The aggregations key is missing from the response, check your permissions for this request."
and
"Error in Visualization -- Cannot read property 2 undefined. "
I have everything installed on 1 VPS and am trying to bring in the suricata logs from the server itself and not from any other external VPS.
In Kibana, when I click on Discover I have 3 index patterns: filebeat-*, suricata-* and suricata_stats-*. When I select filebeat-*, I see logs from /var/log/auth.log and /var/log/suricata/eve.json. But when I select suricata-* or suricata_stats-* it says "No results match your search criteria".
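For what it's worth, the document counts can also be checked straight from Elasticsearch (assuming it listens on the default localhost:9200):

```sh
curl -s 'http://localhost:9200/_cat/indices/suricata*?v'
```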
When I look at my /var/log/logstash/logstash-plain.log file I see the following error:
[2019-05-31T18:17:00,386][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-05-31T18:17:06,652][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:synlite_suricata
Plugin: <LogStash::Inputs::Beats host=>"0.0.0.0", id=>"input_beats", client_inactivity_timeout=>180, port=>5044, enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_fb23ba7d-5baa-48f5-94f6-f6edabfdca14", enable_metric=>true, charset=>"UTF-8">, ssl=>false, add_hostname=>false, ssl_verify_mode=>"none", ssl_peer_metadata=>false, include_codec_tag=>true, ssl_handshake_timeout=>10000, tls_min_version=>1, tls_max_version=>1.2, cipher_suites=>["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"], executor_threads=>2>
Error: Address already in use
Exception: Java::JavaNet::BindException
Stack: sun.nio.ch.Net.bind0(Native Method)
Do you have any suggestions for configuration changes, or anything else I can try, to get the logs to come in under the suricata-* and suricata_stats-* indices and the data to show in the dashboards?
Is there any other information I can provide you to help troubleshoot this better?
Thank you in advance.
It sounds like you have Filebeat configured to send data directly to Elasticsearch and not to Logstash. Can you share your filebeat.yml config?
I had originally set filebeat.yml to send my /var/log/*.log data to Logstash. After I added the configuration for synesis_lite_suricata I commented out that filebeat.inputs part of the filebeat.yml file to try to troubleshoot the Suricata logs coming in, but that did not seem to have any effect.
I also tried changing output.logstash from hosts: ["localhost:5044"] to hosts: ["127.0.0.1:5044"], but that also did not have any effect.
Per your request I have attached a copy of my filebeat.yml configuration
Can you confirm that Suricata is actually writing logs to /var/log/suricata/eve.json?
Yes there are LOTS of entries in the /var/log/suricata/eve.json file.
Also, additional information for you. The Kibana Discover does show entries for the filebeat-* index, but the dashboards that I imported from filebeat are NOT showing any data such as SSH login attempts, Sudo commands, Syslog.
Any other info I can provide you?
Any indices and dashboards directly from Filebeat have nothing to do with this solution.
Are the entries being written live to eve.json (i.e. are they new)? Filebeat remembers what is already read from a file, and will not re-read that data unless its data directory is deleted when it is restarted.
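If you need to force a full re-read for testing, something along these lines should work on a default deb/rpm install (the registry path can differ depending on how Filebeat was installed, so verify it first):

```sh
sudo systemctl stop filebeat
sudo rm -rf /var/lib/filebeat/registry   # default data dir on deb/rpm installs; check yours before deleting
sudo systemctl start filebeat
```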
Yes, when I run "tail -f /var/log/suricata/eve.json" I can see new entries being logged in real time.
This is my testing VPS. If it is easier and faster for troubleshooting I can provide you with SSH access to the VPS or I am available for a desktop sharing with skype or teamviewer or similar so you can access the VPS. Whichever is most convenient to you.
It looks like the problem I had was two different Logstash pipelines (filebeat and synesis_lite_suricata) that were both pointing to port 5044.
I had set up input, filter, and output config files for filebeat to send to logstash
/etc/logstash/conf.d/02-beats-input.conf
/etc/logstash/conf.d/10-syslog-filter.conf
/etc/logstash/conf.d/30-elasticsearch-output.conf
and then I had added in the input, filter, and output config files for synesis_lite at
/etc/logstash/synlite_suricata/conf.d/10_input_beats.logstash.conf
/etc/logstash/synlite_suricata/conf.d/20_filter_suricata.logstash.conf
/etc/logstash/synlite_suricata/conf.d/30_output_elasticsearch.logstash.conf
and they were conflicting with each other.
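A quick way to confirm that kind of conflict is to check what is already listening on port 5044, for example:

```sh
sudo ss -tlnp | grep ':5044'
```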
So I renamed the filebeat-logstash config files so they would NOT be called up:
/etc/logstash/conf.d/02-beats-input.conf.DISABLED
/etc/logstash/conf.d/10-syslog-filter.conf.DISABLED
/etc/logstash/conf.d/30-elasticsearch-output.conf.DISABLED
I also commented out the section of the /etc/logstash/pipelines.yml file that loads the /etc/logstash/conf.d/*.conf files, so the /etc/logstash/pipelines.yml file now looks like the following:

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"

- pipeline.id: synlite_suricata
  path.config: "/etc/logstash/synlite_suricata/conf.d/*.conf"
Then I restarted all the ELK and Suricata services, and after a couple of minutes it started bringing in the suricata-* and suricata_stats-* index data and the dashboards started loading properly.
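For reference, on a systemd-based install the restarts amount to something like:

```sh
sudo systemctl restart suricata filebeat logstash kibana elasticsearch
```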
Thank you for your assistance
Please try the latest release, v1.1.0, with Elastic Stack 7.
Thanks Rob, the ELK-Suricata setup works fine for me. @netmerchant If you follow the instructions, and the eve.json file has real-time logs, I think ELK-Suricata will work fine; you don't need to modify the synesis_lite configuration or pipelines files. After installing Suricata, you need to enable the Suricata IDS to monitor traffic on a specific interface, and then the Kibana dashboards will display the log data.
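For example, to have Suricata monitor a specific interface you can run something like this (the interface name here is just a placeholder):

```sh
sudo suricata -c /etc/suricata/suricata.yaml -i eth0
```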
Thanks for the feedback @yangcaixing. I am glad you got it working @netmerchant.
Hi, sorry, I still need some help. My Logstash cannot run properly; it automatically stops after a few seconds.
Can I use filebeat instead of logstash to collect and transfer the data to ES?
Dear all, I am facing this issue: "The request for this panel failed. -- The aggregations key is missing from the response, check your permissions for this request."
Kindly support. Thanks !!