rajeshj-rpx opened this issue 9 years ago
There are a few things that may cause this.
If you have set `realert`, then it can ignore some alerts after the first one is sent. If this was the case, you would see `Ignoring match for silenced alert HTTP Response codes`.
If you have `aggregation` set, it will save the alert and send it at a future time.
The alert may have encountered an exception of some kind. If this had happened, you would probably see a traceback and a line like `Uncaught exception running rule ...`.
If none of these are the case, I would need more details to help you diagnose the problem: more log lines surrounding that line, and the rule yaml file.
Hi Quentin Long, I have not set any of these; I was using the standard settings. I am furnishing the configuration files and logs below for your perusal. I am unable to attach log files (might be an issue with GitHub), so I am doing a copy-paste.
rules_folder: /opt/elastalert/conf.d
run_every:
  minutes: 5
buffer_time:
  minutes: 60
es_host: elasticsearch_hostfqdn
es_port: 9200
writeback_index: elastalert_status
alert_time_limit:
  days: 2
smtp_host: 'mysmtpserverfqdn'
smtp_port: 587
smtp_ssl: true
from_addr: 'notifications@mydomain.com'
smtp_auth_file: '/opt/elastalert/smtp_auth_file'
es_host: elasticsearch_hostfqdn
es_port: 9200
name: HTTP Response codes
type: frequency
index: logstash-*
num_events: 5
timeframe:
  hours: 4
filter:
INFO:elastalert:Skipping writing to ES: {'hits': 1040, 'matches': 18, '@timestamp': '2015-09-28T01:26:21.677151Z', 'rule_name': 'HTTP Response codes', 'starttime': '2015-09-28T00:26:20.860720Z', 'endtime': '2015-09-28T01:26:20.860720Z', 'time_taken': 0.8164188861846924}
INFO:elastalert:Ran HTTP Response codes from 9-28 0:26 UTC to 9-28 1:26 UTC: 1040 query hits, 18 matches, 0 alerts sent
INFO:elastalert:Sleeping for 299 seconds
It appears that you are running elastalert with `--debug`. This will cause elastalert to log the alert body to console instead of sending an email. It would have appeared right before that first `Skipping writing to ES` message. Try using `--verbose` instead of `--debug`.
No luck, Quentin Long. These are the tailing logs:
INFO:elastalert:Ignoring match for silenced rule HTTP Response codes
INFO:elastalert:Ignoring match for silenced rule HTTP Response codes
INFO:elastalert:Ran HTTP Response codes from 9-25 15:21 UTC to 9-29 1:44 UTC: 210194 query hits, 42038 matches, 0 alerts sent
INFO:elastalert:Sleeping for 165 seconds
INFO:elastalert:Sleeping for 299 seconds
Could you share some working config files with which mails were actually received?
There must have been an error that is not in the logs you have shared. Can you run this command and tell me what it returns?
curl localhost:9200/elastalert_status/elastalert_error/_search
Hi Long,
I am herewith attaching redirected results of below command for your perusal.
Regards, Rajesh JVLN
{"took":5,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":70,"max_score":1.0,"hits":[{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVADnTOLTEOSl7Xnr_de","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: Connection unexpectedly closed", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib/python2.7/smtplib.py\", line 723, in sendmail", " self.rset()", " File \"/usr/lib/python2.7/smtplib.py\", line 462, in rset", " return self.docmd(\"rset\")", " File \"/usr/lib/python2.7/smtplib.py\", line 387, in docmd", " return self.getreply()", " File \"/usr/lib/python2.7/smtplib.py\", line 363, in getreply", " raise SMTPServerDisconnected(\"Connection unexpectedly closed\")", "SMTPServerDisconnected: Connection unexpectedly closed"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-09-25T08:27:25.450339Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVADq6xHTEOSl7XnsEOy","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: {'rjonnalagadda_c@rpxcorp.com': (554, '5.7.1 rjonnalagadda_c@rpxcorp.com: Relay access denied')}", "traceback": ["Traceback (most recent call last):", " File \"/opt/elastalert/elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"/opt/elastalert/elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File 
\"/usr/lib/python2.7/smtplib.py\", line 735, in sendmail", " raise SMTPRecipientsRefused(senderrs)", "SMTPRecipientsRefused: {'rjonnalagadda_c@rpxcorp.com': (554, '5.7.1 rjonnalagadda_c@rpxcorp.com: Relay access denied')}"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-09-25T08:43:13.863335Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAjlb2yTEOSl7XnPa6V","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: Connection unexpectedly closed", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib/python2.7/smtplib.py\", line 723, in sendmail", " self.rset()", " File \"/usr/lib/python2.7/smtplib.py\", line 462, in rset", " return self.docmd(\"rset\")", " File \"/usr/lib/python2.7/smtplib.py\", line 387, in docmd", " return self.getreply()", " File \"/usr/lib/python2.7/smtplib.py\", line 363, in getreply", " raise SMTPServerDisconnected(\"Connection unexpectedly closed\")", "SMTPServerDisconnected: Connection unexpectedly closed"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-10-01T13:27:07.441167Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAjqx4hTEOSl7XnPo9U","_score":1.0,"_source":{"message": "Error while running alert email: Error connecting to SMTP host: [Errno -2] Name or service not known", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 239, in alert", " raise EAException(\"Error connecting to SMTP host: %s\" % (e))", "EAException: Error connecting 
to SMTP host: [Errno -2] Name or service not known"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-10-01T13:50:28.385057Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAjrAsFTEOSl7XnPpwk","_score":1.0,"_source":{"message": "Error while running alert email: Error connecting to SMTP host: [Errno -2] Name or service not known", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 239, in alert", " raise EAException(\"Error connecting to SMTP host: %s\" % (e))", "EAException: Error connecting to SMTP host: [Errno -2] Name or service not known"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-10-01T13:51:29.029205Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAjsXoTTEOSl7XnPsp3","_score":1.0,"_source":{"message": "Error while running alert email: Error connecting to SMTP host: [Errno -2] Name or service not known", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 239, in alert", " raise EAException(\"Error connecting to SMTP host: %s\" % (e))", "EAException: Error connecting to SMTP host: [Errno -2] Name or service not known"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-10-01T13:57:25.138408Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAjrsAdTEOSl7XnPrNk","_score":1.0,"_source":{"message": "Error while running alert email: Error connecting to SMTP host: [Errno -2] Name or service not known", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 239, in alert", " raise EAException(\"Error connecting to SMTP host: %s\" % (e))", 
"EAException: Error connecting to SMTP host: [Errno -2] Name or service not known"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-10-01T13:54:26.461314Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVADfSByTEOSl7Xnr1Ko","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: (530, '5.5.1 Authentication Required. Learn more at\n5.5.1 https://support.google.com/mail/answer/14257 a9sm948957qgf.13 - gsmtp', 'ElastAlert')", "traceback": ["Traceback (most recent call last):", " File \"elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib/python2.7/smtplib.py\", line 724, in sendmail", " raise SMTPSenderRefused(code, resp, from_addr)", "SMTPSenderRefused: (530, '5.5.1 Authentication Required. 
Learn more at\n5.5.1 https://support.google.com/mail/answer/14257 a9sm948957qgf.13 - gsmtp', 'ElastAlert')"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-09-25T07:52:23.409505Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAD37d2TEOSl7Xnsc4A","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: Connection unexpectedly closed", "traceback": ["Traceback (most recent call last):", " File \"/opt/elastalert/elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"/opt/elastalert/elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib/python2.7/smtplib.py\", line 723, in sendmail", " self.rset()", " File \"/usr/lib/python2.7/smtplib.py\", line 462, in rset", " return self.docmd(\"rset\")", " File \"/usr/lib/python2.7/smtplib.py\", line 387, in docmd", " return self.getreply()", " File \"/usr/lib/python2.7/smtplib.py\", line 363, in getreply", " raise SMTPServerDisconnected(\"Connection unexpectedly closed\")", "SMTPServerDisconnected: Connection unexpectedly closed"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-09-25T09:40:04.597529Z"}},{"_index":"elastalert_status","_type":"elastalert_error","_id":"AVAD6OJuTEOSl7XnshXO","_score":1.0,"_source":{"message": "Uncaught exception running rule HTTP Response codes: Connection unexpectedly closed", "traceback": ["Traceback (most recent call last):", " File \"/opt/elastalert/elastalert/elastalert.py\", line 840, in alert", " return self.send_alert(matches, rule, alert_time=None)", " File \"/opt/elastalert/elastalert/elastalert.py\", line 913, in send_alert", " alert.alert(matches)", " File \"/opt/elastalert/elastalert/alerts.py\", line 242, in alert", " self.smtp.sendmail(self.from_addr, to_addr, 
email_msg.as_string())", " File \"/usr/lib/python2.7/smtplib.py\", line 723, in sendmail", " self.rset()", " File \"/usr/lib/python2.7/smtplib.py\", line 462, in rset", " return self.docmd(\"rset\")", " File \"/usr/lib/python2.7/smtplib.py\", line 387, in docmd", " return self.getreply()", " File \"/usr/lib/python2.7/smtplib.py\", line 363, in getreply", " raise SMTPServerDisconnected(\"Connection unexpectedly closed\")", "SMTPServerDisconnected: Connection unexpectedly closed"], "data": {"rule": "HTTP Response codes"}, "@timestamp": "2015-09-25T09:50:05.422303Z"}}]}}
Hi Rajesh,
I am also having the same problem of getting 0 alerts while having non-zero matches. Can you please tell me how you got it resolved and what the problem was?
Thanks Madhvi
I configured a local mail server proxy to fix this issue. Now I am able to receive alerts.
My mail servers are properly configured, but I get alerts sometimes and other times it does not send the alerts despite having matches. I do not understand why it behaves like this.
Can you share the logs and the rule YAML? If your matches occur right next to each other, it will only send the first unless you add
realert:
  minutes: 0
Following are my tailing logs after I run a particular rule:
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ignoring match for silenced rule freq12345
INFO:elastalert:Ran freq12345 from 12-7 14:48 IST to 12-7 14:59 IST: 1863 query hits, 405 matches, 0 alerts sent
As I am new to elastalert, I am currently in the testing phase. When I run the same rule again, it sends the alert.
Add the `realert` setting from my last comment and you should get all the alerts. The default is 1 minute, which prevents it from generating hundreds of alerts at once.
After setting realert to minutes: 0 it was continuously sending mails, but when I explicitly set realert to minutes: 1, then after sending the alert once it does not send again despite having matches.
I can see from your logs that in a single minute, there are 405 matches.
Ran freq12345 from 12-7 14:48 IST to 12-7 14:59 IST: 1863 query hits, 405 matches, 0 alerts sent
Unfortunately you cannot control the number of alerts much more granularly. These 405 matches are all processed one after the other. A realert setting of 0 will send every single one, and a realert setting of 1 minute will send the first and then skip the rest.
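The realert behavior can be sketched as a simple suppression window. This is a hypothetical simplification for illustration, not ElastAlert's actual code:

```python
from datetime import datetime, timedelta

def alerts_sent(match_times, realert):
    """Count how many matches would actually alert, given a realert window.

    A match alerts only if the rule is not currently silenced; each alert
    then silences the rule for `realert` afterwards.
    """
    silenced_until = None
    sent = 0
    for t in sorted(match_times):
        if silenced_until is not None and t < silenced_until:
            # corresponds to "Ignoring match for silenced rule ..."
            continue
        sent += 1
        silenced_until = t + realert
    return sent

# 405 matches packed close together (0.1 s apart, well under a minute):
start = datetime(2015, 12, 7, 14, 48)
matches = [start + timedelta(milliseconds=100 * i) for i in range(405)]

print(alerts_sent(matches, timedelta(minutes=0)))  # 405: every match alerts
print(alerts_sent(matches, timedelta(minutes=1)))  # 1: first alerts, rest silenced
```

With `realert: 0` the window never covers the next match, so all 405 fire; with a 1-minute window the first alert silences everything that follows in that burst.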
One thing you can try is to set, for example,
aggregation:
  minutes: 5
This will send all 405 of those matches together in a single email, so that you don't get spammed, but you also don't miss anything.
Hi Qmando,
I am having the same problem, when I run python -m elastalert.elastalert --verbose --rule example_rules/example_frequency.yaml
this is the message now.
INFO:elastalert:Starting up
INFO:elastalert:Queried rule Example rule DOTs from 2016-03-01 18:45 UTC to 2016-03-01 18:48 UTC: 5 hits
ERROR:root:Traceback (most recent call last):
File "/root/elastalert-master/elastalert/elastalert.py", line 879, in alert
return self.send_alert(matches, rule, alert_time=alert_time)
File "/root/elastalert-master/elastalert/elastalert.py", line 952, in send_alert
alert.alert(matches)
File "elastalert/alerts.py", line 252, in alert
self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())
File "/usr/lib64/python2.7/smtplib.py", line 731, in sendmail
raise SMTPSenderRefused(code, resp, from_addr)
SMTPSenderRefused: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')
ERROR:root:Uncaught exception running rule Example rule DOTs: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')
INFO:elastalert:Rule Example rule DOTs disabled
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ran Example rule DOTs from 2016-03-01 18:45 UTC to 2016-03-01 18:48 UTC: 5 query hits, 2 matches, 0 alerts sent
INFO:elastalert:Sleeping for 56 seconds
INFO:elastalert:Sleeping for 59 seconds
INFO:elastalert:Sleeping for 59 seconds
and if I run curl localhost:9200/elastalert_status/elastalert_error/_search?pretty=true
{
"took" : 40,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 3,
"max_score" : 1.0,
"hits" : [ {
"_index" : "elastalert_status",
"_type" : "elastalert_error",
"_id" : "AVMzeJugn9E3ZsEUE6_E",
"_score" : 1.0,
"_source":{"message": "Uncaught exception running rule Example rule DOTs: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')", "traceback": ["Traceback (most recent call last):", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 879, in alert", " return self.send_alert(matches, rule, alert_time=alert_time)", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 952, in send_alert", " alert.alert(matches)", " File \"elastalert/alerts.py\", line 252, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib64/python2.7/smtplib.py\", line 731, in sendmail", " raise SMTPSenderRefused(code, resp, from_addr)", "SMTPSenderRefused: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')"], "data": {"rule": "Example rule DOTs"}, "@timestamp": "2016-03-01T18:37:35.519593Z"}
}, {
"_index" : "elastalert_status",
"_type" : "elastalert_error",
"_id" : "AVMzf8Syn9E3ZsEUE6_L",
"_score" : 1.0,
"_source":{"message": "Uncaught exception running rule Example rule DOTs: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')", "traceback": ["Traceback (most recent call last):", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 879, in alert", " return self.send_alert(matches, rule, alert_time=alert_time)", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 952, in send_alert", " alert.alert(matches)", " File \"elastalert/alerts.py\", line 252, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib64/python2.7/smtplib.py\", line 731, in sendmail", " raise SMTPSenderRefused(code, resp, from_addr)", "SMTPSenderRefused: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')"], "data": {"rule": "Example rule DOTs"}, "@timestamp": "2016-03-01T18:45:24.785604Z"}
}, {
"_index" : "elastalert_status",
"_type" : "elastalert_error",
"_id" : "AVMzgm6rn9E3ZsEUE6_O",
"_score" : 1.0,
"_source":{"message": "Uncaught exception running rule Example rule DOTs: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')", "traceback": ["Traceback (most recent call last):", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 879, in alert", " return self.send_alert(matches, rule, alert_time=alert_time)", " File \"/root/elastalert-master/elastalert/elastalert.py\", line 952, in send_alert", " alert.alert(matches)", " File \"elastalert/alerts.py\", line 252, in alert", " self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())", " File \"/usr/lib64/python2.7/smtplib.py\", line 731, in sendmail", " raise SMTPSenderRefused(code, resp, from_addr)", "SMTPSenderRefused: (553, '5.5.4 <ElastAlert>... Domain name required for sender address ElastAlert', 'ElastAlert')"], "data": {"rule": "Example rule DOTs"}, "@timestamp": "2016-03-01T18:48:19.370666Z"}
} ]
}
}
Here is my example_rules/example_frequency.yaml file:
# Alert when the rate of events exceeds a threshold
# (Optional)
# Elasticsearch host
# es_host: elasticsearch.example.com
# (Optional)
# Elasticsearch port
# es_port: 14900
# (OptionaL) Connect with SSL to elasticsearch
#use_ssl: True
# (Optional) basic-auth username and password for elasticsearch
#es_username: someusername
#es_password: somepassword
# (Required)
# Rule name, must be unique
name: Example rule DOTs
# (Required)
# Type of alert.
# the frequency rule type alerts when num_events events occur with timeframe time
type: frequency
# (Required)
# Index to search, wildcard supported
index: logstash-*
# (Required, frequency specific)
# Alert when this many documents matching the query occur within a timeframe
num_events: 2
# (Required, frequency specific)
# num_events must occur within this amount of time to trigger an alert
timeframe:
  hours: 2
# (Required)
# A list of elasticsearch filters used to find events
# These filters are joined with AND and nested in a filtered query
# For more info: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl.html
filter:
- term:
    type: "jdbc"
# (Required)
# The alert is used when a match is found
alert:
- "email"
# (required, email specific)
# a list of email addresses to send alerts to
email:
- "My-Email-Address@gmail.com"
and here is my config.yaml
# This is the folder that contains the rule yaml files
# Any .yaml file will be loaded as a rule
rules_folder: example_rules
# How often ElastAlert will query elasticsearch
# The unit can be anything from weeks to seconds
run_every:
  minutes: 1
# ElastAlert will buffer results from the most recent
# period of time, in case some log sources are not in real time
buffer_time:
  minutes: 15
# The elasticsearch hostname for metadata writeback
# Note that every rule can have its own elasticsearch host
es_host: 127.0.0.1
# The elasticsearch port
es_port: 9200
# Optional URL prefix for elasticsearch
#es_url_prefix: elasticsearch
# Connect with SSL to elasticsearch
#use_ssl: True
# Optional basic-auth username and password for elasticsearch
#es_username: someusername
#es_password: somepassword
# The index on es_host which is used for metadata storage
# This can be an unmapped index, but it is recommended that you run
# elastalert-create-index to set a mapping
writeback_index: elastalert_status
# If an alert fails for some reason, ElastAlert will retry
# sending the alert until this time period has elapsed
alert_time_limit:
  days: 2
Can you help me?
from_addr: elastalert@example.com
This is something that varies per SMTP server. Some, when given a "from" address without a domain, will append their own domain; some, apparently, will return this error. By default, the `from_addr` is only "elastalert", without any domain attached.
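To see where from_addr ends up, here is a rough sketch of what an email alerter does with it. The addresses and values are hypothetical, and the real code in alerts.py differs:

```python
from email.mime.text import MIMEText

# Hypothetical values; the default from_addr is just "elastalert", no domain.
from_addr = "elastalert@example.com"
msg = MIMEText("2 events matched rule 'Example rule DOTs'")
msg["Subject"] = "ElastAlert: Example rule DOTs"
msg["From"] = from_addr
msg["To"] = "you@example.com"

# ElastAlert then passes from_addr to smtplib as the envelope sender, roughly:
#   import smtplib
#   smtplib.SMTP(smtp_host).sendmail(from_addr, [msg["To"]], msg.as_string())
# A server that requires a fully qualified sender rejects a bare "elastalert"
# at that point with a 553, as in the traceback above.
print(msg["From"])
```

Setting a fully qualified from_addr means the 553 check on the envelope sender passes.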
Sorry, new to this. Do I need to add it to my example_rules/example_frequency.yaml like so?
# (required, email specific)
# a list of email addresses to send alerts to
email:
from_addr: My-Email-Address@gmail.com
It can go in either config.yaml (applies to all rules) or in individual rules. It's separate from the recipient setting though.
email:
- "My-Email-Address@gmail.com"
from_addr: "My-Email-Address@gmail.com"
And if I wanted it to go to some other email, would it look like this?
email:
- "My-Email-Address@gmail.com"
from_addr: "My-Email-Address@gmail.com"
to_addr: "My-Other_Email-Add@gmail.com"
No. `to_addr` does not exist; that's what `email` does.
OK, that worked:
email:
- "My-Email-Address@gmail.com"
from_addr: "My-Email-Address@gmail.com"
Here is the message.
INFO:elastalert:Sleeping for 58 seconds
INFO:elastalert:Queried rule Example rule DOTs from 2016-03-01 19:18 UTC to 2016-03-01 19:33 UTC: 26 hits
INFO:elastalert:Sent email to ['My-Email-Address@gmail.com']
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ignoring match for silenced rule Example rule DOTs
INFO:elastalert:Ran Example rule DOTs from 2016-03-01 19:18 UTC to 2016-03-01 19:33 UTC: 26 query hits, 13 matches, 1 alerts sent
I hope this helps others...
Thanks for your support.
Hi,
I got the same issue here and I tried to add `from_addr`, but it still doesn't work.
Here is what it said:
INFO:elastalert:Ran Example rule from 2016-07-08 23:05 UTC to 2016-07-08 23:52 UTC: 6212 query hits, 1242 matches, 0 alerts sent
INFO:elastalert:Sleeping for 54 seconds
ERROR:root:Error while running alert email: Error connecting to SMTP host: [Errno 111] Connection refused
I don't know why we need to set up where the email is from. If I just want to get an alerting email from elastalert, why did Rajesh configure his email like the above?
smtp_host: 'mysmtpserverfqdn'
smtp_port: 587
smtp_ssl: true
from_addr: 'notifications@mydomain.com'
smtp_auth_file: '/opt/elastalert/smtp_auth_file'
Elastalert is getting 5 matches but only sending 1 alert. Here is the log:
/elastalert$ elastalert --verbose --config config.yaml
INFO:elastalert:Starting up
INFO:elastalert:Queried rule Blob mismatches from 2016-12-24 15:02 IST to 2016-12-24 15:06 IST: 0 / 0 hits
INFO:elastalert:Ran Blob mismatches from 2016-12-24 15:02 IST to 2016-12-24 15:06 IST: 0 query hits, 0 matches, 0 alerts sent
INFO:elastalert:Sleeping for 59 seconds
INFO:elastalert:Queried rule Blob mismatches from 2016-12-24 15:02 IST to 2016-12-24 15:07 IST: 5 / 5 hits
INFO:elastalert:Alert sent to Slack
INFO:elastalert:Ignoring match for silenced rule Blob mismatches
INFO:elastalert:Ignoring match for silenced rule Blob mismatches
INFO:elastalert:Ignoring match for silenced rule Blob mismatches
INFO:elastalert:Ignoring match for silenced rule Blob mismatches
INFO:elastalert:Ran Blob mismatches from 2016-12-24 15:02 IST to 2016-12-24 15:07 IST: 5 query hits, 5 matches, 1 alerts sent
INFO:elastalert:Sleeping for 58 seconds
My config.yaml
es_host: localhost
es_port: 9200
rules_folder: rules
run_every:
  minutes: 1
buffer_time:
  minutes: 5
realert:
  minutes: 0
writeback_index: elastalert_status
My rule file
name: Blob mismatches
index: pv-logs
type: any
filter:
- terms:
    event.keyword: ["abc", "xyz"]
alert:
- "slack"
slack_webhook_url: 'https://hooks.slack.com/services/<token>'
slack_username_override: 'bot-bot'
What am I missing?
My bad. :weary: :sob: For anyone else who encounters the same problem: you need to set the realert rule in each rule's file, not in the global config file.
realert:
  minutes: 0
I noticed that I had set the realert period to 12 hours for 2 test websites I was querying. After setting the realert period to 0 minutes, I never got alerts from those websites again.
I added a new test website to the config file, and I now get every single alert for the new website, but the realert is still silenced for the older websites.
For some reason, is ElastAlert not updating the realert setting per website?
There is an issue when the rule changes. When you had it set to 12 hours, it created a document with `_type: silence` for that rule. Even after setting it to 0, it will still respect the original 12 hour stash. This is a known issue. A workaround would be to either delete the document from elasticsearch or change the name of the rule slightly.
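The delete workaround might look roughly like the following. The index and `_type` names come from this thread; the `rule_name` field and the document id are assumptions, so check the search output first:

```
# Find the silence document for the rule (field name assumed to be rule_name):
curl 'localhost:9200/elastalert_status/silence/_search?q=rule_name:"My Rule"&pretty'

# Then delete it by the _id returned above:
curl -XDELETE 'localhost:9200/elastalert_status/silence/<doc_id>'
```

Renaming the rule works because the silence document is keyed to the rule name, so a new name never matches the old stash.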
> It appears that you are running elastalert with `--debug`. This will cause elastalert to log the alert body to console instead of sending an email. It would have appeared right before that first `Skipping writing to ES` message. Try using `--verbose` instead of `--debug`.

This helped me.
I have set up elastalert and the message below is seen:
INFO:elastalert:Ran HTTP Response codes from 9-25 8:29 UTC to 9-25 8:41 UTC: 591 query hits, 9 matches, 0 alerts sent
Please help us.