@lbunk Did you find any solution? I have faced a similar issue with my installation as well.
@maadjahangir Yes, I found a solution. However, I still don't know why my posted configuration results in that strange behaviour.
Add the following statements to your configuration and play a bit with the values. (Warning: with these values the mailbox will be heavily spammed!)
group_by: ['service']
group_wait: 30s
group_interval: 0s
repeat_interval: 0s
Please tell me if this works for you too.
Please do not ever set your timer values to 0. It will literally make your CPU spin and be an invalid configuration in future versions. Is setting these from a proper duration to 0 actually what's fixing the issue?
@fabxc Yes, this fixed the issue. But the fix was adding the statements at all, not setting the durations to 0; it works with proper durations as well.
Btw., if you just use the group_interval statement without defining a group, the Alertmanager sends a notification every 5 minutes, regardless of the actual value. (Using this statement without any groups makes no sense; nevertheless, this behaviour is a bit strange.)
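For reference, a route block with proper (non-zero) durations might look roughly like this — the values and the receiver name below are purely illustrative:

```yaml
route:
  group_by: ['service']
  group_wait: 30s      # how long to wait before sending the first notification for a new group
  group_interval: 5m   # how long to wait before notifying about new alerts added to an existing group
  repeat_interval: 4h  # how long to wait before re-sending a notification that was already sent
  receiver: team-mails # placeholder; must match a receiver defined under receivers:
```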
I am facing the same issue even after adding the following statements:

```yaml
global:
  smtp_smarthost: 'localhost:25'
  smtp_from: 'alertmanager@example.org'
  smtp_require_tls: false

templates:

route:
  group_by: ['alertname']
  group_wait: 30s
  group_interval: 5m
  repeat_interval: 10m
  receiver: team-X-mails

receivers:
```

Alertmanager logs look clean -
Oct 25 14:49:53 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:49:53.53027003Z caller=silence.go:262 component=silences msg="Running maintenance"
Oct 25 14:49:53 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:49:53.531204407Z caller=nflog.go:287 component=nflog msg="Running maintenance"
Oct 25 14:49:53 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:49:53.535766019Z caller=silence.go:279 component=silences msg="Maintenance done" duration=349.204µs
Oct 25 14:49:53 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:49:53.536743689Z caller=nflog.go:304 component=nflog msg="Maintenance done" duration=32.806µs
Oct 25 14:57:01 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:57:01.490769519Z caller=main.go:395 msg="Received SIGTERM, exiting gracefully..."
Oct 25 14:57:01 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:57:01.491008094Z caller=silence.go:262 component=silences msg="Running maintenance"
Oct 25 14:57:01 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:57:01.49191048Z caller=nflog.go:287 component=nflog msg="Running maintenance"
Oct 25 14:57:01 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:57:01.495692532Z caller=silence.go:279 component=silences msg="Maintenance done" duration=55.096µs
Oct 25 14:57:01 usdf23v0328 alertmanager: level=info ts=2018-10-25T19:57:01.499754381Z caller=nflog.go:304 component=nflog msg="Maintenance done" duration=83.123µs
Hi, I am facing an alerting problem. Alertmanager is able to receive the alerts, but fails while sending notifications through the Google SMTP server, reporting too many login attempts:
level=error ts=2020-05-01T05:47:24.117Z caller=notify.go:372 component=dispatcher msg="Error on notify" err="*email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. o21sm1168951pgk.16 - gsmtp" context_err="context deadline exceeded"
level=error ts=2020-05-01T05:47:24.117Z caller=dispatch.go:301 component=dispatcher msg="Notify for alerts failed" num_alerts=2 err="email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. h12sm1278948pfq.176 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. a22sm1148669pga.28 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. k12sm1264192pfp.158 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. e11sm1294286pfl.85 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. c124sm1252180pfb.187 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. z15sm1160706pjt.20 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. c80sm1266606pfb.82 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. f99sm1167068pjg.22 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. z190sm1297780pfb.1 - gsmtp; email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. o21sm1168951pgk.16 - gsmtp"
level=error ts=2020-05-01T05:47:24.118Z caller=notify.go:372 component=dispatcher msg="Error on notify" err="*email.loginAuth auth: 454 4.7.0 Too many login attempts, please try again later. h193sm1294970pfe.30 - gsmtp" context_err="context deadline exceeded"
This is my Alertmanager config:

```yaml
global:
  resolve_timeout: 2m
  smtp_smarthost: 'smtp.gmail.com:587'
  smtp_from: '*******'
  smtp_auth_username: '*****'
  smtp_auth_password: '******'
  smtp_require_tls: true
  #send_resolved: true
route:
  group_by: [alertname]
  group_wait: 40s
  group_interval: 10s
  repeat_interval: 10s
  receiver: 'default_receiver'
```
Do I need to set some session-timeout properties, if available? The problem happens intermittently (during some hours alerts are sent, and on some days there are hours where no alert goes out).
Any help is greatly appreciated.
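I wonder if the 10s group_interval / repeat_interval are part of the problem: they make Alertmanager open a new SMTP session (and log in) every few seconds, which could be what trips Gmail's "454 Too many login attempts" limit. Something less aggressive might help — the values below are purely illustrative:

```yaml
route:
  group_by: [alertname]
  group_wait: 40s
  group_interval: 5m    # far fewer SMTP logins than with 10s
  repeat_interval: 4h   # re-send still-firing alerts much less often
  receiver: 'default_receiver'
```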
same issue. any help?
alertmanager version: v0.18.0
configuration:

```yaml
global:
  resolve_timeout: 5m
  smtp_smarthost: 'smtp.**.com:25'
  smtp_from: '@xx.cn'
  smtp_auth_username: '@xx.cn'
  smtp_auth_password: '**'
  smtp_hello: '*****'
  smtp_require_tls: true
route:
  group_by: ['alertname']
  group_wait: 30s
  group_interval: 5m
  repeat_interval: 4h
  receiver: katy
receivers:
```
log:
level=debug ts=2020-07-31T06:30:48.244Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-image42f0-mem[9807518][active]
level=debug ts=2020-07-31T06:30:50.841Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-java-mem[f390d67][active]
level=debug ts=2020-07-31T06:30:55.705Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-image42f0-disk[46e5d35][active]
level=debug ts=2020-07-31T06:31:01.656Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-image42f0-mem[8c7b5b4][active]
level=debug ts=2020-07-31T06:31:09.463Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-image42f0-mem[4c9328b][active]
level=debug ts=2020-07-31T06:31:20.436Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=bp-code2837-net_send[643ebad][active]
I met a similar issue. I can see the alerts in Alertmanager but don't get the email notifications. I tried the same SMTP host, sender, and receiver from inside the pod, and sending email works fine there.
My Alertmanager is v0.21.0. Configuration:
```yaml
global:
  resolve_timeout: 1m
  smtp_require_tls: false
  smtp_smarthost: 'xxx:25'
  smtp_from: 'aitest@xxx.com'
route:
  group_by: ['alertname']
  group_wait: 10s
  group_interval: 10s
  repeat_interval: 10s
  receiver: 'email'
receivers:
- name: 'email'
  email_configs:
  - to: 'xxx@xxx.com'
    from: 'aitest@xxx.com'
    smarthost: 'xxx:25'
    send_resolved: true
    require_tls: false
```
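One thing I notice: with group_wait, group_interval and repeat_interval all at 10s, every group is re-flushed and re-notified roughly every 10 seconds, which matches the cadence of the "flushing" / "Notify success" lines in the log below. Something closer to sane values (illustrative only) would be much quieter:

```yaml
route:
  group_by: ['alertname']
  group_wait: 30s      # illustrative
  group_interval: 5m   # illustrative
  repeat_interval: 4h  # illustrative
  receiver: 'email'
```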
log:
level=info ts=2021-12-26T04:38:35.159Z caller=cluster.go:693 component=cluster msg="gossip not settled" polls=0 before=0 now=1 elapsed=2.000912585s
level=debug ts=2021-12-26T04:38:37.160Z caller=cluster.go:690 component=cluster msg="gossip looks settled" elapsed=4.001889738s
level=debug ts=2021-12-26T04:38:39.161Z caller=cluster.go:690 component=cluster msg="gossip looks settled" elapsed=6.003078888s
level=debug ts=2021-12-26T04:38:41.162Z caller=cluster.go:690 component=cluster msg="gossip looks settled" elapsed=8.004116474s
level=info ts=2021-12-26T04:38:43.162Z caller=cluster.go:685 component=cluster msg="gossip settled; proceeding" elapsed=10.004257897s
level=debug ts=2021-12-26T04:40:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[ce44fef][active]
level=debug ts=2021-12-26T04:40:17.558Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[9a380fd][active]
level=debug ts=2021-12-26T04:40:17.558Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[f78bceb][active]
level=debug ts=2021-12-26T04:40:17.558Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[d9f8112][active]
level=debug ts=2021-12-26T04:40:17.558Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[06dee13][active]
level=debug ts=2021-12-26T04:40:17.558Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[ce44fef][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:40:17.603Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:40:27.558Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:40:27.606Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:40:37.559Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:40:47.559Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:40:47.622Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:40:57.560Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:07.560Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:07.608Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:41:17.561Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:27.561Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:27.607Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:41:37.562Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:47.563Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:41:47.626Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:41:57.563Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:07.564Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:07.607Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:42:17.555Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[06dee13][active]
level=debug ts=2021-12-26T04:42:17.555Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[ce44fef][active]
level=debug ts=2021-12-26T04:42:17.555Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[9a380fd][active]
level=debug ts=2021-12-26T04:42:17.555Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[f78bceb][active]
level=debug ts=2021-12-26T04:42:17.555Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[d9f8112][active]
level=debug ts=2021-12-26T04:42:17.564Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:27.565Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:27.612Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:42:37.565Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:47.565Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:42:47.607Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:42:57.567Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:07.567Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:07.627Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:43:17.568Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:27.569Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:27.605Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:43:37.570Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:47.570Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:47.623Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:43:57.571Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:43:58.871Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=APIHighRequestLatency95Percentile[cb7c1c3][active]
level=debug ts=2021-12-26T04:44:07.572Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:07.632Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:08.872Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:44:08.916Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:17.556Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[06dee13][active]
level=debug ts=2021-12-26T04:44:17.556Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[ce44fef][active]
level=debug ts=2021-12-26T04:44:17.556Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[9a380fd][active]
level=debug ts=2021-12-26T04:44:17.556Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[f78bceb][active]
level=debug ts=2021-12-26T04:44:17.556Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[d9f8112][active]
level=debug ts=2021-12-26T04:44:17.572Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:18.872Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:44:27.573Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:27.615Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:28.874Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:44:28.917Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:37.574Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:38.875Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:44:47.574Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:47.613Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:48.876Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:44:48.907Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:44:57.575Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:44:58.877Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:07.576Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:07.612Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:08.877Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:08.916Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:17.588Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:18.878Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:27.588Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:27.627Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:28.879Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:28.920Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:37.588Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:38.879Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:47.589Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:47.644Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:48.881Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:45:48.932Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:45:57.589Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:45:58.871Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=APIHighRequestLatency95Percentile[cb7c1c3][active]
level=debug ts=2021-12-26T04:45:58.881Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:46:07.590Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:46:07.631Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:46:08.882Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
level=debug ts=2021-12-26T04:46:08.933Z caller=notify.go:734 component=dispatcher receiver=email integration=email[0] msg="Notify success" attempts=1
level=debug ts=2021-12-26T04:46:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[ce44fef][active]
level=debug ts=2021-12-26T04:46:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[9a380fd][active]
level=debug ts=2021-12-26T04:46:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[f78bceb][active]
level=debug ts=2021-12-26T04:46:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[d9f8112][active]
level=debug ts=2021-12-26T04:46:17.557Z caller=dispatch.go:138 component=dispatcher msg="Received alert" alert=InstanceDown[06dee13][active]
level=debug ts=2021-12-26T04:46:17.591Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"InstanceDown\"}" msg=flushing alerts="[InstanceDown[06dee13][active] InstanceDown[ce44fef][active] InstanceDown[d9f8112][active] InstanceDown[f78bceb][active] InstanceDown[9a380fd][active]]"
level=debug ts=2021-12-26T04:46:18.883Z caller=dispatch.go:474 component=dispatcher aggrGroup="{}:{alertname=\"APIHighRequestLatency95Percentile\"}" msg=flushing alerts=[APIHighRequestLatency95Percentile[cb7c1c3][active]]
Can anyone help?
@yaliqin did you find a solution to this? I'm hitting the same problem.
Yes. The root cause of my problem was that the sending email address has some implicit requirements in the GCP staging cluster.
Hi,
I want to get an email notification every time an alert fires. At the moment, I receive only a single email for both alerts, right after starting Prometheus. As the Prometheus user interface confirms, both alerts fire continuously. The Alertmanager seems to receive every alert, but doesn't send any emails.
I'm using the latest Docker images of Prometheus (v1.5.0) and Alertmanager (v0.5.1).
Content of the alert.rules config file:
Content of the alertmanager.yml config file:
Logs of the Alertmanager:
Any help is appreciated.