kalel-k opened 1 year ago
Hi Team,
We are also facing the same issue: Alertmanager is not able to send a text-only email to the receiver; it is sent in HTML format, which is causing incident-management problems for us. Could someone review this issue and let us know?
Thanks in advance.
Can you share the output from curl <alertmanager host>/api/v2/status | jq -cr '.config.original'? I don't see the issue with my local setup.
Output below. It's adding html: '{{ template "email.default.html" . }}' even though the config file has html: '':
global:
  resolve_timeout: 5m
  http_config:
    follow_redirects: true
  smtp_from: k8s.alertmanager@me.com
  smtp_hello: localhost
  smtp_smarthost: smtp-forwarder
  smtp_auth_username: <username>
  smtp_auth_password: <secret>
  smtp_require_tls: true
  pagerduty_url: https://events.pagerduty.com/v2/enqueue
  opsgenie_api_url: https://api.opsgenie.com/
  wechat_api_url: https://qyapi.weixin.qq.com/cgi-bin/
  victorops_api_url: https://alert.victorops.com/integrations/generic/20131114/alert/
route:
  receiver: email-receiver
  continue: false
  routes:
  - receiver: ITSM-receiver
    match_re:
      alertname: .+
      severity: warning|critical
    continue: true
  - receiver: email-receiver
    match_re:
      severity: warning|critical
    continue: false
  group_wait: 30s
  group_interval: 5m
  repeat_interval: 2h
receivers:
- name: email-receiver
  email_configs:
  - send_resolved: false
    to: k8s-team@me.com
    from: k8s.alertmanager@me.com
    hello: localhost
    smarthost: smtp-forwarder:25
    auth_username: <username>
    auth_password: <secret>
    headers:
      From: k8s.alertmanager@me.com
      Subject: '{{ template "__subject" . }}'
      To: k8s-team@me.com
    html: '{{ template "email.custom.html" . }}'
    require_tls: true
- name: ITSM-receiver
  email_configs:
  - send_resolved: false
    to: monitoringevents@me.com
    from: k8s.alertmanager@me.com
    hello: localhost
    smarthost: auth-forwarder.me.com:25
    auth_username: <username>
    auth_password: <secret>
    headers:
      From: k8s.alertmanager@me.com
      Subject: '{{ template "__subject" . }}'
      To: monitoringevents@me.com
    html: '{{ template "email.default.html" . }}'
    text: '{{ template "email.custom.txt" . }}'
    require_tls: true
templates:
- /etc/alertmanager/configmaps/alert-template/*.tmpl
I've tried with your config file and with html: "", and I can't see the same issue. Can you double-check the configuration file on disk?
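If it helps while double-checking, something like this compares the file on disk against what Alertmanager actually loaded (assuming amtool is available and the path matches your deployment):

amtool check-config /etc/alertmanager/alertmanager.yml
curl <alertmanager host>/api/v2/status | jq -cr '.config.original'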
Hi,
we have also faced the same issue. Our workaround: define a new blank HTML template and refer to this new template in your config (see the sketch below). Then you should only receive your email with the desired text.
For example:
{{ define "itsmhtml" }}{{ end }}
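A minimal sketch of how the receiver could then reference the blank template (receiver name and template glob are illustrative, based on the config shared above):

receivers:
- name: ITSM-receiver
  email_configs:
  - to: monitoringevents@me.com
    # the blank template renders to nothing, so the HTML body stays empty
    html: '{{ template "itsmhtml" . }}'
    text: '{{ template "email.custom.txt" . }}'
templates:
- /etc/alertmanager/configmaps/alert-template/*.tmpl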
Hi,
Same issue here. Unfortunately, if we specify an empty template, an e-mail is sent out with two bodies: one containing the plain-text content type, and the other, empty, with the HTML content type.
Waiting on some news on this, as it is affecting most of our systems :)
Cheers!
@R0peE feel free to share a complete reproducer for the issue. As I said earlier, I tried something locally but couldn't reproduce.
@R0peE
This is how we solved the issue:
Inside the prometheus operator values file we add the following:
receivers:
- name: 'P1'
  email_configs:
  - to: 'something@something.com'
    send_resolved: false
    text: '{{ template "email.default.html" . }}'
And here we define the email.default.html:
receivers.tmpl: |-
  {{ define "email.default.html" }}
  <!DOCTYPE html>
  <html>
  <body>
  <some text here><br>
  <some other text in a new line><br>
  </body>
  </html>
  {{ end }}
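For what it's worth, this presumably works because templates loaded from the templates: globs are parsed after Alertmanager's built-in ones, so a user-supplied {{ define "email.default.html" }} shadows the shipped default of the same name.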
@simonpasquier sorry, I may have misled you guys. It seems the issue is only present on our Rancher clusters, which use the Rancher monitoring operator (essentially prometheus-operator:v0.59.1).
The issue is: if I set an AlertmanagerConfig CR defining my email receiver with the key/value pair html: "" to disable HTML e-mails completely and just set text with my template, the prometheus operator simply skips this key/value pair, so it is not present in the generated Alertmanager config, which in the end causes the html key to revert to its default value: {{ template "email.default.html" . }} (a minimal CR sketch follows the list below).
After that, the sent e-mail will contain two MIME parts:
- one containing the text-based content: Content-Type: text/plain; charset="UTF-8"
- the other containing the default HTML template content: Content-Type: text/html; charset="UTF-8"
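A minimal sketch of the kind of CR that triggers this (resource and receiver names are illustrative; field names follow the AlertmanagerConfig CRD's camelCase convention):

apiVersion: monitoring.coreos.com/v1alpha1
kind: AlertmanagerConfig
metadata:
  name: itsm-email
spec:
  route:
    receiver: ITSM-receiver
  receivers:
  - name: ITSM-receiver
    emailConfigs:
    - to: monitoringevents@me.com
      # the operator drops this empty value, so the generated config has no
      # html key at all and Alertmanager falls back to email.default.html
      html: ""
      text: '{{ template "email.custom.txt" . }}'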
I think at the code level it is done here:
https://github.com/prometheus/alertmanager/blob/release-0.24/config/notifiers.go#L36
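The mechanism, as far as I can tell: the default is assigned before unmarshalling, so only a key that is absent from the YAML keeps the default. A minimal, self-contained reproduction of that pattern (not the real EmailConfig, just the same defaulting trick):

package main

import (
	"fmt"

	"gopkg.in/yaml.v2"
)

// Toy stand-in for Alertmanager's EmailConfig, reduced to the two
// fields that matter for this issue.
type emailConfig struct {
	HTML string `yaml:"html"`
	Text string `yaml:"text"`
}

var defaultEmailConfig = emailConfig{
	HTML: `{{ template "email.default.html" . }}`,
}

// Same pattern as config/notifiers.go: set defaults first, then let the
// user's YAML override them.
func (c *emailConfig) UnmarshalYAML(unmarshal func(interface{}) error) error {
	*c = defaultEmailConfig
	type plain emailConfig
	return unmarshal((*plain)(c))
}

func main() {
	var explicit, absent emailConfig
	yaml.Unmarshal([]byte(`html: ""`), &explicit) // key present: empty string wins
	yaml.Unmarshal([]byte(`text: hi`), &absent)   // key absent: default survives
	fmt.Printf("explicit html: %q\n", explicit.HTML) // ""
	fmt.Printf("absent html:   %q\n", absent.HTML)   // the default template
}

So Alertmanager itself honors an explicit html: ""; the fallback only kicks in when the operator strips the key entirely.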
I have also tried the suggested method of using a completely empty HTML template, but that does not work either: the e-mail still contains two MIME parts, one of them completely empty rather than removed :)
Also as a hint, maybe this can be improved:
https://github.com/prometheus/alertmanager/blob/main/notify/email/email.go#L298
if len(n.conf.HTML) > 0
could, for example, also check for a sentinel text like "disabled":
if len(n.conf.HTML) > 0 && n.conf.HTML != "disabled"
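A runnable toy of the corrected guard (note the &&; with || the condition would still be true for "disabled", since its length is non-zero):

package main

import "fmt"

// shouldAttachHTML mirrors the suggested check: skip the HTML MIME part
// when the configured template string is empty or set to the sentinel
// value "disabled". (Hypothetical helper, not Alertmanager code.)
func shouldAttachHTML(html string) bool {
	return len(html) > 0 && html != "disabled"
}

func main() {
	for _, h := range []string{"", "disabled", `{{ template "email.default.html" . }}`} {
		fmt.Printf("html=%q -> attach HTML part: %v\n", h, shouldAttachHTML(h))
	}
}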
@R0peE I'd say that your issue is related to the prometheus operator, rather than Alertmanager. Feel free to file an issue at https://github.com/prometheus-operator/prometheus-operator/issues/new?assignees=&labels=kind%2Fbug&template=bug.md
Also as a hint, maybe this can be improved:
https://github.com/prometheus/alertmanager/blob/main/notify/email/email.go#L298
if len(n.conf.HTML) > 0
could, for example, also check for a sentinel text like "disabled":
if len(n.conf.HTML) > 0 && n.conf.HTML != "disabled"
Adding a sentinel value would be a breaking change.
For the record, the prometheus operator issue is tracked here: https://github.com/prometheus-operator/prometheus-operator/issues/5421
What did you do?
Configured email_configs in one receiver to ignore HTML email by using html: '' so it would only use the text: '{{ template "email.custom.txt" . }}' config.

What did you expect to see?
Alertmanager not sending HTML emails for that receiver.

What did you see instead? Under which circumstances?
Alertmanager auto-added html: '{{ template "email.default.html" . }}', replacing html: '', and sent an HTML email.

Environment: Kubernetes

System information (uname -srm): Linux 5.4.0-126-generic x86_64

Alertmanager version: v0.23.0 and v0.24.0

Prometheus version: v2.33.4

Alertmanager configuration file:
(see the full config dump shared earlier in the thread)