johnsusek / praeco

Elasticsearch alerting made simple.
GNU General Public License v3.0

Add Dingtalk support #221

Closed kakaNo1 closed 3 years ago

kakaNo1 commented 4 years ago

I noticed that the alert options don't include a webhook class for this. Could you add DingTalk support? Something similar to: https://github.com/xuyaoqiang/elastalert-dingtalk-plugin


nsano-rururu commented 4 years ago

It's good that you've made this request, but there are a few points of concern:

- Is the plugin still being developed? It was last updated in 2017.
- Judging from its requirements.txt, it pulls in an old ElastAlert, a version that is not compatible with Elasticsearch 7.
- It is doubtful whether it works with Python 3.

Before embedding it in ElastAlert Server, it needs to be verified with Python 3.6 or later and the latest ElastAlert 0.2.4.

requirements.txt

elastalert==0.1.20
elasticsearch>=5.0.0
pyOpenSSL==16.2.0
requests==2.18.1
setuptools>=11.3
nsano-rururu commented 4 years ago

Someone got it working with ElastAlert 0.2.1 and Python 3.6, so it should be doable by referring to this:

filebeat 7.5.1 + elasticsearch 7.5.1 + elastAlert 0.2.1 + elastalert-dingtalk-plugin for log monitoring with DingTalk alerts: https://blog.csdn.net/a807719447/article/details/103929104

nsano-rururu commented 4 years ago

@johnsusek

Sorry to bother you when you're busy, but could you please add the "enhancement" label?

nsano-rururu commented 4 years ago

@kakaNo1

Only the Docker version would be supported. The idea is to prepare the DingTalk-related source code for ElastAlert itself and swap it in via the ElastAlert Server Dockerfile. On the Praeco side, only a DingTalk checkbox would be shown on the screen, and the other settings (dingtalk_webhook, dingtalk_msgtype) would go in BaseRule.config.

dingtalk_webhook: ""

Is it okay to set this in BaseRule.config, like Slack, and share the setting across alert rules?

dingtalk_msgtype: "text"

If you know, please tell me: is dingtalk_msgtype limited to "text", or are there other types?

kakaNo1 commented 4 years ago

> Is dingtalk_msgtype limited to "text", or are there other types?

The msgtype options are:

1. text
2. link
3. markdown
4. ActionCard
5. FeedCard

The first three are the most commonly used.

Detailed documentation: https://ding-doc.dingtalk.com/doc#/serverapi2/qf2nxq/d535db33
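For reference, here is a minimal sketch (my own, not taken from the plugin) of what the webhook payloads for the three common msgtypes look like when posted with requests; the access_token in the URL is a placeholder from the DingTalk group robot settings:

import requests

# Placeholder robot webhook; the access_token comes from the DingTalk group robot settings.
WEBHOOK = "https://oapi.dingtalk.com/robot/send?access_token=xxxxxxxx"
HEADERS = {"Content-Type": "application/json"}

payloads = [
    # msgtype "text": plain content
    {"msgtype": "text", "text": {"content": "ElastAlert: test message"}},
    # msgtype "link": title/text plus a picture URL and a target URL
    {"msgtype": "link", "link": {"title": "ElastAlert", "text": "test message",
                                 "picUrl": "", "messageUrl": "https://example.com"}},
    # msgtype "markdown": title plus markdown-formatted text
    {"msgtype": "markdown", "markdown": {"title": "ElastAlert", "text": "#### test message"}},
]

for payload in payloads:
    requests.post(WEBHOOK, json=payload, headers=HEADERS)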

nsano-rururu commented 4 years ago

@kakaNo1

xuyaoqiang/elastalert-dingtalk-plugin seems to support only msgtype text. I think it would be better to file a request for DingTalk support on the ElastAlert issue tracker. Praeco is just a web UI for ElastAlert, so it is preferable to support only alert methods that exist in ElastAlert itself, if possible.

https://github.com/xuyaoqiang/elastalert-dingtalk-plugin/blob/master/elastalert_modules/dingtalk_alert.py line 33-47

    def alert(self, matches):
        headers = {
            "Content-Type": "application/json",
            "Accept": "application/json;charset=utf-8"
        }
        body = self.create_alert_body(matches)
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "text": {
                "content": body
            },
            "at": {
                "isAtAll":False
            }
        }
kakaNo1 commented 4 years ago

This author only provides a simple text template, but DingTalk itself supports the message types I replied with above. You would need to change the Python template he provides, so there is some uncertainty and it may be more complicated. Here is an example for you to adapt:

    def alert(self, matches):
        headers = {
            "Content-Type": "application/json",
            "Accept": "application/json;charset=utf-8"
        }
        time = matches[0]['logdate']
        # index and message would come from the matched document; the exact field names depend on your data
        index = matches[0].get('_index', '')
        message = matches[0].get('message', '')
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "markdown": {
               "title": "xxxx",
               "text": "#### **xxxx**\n" +
                       "> **xxx**: {} \n\n".format(index) +
                       "> **xxx**: {} \n\n".format(message) +
                       "> ##### {} published [details](xxxx)@1*******9@1*******9\n".format(time)
            },
            "at": {
                "atMobiles": [
                   "1*******9",
                   "1*******9"
                ],
                "isAtAll": False
            }
        }
kakaNo1 commented 4 years ago

If this is too much trouble, could it just default to text? Other types could still be supported, but the user would need to change the Python template themselves. Is that feasible?

nsano-rururu commented 4 years ago

@kakaNo1

If you want to support text, link, and markdown, I think the implementation would look like the following.

    def alert(self, matches):
        body = self.create_alert_body(matches)

        body = self.format_body(body)
        headers = {'content-type': 'application/json'}
        proxies = {'https': self.dingtalk_proxy} if self.dingtalk_proxy else None
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "text": {
                "content": body
            },
            "link": {
                "text": body,
                "title": self.dingtalk_title,
                "picUrl": self.dingtalk_picurl,
                "messageUrl": self.dingtalk_messageurl
            },
            "markdown": {
                "title": self.dingtalk_title,
                "text": body
            },
            "at": {
                "atMobiles": self.dingtalk_username,
                "isAtAll": self.dingtalk_isAtAll
            }
        }

Part of docker-compose.yml

If you want to change the Python template, you can rewrite the relevant part of the file and mount it when starting with docker-compose so that it overwrites the file in the docker image (./elastalert/patch/alerts.py).

For example:

  elastalert:
    container_name: elastalert
    image: praecoapp/elastalert-server:latest
    ports:
      - 3030:3030
      - 3333:3333
    restart: always
    volumes:
      - ./elastalert/config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./elastalert/config/api.config.json:/opt/elastalert-server/config/config.json
      - ./elastalert/rules:/opt/elastalert/rules
      - ./elastalert/rule_templates:/opt/elastalert/rule_templates
      - ./elastalert/patch/alerts.py:/opt/elastalert/elastalert/alerts.py
nsano-rururu commented 4 years ago

@kakaNo1

dingtalk_webhook_url and dingtalk_msgtype are required. Proposed settings for text, link, and markdown:

- dingtalk_webhook_url
- dingtalk_username
- dingtalk_proxy
- dingtalk_msgtype
- dingtalk_isAtAll
- dingtalk_title
- dingtalk_picurl
- dingtalk_messageurl

nsano-rururu commented 4 years ago

@kakaNo1

To support this in ElastAlert itself, the following code would need to be added. I have not verified that it works yet.

elastalert/alerts.py (https://github.com/johnsusek/elastalert-server/tree/master/patches, alerts.py)

Add a DingtalkAlerter class:


class DingtalkAlerter(Alerter):
     """ Creates a Dingtalk room message for each alert """
    required_options = frozenset(['dingtalk_webhook_url', 'dingtalk_msgtype'])

    def __init__(self, rule):
        super(DingtalkAlerter, self).__init__(rule)
        self.dingtalk_webhook_url = self.rule['dingtalk_webhook_url']
        if isinstance(self.dingtalk_webhook_url, str):
            self.dingtalk_webhook_url = [self.dingtalk_webhook_url]
        self.dingtalk_username = self.rule.get('dingtalk_username', '')
        self.dingtalk_proxy = self.rule.get('dingtalk_proxy', None)
        self.dingtalk_msgtype = self.rule.get('dingtalk_msgtype', '')
        self.dingtalk_isAtAll = self.rule.get('dingtalk_isAtAll', 'false')
        self.dingtalk_title = self.rule.get('dingtalk_title', '')
        self.dingtalk_picurl = self.rule.get('dingtalk_picurl', '')
        self.dingtalk_messageurl = self.rule.get('dingtalk_messageurl', '')

    def format_body(self, body):
        body = body.encode('UTF-8')
        return body

    def alert(self, matches):
        body = self.create_alert_body(matches)

        body = self.format_body(body)
        headers = {'content-type': 'application/json'}
        proxies = {'https': self.dingtalk_proxy} if self.dingtalk_proxy else None
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "text": {
                "content": body
            },
            "link": {
                "text": body,
                "title": self.dingtalk_title,
                "picUrl": self.dingtalk_picurl,
                "messageUrl": self.dingtalk_messageurl
            },
            "markdown": {
                "title": self.dingtalk_title,
                "text": body
            },
            "at": {
                "atMobiles": self.dingtalk_username,
                "isAtAll": self.dingtalk_isAtAll
            }
        }

        for url in self.dingtalk_webhook_url:
            try:
                response = requests.post(url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies)
                response.raise_for_status()
            except RequestException as e:
                raise EAException("Error posting to Dingtalk: %s" % e)
        elastalert_logger.info("Alert sent to Dingtalk")

    def get_info(self):
        return {'type': 'dingtalkalerter',
                'dingtalk_webhook_url': self.dingtalk_webhook_url} 
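As a side note, a possible refinement (my own sketch, not part of the code above) would be to build only the payload section that matches dingtalk_msgtype inside alert(), instead of always sending the text, link, and markdown keys together:

        # Sketch: include only the section matching dingtalk_msgtype
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "at": {
                "atMobiles": self.dingtalk_username,
                "isAtAll": self.dingtalk_isAtAll
            }
        }
        if self.dingtalk_msgtype == 'text':
            payload['text'] = {'content': body}
        elif self.dingtalk_msgtype == 'link':
            payload['link'] = {
                'text': body,
                'title': self.dingtalk_title,
                'picUrl': self.dingtalk_picurl,
                'messageUrl': self.dingtalk_messageurl
            }
        elif self.dingtalk_msgtype == 'markdown':
            payload['markdown'] = {'title': self.dingtalk_title, 'text': body}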

elastalert/loaders.py

https://github.com/johnsusek/elastalert-server/tree/master/patches loaders.py

Change 'zabbix': ZabbixAlerter to 'zabbix': ZabbixAlerter, (adding a trailing comma) and add 'dingtalkalerter': alerts.DingtalkAlerter:

    alerts_mapping = {
        'email': alerts.EmailAlerter,
        'jira': alerts.JiraAlerter,
        'opsgenie': OpsGenieAlerter,
        'stomp': alerts.StompAlerter,
        'debug': alerts.DebugAlerter,
        'command': alerts.CommandAlerter,
        'sns': alerts.SnsAlerter,
        'hipchat': alerts.HipChatAlerter,
        'stride': alerts.StrideAlerter,
        'ms_teams': alerts.MsTeamsAlerter,
        'slack': alerts.SlackAlerter,
        'mattermost': alerts.MattermostAlerter,
        'pagerduty': alerts.PagerDutyAlerter,
        'exotel': alerts.ExotelAlerter,
        'twilio': alerts.TwilioAlerter,
        'victorops': alerts.VictorOpsAlerter,
        'telegram': alerts.TelegramAlerter,
        'googlechat': alerts.GoogleChatAlerter,
        'gitter': alerts.GitterAlerter,
        'servicenow': alerts.ServiceNowAlerter,
        'alerta': alerts.AlertaAlerter,
        'post': alerts.HTTPPostAlerter,
        'hivealerter': alerts.HiveAlerter,
        'linenotify': alerts.LineNotifyAlerter,
        'zabbix': ZabbixAlerter,
        'dingtalkalerter': alerts.DingtalkAlerter
    }

Part of docker-compose.yml

If you want to change the Python template, you can rewrite the relevant parts of the files and mount them when starting with docker-compose so that they overwrite the files in the docker image (./elastalert/patch/alerts.py and ./elastalert/patch/loaders.py).

For example:

  elastalert:
    container_name: elastalert
    image: praecoapp/elastalert-server:latest
    ports:
      - 3030:3030
      - 3333:3333
    restart: always
    volumes:
      - ./elastalert/config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./elastalert/config/api.config.json:/opt/elastalert-server/config/config.json
      - ./elastalert/rules:/opt/elastalert/rules
      - ./elastalert/rule_templates:/opt/elastalert/rule_templates
      - ./elastalert/patch/loaders.py:/opt/elastalert/elastalert/loaders.py
      - ./elastalert/patch/alerts.py:/opt/elastalert/elastalert/alerts.py
kakaNo1 commented 4 years ago

Ok, thanks for your prompt reply. I will try it next

nsano-rururu commented 4 years ago

Only dingtalk_webhook_url and dingtalk_msgtype are required; no other settings are necessary. Maybe "dingtalk" would have been a better name than "dingtalkalerter", which is rather long.

Example:

alert:
- "dingtalkalerter"
dingtalk_webhook_url: ''
dingtalk_username: ''
dingtalk_proxy: ''
dingtalk_msgtype: 'text'
dingtalk_isAtAll: 'false'
dingtalk_title: ''
dingtalk_picurl: ''
dingtalk_messageurl: ''
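For context, the string listed under alert: is what gets looked up in the alerts_mapping added to loaders.py, which is why the rule name must match the mapping key exactly. A simplified, self-contained sketch of that resolution (not the actual loader code):

# Simplified sketch (not the actual ElastAlert loader) of how the name under
# `alert:` in a rule resolves to the class registered in alerts_mapping.
class DingtalkAlerter:
    def __init__(self, rule):
        self.rule = rule

alerts_mapping = {'dingtalkalerter': DingtalkAlerter}

rule = {
    'alert': ['dingtalkalerter'],
    'dingtalk_webhook_url': '',
    'dingtalk_msgtype': 'text',
}

alerters = [alerts_mapping[name](rule) for name in rule['alert']]
print(alerters)  # one DingtalkAlerter instance constructed with the rule dict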
kakaNo1 commented 4 years ago

I tried it today and found an error

version: '3'

services:
  elastalert:
    image: 'praecoapp/elastalert-server'
    ports:
      - 3030:3030
      - 3333:3333
    volumes:
      - ./config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./config/api.config.json:/opt/elastalert-server/config/config.json
      - ./rules:/opt/elastalert/rules
      - ./rule_templates:/opt/elastalert/rule_templates
      - ./dingtalk_alert.py:/opt/elastalert/elastalert/alerts.py
      - ./loaders.py:/opt/elastalert/elastalert/loaders.py
    extra_hosts:
      - 'elasticsearch:172.16.3.188'

  webapp:
    image: 'praecoapp/praeco'
    ports:
      - 8080:8080
    volumes:
      - ./public/praeco.config.json:/var/www/html/praeco.config.json
      - ./nginx_config/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx_config/default.conf:/etc/nginx/conf.d/default.conf

=====

[root@registry praeco]# docker-compose up
Starting praeco_elastalert_1 ... done
Starting praeco_webapp_1     ... done
Attaching to praeco_webapp_1, praeco_elastalert_1
elastalert_1  |
elastalert_1  | > @bitsensor/elastalert@0.0.14 start /opt/elastalert-server
elastalert_1  | > sh ./scripts/start.sh
elastalert_1  |
elastalert_1  | 03:45:06.428Z  INFO elastalert-server: Config:  No config.dev.json file was found in /opt/elastalert-server/config/config.dev.json.
elastalert_1  | 03:45:06.430Z  INFO elastalert-server: Config:  Proceeding to look for normal config file.
elastalert_1  | 03:45:06.440Z  INFO elastalert-server: Config:  A config file was found in /opt/elastalert-server/config/config.json. Using that config.
elastalert_1  | 03:45:06.507Z  INFO elastalert-server: Router:  Listening for GET request on /.
elastalert_1  | 03:45:06.507Z  INFO elastalert-server: Router:  Listening for GET request on /status.
elastalert_1  | 03:45:06.509Z  INFO elastalert-server: Router:  Listening for GET request on /status/errors.
elastalert_1  | 03:45:06.509Z  INFO elastalert-server: Router:  Listening for GET request on /rules.
elastalert_1  | 03:45:06.511Z  INFO elastalert-server: Router:  Listening for GET request on /rules/:id*.
elastalert_1  | 03:45:06.511Z  INFO elastalert-server: Router:  Listening for POST request on /rules/:id*.
elastalert_1  | 03:45:06.511Z  INFO elastalert-server: Router:  Listening for DELETE request on /rules/:id*.
elastalert_1  | 03:45:06.512Z  INFO elastalert-server: Router:  Listening for GET request on /templates.
elastalert_1  | 03:45:06.512Z  INFO elastalert-server: Router:  Listening for GET request on /templates/:id*.
elastalert_1  | 03:45:06.512Z  INFO elastalert-server: Router:  Listening for POST request on /templates/:id*.
elastalert_1  | 03:45:06.512Z  INFO elastalert-server: Router:  Listening for DELETE request on /templates/:id*.
elastalert_1  | 03:45:06.513Z  INFO elastalert-server: Router:  Listening for PUT request on /folders/:type/:path*.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for DELETE request on /folders/:type/:path*.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for POST request on /test.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for POST request on /silence/:path*.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for GET request on /config.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for POST request on /config.
elastalert_1  | 03:45:06.517Z  INFO elastalert-server: Router:  Listening for POST request on /download.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /metadata/elastalert.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /metadata/elastalert_status.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /metadata/silence.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /metadata/elastalert_error.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /metadata/past_elastalert.
elastalert_1  | 03:45:06.518Z  INFO elastalert-server: Router:  Listening for GET request on /indices.
elastalert_1  | 03:45:06.519Z  INFO elastalert-server: Router:  Listening for GET request on /mapping/:index.
elastalert_1  | 03:45:06.519Z  INFO elastalert-server: Router:  Listening for POST request on /search/:index.
elastalert_1  | 03:45:06.519Z  INFO elastalert-server: Router:  Listening for GET request on /config.
elastalert_1  | 03:45:06.527Z  INFO elastalert-server: ProcessController:  Starting ElastAlert
elastalert_1  | 03:45:06.527Z  INFO elastalert-server: ProcessController:  Creating index
elastalert_1  | 03:45:06.926Z  INFO elastalert-server:
elastalert_1  |     ProcessController:  Elastic Version: 7.2.0
elastalert_1  |     Reading Elastic 6 index mappings:
elastalert_1  |     Reading index mapping 'es_mappings/6/silence.json'
elastalert_1  |     Reading index mapping 'es_mappings/6/elastalert_status.json'
elastalert_1  |     Reading index mapping 'es_mappings/6/elastalert.json'
elastalert_1  |     Reading index mapping 'es_mappings/6/past_elastalert.json'
elastalert_1  |     Reading index mapping 'es_mappings/6/elastalert_error.json'
elastalert_1  |     Index praeco_elastalert_status already exists. Skipping index creation.
elastalert_1  |
elastalert_1  | 03:45:06.927Z  INFO elastalert-server: ProcessController:  Index create exited with code 0
elastalert_1  | 03:45:06.927Z  INFO elastalert-server: ProcessController:  Starting elastalert with arguments [none]
elastalert_1  | 03:45:06.940Z  INFO elastalert-server: ProcessController:  Started Elastalert (PID: 37)
elastalert_1  | 03:45:06.943Z  INFO elastalert-server: Server:  Server listening on port 3030
elastalert_1  | 03:45:06.944Z  INFO elastalert-server: Server:  Websocket listening on port 3333
elastalert_1  | 03:45:06.946Z  INFO elastalert-server: Server:  Server started
elastalert_1  | 03:45:07.447Z ERROR elastalert-server:
elastalert_1  |     ProcessController:  Traceback (most recent call last):
elastalert_1  |       File "/usr/lib/python3.8/runpy.py", line 193, in _run_module_as_main
elastalert_1  |         return _run_code(code, main_globals, None,
elastalert_1  |       File "/usr/lib/python3.8/runpy.py", line 86, in _run_code
elastalert_1  |         exec(code, run_globals)
elastalert_1  |       File "/opt/elastalert/elastalert/elastalert.py", line 30, in <module>
elastalert_1  |         from .alerts import DebugAlerter
elastalert_1  |       File "/opt/elastalert/elastalert/alerts.py", line 5, in <module>
elastalert_1  |         reload(sys)
elastalert_1  |     NameError: name 'reload' is not defined
elastalert_1  |
elastalert_1  | 03:45:07.487Z ERROR elastalert-server: ProcessController:  ElastAlert exited with code 1
elastalert_1  | 03:45:07.487Z  INFO elastalert-server: Server:  Stopping server
elastalert_1  | 03:45:07.488Z  INFO elastalert-server: ProcessController:  ElastAlert is not running
elastalert_1  | 03:45:07.488Z  INFO elastalert-server: Server:  Server stopped. Bye!
praeco_elastalert_1 exited with code 0
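A note on that traceback: it fails at import time, on line 5 of the mounted alerts.py. Judging from the reload(sys) call, the plugin file appears to start with a Python-2-only prologue along these lines (my assumption), which no longer works on Python 3 and would have to be removed:

# Python-2-era setup of the kind that triggers the NameError above;
# reload() is no longer a builtin and sys.setdefaultencoding() was removed in Python 3.
import sys
reload(sys)                       # NameError: name 'reload' is not defined on Python 3
sys.setdefaultencoding('utf-8')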
nsano-rururu commented 4 years ago

@kakaNo1

It seems that there was one blank line between "return body" and "def alert(self, matches):".

before

    def format_body(self, body):
        body = body.encode('UTF-8')
        return body

    def alert(self, matches):
        body = self.create_alert_body(matches)

after

    def format_body(self, body):
        body = body.encode('UTF-8')
        return body

    def alert(self, matches):
        body = self.create_alert_body(matches)
kakaNo1 commented 4 years ago

There is a problem when mounting this path: - ./dingtalk_alert.py:/opt/elastalert/elastalert/alerts.py. Can the name alerts.py be changed? I got an error from the line "from elastalert.alerts import Alerter":

elastalert_1  |     ImportError: cannot import name 'Alerter' from partially initialized module 'elastalert.alerts' (most likely due to a circular import) (/opt/elastalert/elastalert/alerts.py)

I think the error is caused by the mounted file having the same name as the module it imports from (elastalert.alerts ends up importing itself), so I want to change the file name.

kakaNo1 commented 4 years ago

If I need to change the name of alerts.py in the container, what other changes do I need to make?

kakaNo1 commented 4 years ago

I set up docker-compose.yml

- ./ding1.py:/opt/elastalert/elastalert/ding.py

and set up loaders.py

    alerts_mapping = {
        'email': alerts.EmailAlerter,
        'jira': alerts.JiraAlerter,
        'opsgenie': OpsGenieAlerter,
        'stomp': alerts.StompAlerter,
        'debug': alerts.DebugAlerter,
        'command': alerts.CommandAlerter,
        'sns': alerts.SnsAlerter,
        'hipchat': alerts.HipChatAlerter,
        'stride': alerts.StrideAlerter,
        'ms_teams': alerts.MsTeamsAlerter,
        'slack': alerts.SlackAlerter,
        'mattermost': alerts.MattermostAlerter,
        'pagerduty': alerts.PagerDutyAlerter,
        'exotel': alerts.ExotelAlerter,
        'twilio': alerts.TwilioAlerter,
        'victorops': alerts.VictorOpsAlerter,
        'telegram': alerts.TelegramAlerter,
        'googlechat': alerts.GoogleChatAlerter,
        'gitter': alerts.GitterAlerter,
        'servicenow': alerts.ServiceNowAlerter,
        'alerta': alerts.AlertaAlerter,
        'post': alerts.HTTPPostAlerter,
        'hivealerter': alerts.HiveAlerter,
        'linenotify': alerts.LineNotifyAlerter,
        'zabbix': ZabbixAlerter,
        'dingtalkalerter': ding.DingtalkAlerter
    }

An error occurred at runtime

elastalert_1  | 02:31:55.949Z ERROR elastalert-server:
elastalert_1  |     ProcessController:  Traceback (most recent call last):
elastalert_1  |       File "/usr/lib/python3.8/runpy.py", line 193, in _run_module_as_main
elastalert_1  |         return _run_code(code, main_globals, None,
elastalert_1  |       File "/usr/lib/python3.8/runpy.py", line 86, in _run_code
elastalert_1  |         exec(code, run_globals)
elastalert_1  |       File "/opt/elastalert/elastalert/elastalert.py", line 31, in <module>
elastalert_1  |         from .config import load_conf
elastalert_1  |       File "/opt/elastalert/elastalert/config.py", line 9, in <module>
elastalert_1  |         from . import loaders
elastalert_1  |       File "/opt/elastalert/elastalert/loaders.py", line 31, in <module>
elastalert_1  |         class RulesLoader(object):
elastalert_1  |       File "/opt/elastalert/elastalert/loaders.py", line 84, in RulesLoader
elastalert_1  |         'dingtalkalerter':ding.DingtalkAlerter
elastalert_1  |     NameError: name 'ding' is not defined
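For what it's worth, that NameError suggests loaders.py never imports the new module; a minimal sketch of the missing piece (assuming the file really is mounted as elastalert/ding.py as in the compose snippet above):

# Minimal sketch (my assumption): loaders.py has to import the mounted module
# before alerts_mapping can reference ding.DingtalkAlerter.
from . import alerts
from . import ding  # the file mounted at /opt/elastalert/elastalert/ding.py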
nsano-rururu commented 4 years ago

I'm checking the behavior since an error occurred.

nsano-rururu commented 4 years ago

[screenshot]

Part of docker-compose.yml

For example:

  elastalert:
    container_name: elastalert
    image: praecoapp/elastalert-server:latest
    ports:
      - 3030:3030
      - 3333:3333
    restart: always
    volumes:
      - ./elastalert/config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./elastalert/config/api.config.json:/opt/elastalert-server/config/config.json
      - ./elastalert/rules:/opt/elastalert/rules
      - ./elastalert/rule_templates:/opt/elastalert/rule_templates
      - ./elastalert/patch/loaders.py:/opt/elastalert/elastalert/loaders.py
      - ./elastalert/patch/alerts.py:/opt/elastalert/elastalert/alerts.py

Alert rule example:

alert:
  - dingtalk
dingtalk_webhook: 'https://xxxxxxx'
dingtalk_msgtype: 'text'

./elastalert/patch/alerts.py

Add a DingTalkAlerter class (from https://github.com/xuyaoqiang/elastalert-dingtalk-plugin):

# -*- coding: utf-8 -*-
import copy
import datetime
import json
import logging
import os
import re
import subprocess
import sys
import time
import uuid
import warnings
from email.mime.text import MIMEText
from email.utils import formatdate
from html.parser import HTMLParser
from smtplib import SMTP
from smtplib import SMTP_SSL
from smtplib import SMTPAuthenticationError
from smtplib import SMTPException
from socket import error

import boto3
import requests
import stomp
from exotel import Exotel
from jira.client import JIRA
from jira.exceptions import JIRAError
from requests.auth import HTTPProxyAuth
from requests.exceptions import RequestException
from staticconf.loader import yaml_loader
from texttable import Texttable
from twilio.base.exceptions import TwilioRestException
from twilio.rest import Client as TwilioClient

from .util import EAException
from .util import elastalert_logger
from .util import lookup_es_key
from .util import pretty_ts
from .util import resolve_string
from .util import ts_now
from .util import ts_to_dt

class DateTimeEncoder(json.JSONEncoder):
    def default(self, obj):
        if hasattr(obj, 'isoformat'):
            return obj.isoformat()
        else:
            return json.JSONEncoder.default(self, obj)

class BasicMatchString(object):
    """ Creates a string containing fields in match for the given rule. """

    def __init__(self, rule, match):
        self.rule = rule
        self.match = match

    def _ensure_new_line(self):
        while self.text[-2:] != '\n\n':
            self.text += '\n'

    def _add_custom_alert_text(self):
        missing = self.rule.get('alert_missing_value', '<MISSING VALUE>')
        alert_text = str(self.rule.get('alert_text', ''))
        if 'alert_text_args' in self.rule:
            alert_text_args = self.rule.get('alert_text_args')
            alert_text_values = [lookup_es_key(self.match, arg) for arg in alert_text_args]

            # Support referencing other top-level rule properties
            # This technically may not work if there is a top-level rule property with the same name
            # as an es result key, since it would have been matched in the lookup_es_key call above
            for i, text_value in enumerate(alert_text_values):
                if text_value is None:
                    alert_value = self.rule.get(alert_text_args[i])
                    if alert_value:
                        alert_text_values[i] = alert_value

            alert_text_values = [missing if val is None else val for val in alert_text_values]
            alert_text = alert_text.format(*alert_text_values)
        elif 'alert_text_kw' in self.rule:
            kw = {}
            for name, kw_name in list(self.rule.get('alert_text_kw').items()):
                val = lookup_es_key(self.match, name)

                # Support referencing other top-level rule properties
                # This technically may not work if there is a top-level rule property with the same name
                # as an es result key, since it would have been matched in the lookup_es_key call above
                if val is None:
                    val = self.rule.get(name)

                kw[kw_name] = missing if val is None else val
            alert_text = alert_text.format(**kw)

        self.text += alert_text

    def _add_rule_text(self):
        self.text += self.rule['type'].get_match_str(self.match)

    def _add_top_counts(self):
        for key, counts in list(self.match.items()):
            if key.startswith('top_events_'):
                self.text += '%s:\n' % (key[11:])
                top_events = list(counts.items())

                if not top_events:
                    self.text += 'No events found.\n'
                else:
                    top_events.sort(key=lambda x: x[1], reverse=True)
                    for term, count in top_events:
                        self.text += '%s: %s\n' % (term, count)

                self.text += '\n'

    def _add_match_items(self):
        match_items = list(self.match.items())
        match_items.sort(key=lambda x: x[0])
        for key, value in match_items:
            if key.startswith('top_events_'):
                continue
            value_str = str(value)
            value_str.replace('\\n', '\n')
            if type(value) in [list, dict]:
                try:
                    value_str = self._pretty_print_as_json(value)
                except TypeError:
                    # Non serializable object, fallback to str
                    pass
            self.text += '%s: %s\n' % (key, value_str)

    def _pretty_print_as_json(self, blob):
        try:
            return json.dumps(blob, cls=DateTimeEncoder, sort_keys=True, indent=4, ensure_ascii=False)
        except UnicodeDecodeError:
            # This blob contains non-unicode, so lets pretend it's Latin-1 to show something
            return json.dumps(blob, cls=DateTimeEncoder, sort_keys=True, indent=4, encoding='Latin-1', ensure_ascii=False)

    def __str__(self):
        self.text = ''
        if 'alert_text' not in self.rule:
            self.text += self.rule['name'] + '\n\n'

        self._add_custom_alert_text()
        self._ensure_new_line()
        if self.rule.get('alert_text_type') != 'alert_text_only':
            self._add_rule_text()
            self._ensure_new_line()
            if self.rule.get('top_count_keys'):
                self._add_top_counts()
            if self.rule.get('alert_text_type') != 'exclude_fields':
                self._add_match_items()
        return self.text

class JiraFormattedMatchString(BasicMatchString):
    def _add_match_items(self):
        match_items = dict([(x, y) for x, y in list(self.match.items()) if not x.startswith('top_events_')])
        json_blob = self._pretty_print_as_json(match_items)
        preformatted_text = '{{code}}{0}{{code}}'.format(json_blob)
        self.text += preformatted_text

class Alerter(object):
    """ Base class for types of alerts.

    :param rule: The rule configuration.
    """
    required_options = frozenset([])

    def __init__(self, rule):
        self.rule = rule
        # pipeline object is created by ElastAlerter.send_alert()
        # and attached to each alerters used by a rule before calling alert()
        self.pipeline = None
        self.resolve_rule_references(self.rule)

    def resolve_rule_references(self, root):
        # Support referencing other top-level rule properties to avoid redundant copy/paste
        if type(root) == list:
            # Make a copy since we may be modifying the contents of the structure we're walking
            for i, item in enumerate(copy.copy(root)):
                if type(item) == dict or type(item) == list:
                    self.resolve_rule_references(root[i])
                else:
                    root[i] = self.resolve_rule_reference(item)
        elif type(root) == dict:
            # Make a copy since we may be modifying the contents of the structure we're walking
            for key, value in root.copy().items():
                if type(value) == dict or type(value) == list:
                    self.resolve_rule_references(root[key])
                else:
                    root[key] = self.resolve_rule_reference(value)

    def resolve_rule_reference(self, value):
        strValue = str(value)
        if strValue.startswith('$') and strValue.endswith('$') and strValue[1:-1] in self.rule:
            if type(value) == int:
                return int(self.rule[strValue[1:-1]])
            else:
                return self.rule[strValue[1:-1]]
        else:
            return value

    def alert(self, match):
        """ Send an alert. Match is a dictionary of information about the alert.

        :param match: A dictionary of relevant information to the alert.
        """
        raise NotImplementedError()

    def get_info(self):
        """ Returns a dictionary of data related to this alert. At minimum, this should contain
        a field type corresponding to the type of Alerter. """
        return {'type': 'Unknown'}

    def create_title(self, matches):
        """ Creates custom alert title to be used, e.g. as an e-mail subject or JIRA issue summary.

        :param matches: A list of dictionaries of relevant information to the alert.
        """
        if 'alert_subject' in self.rule:
            return self.create_custom_title(matches)

        return self.create_default_title(matches)

    def create_custom_title(self, matches):
        alert_subject = str(self.rule['alert_subject'])
        alert_subject_max_len = int(self.rule.get('alert_subject_max_len', 2048))

        if 'alert_subject_args' in self.rule:
            alert_subject_args = self.rule['alert_subject_args']
            alert_subject_values = [lookup_es_key(matches[0], arg) for arg in alert_subject_args]

            # Support referencing other top-level rule properties
            # This technically may not work if there is a top-level rule property with the same name
            # as an es result key, since it would have been matched in the lookup_es_key call above
            for i, subject_value in enumerate(alert_subject_values):
                if subject_value is None:
                    alert_value = self.rule.get(alert_subject_args[i])
                    if alert_value:
                        alert_subject_values[i] = alert_value

            missing = self.rule.get('alert_missing_value', '<MISSING VALUE>')
            alert_subject_values = [missing if val is None else val for val in alert_subject_values]
            alert_subject = alert_subject.format(*alert_subject_values)

        if len(alert_subject) > alert_subject_max_len:
            alert_subject = alert_subject[:alert_subject_max_len]

        return alert_subject

    def create_alert_body(self, matches):
        body = self.get_aggregation_summary_text(matches)
        if self.rule.get('alert_text_type') != 'aggregation_summary_only':
            for match in matches:
                body += str(BasicMatchString(self.rule, match))
                # Separate text of aggregated alerts with dashes
                if len(matches) > 1:
                    body += '\n----------------------------------------\n'
        return body

    def get_aggregation_summary_text__maximum_width(self):
        """Get maximum width allowed for summary text."""
        return 80

    def get_aggregation_summary_text(self, matches):
        text = ''
        if 'aggregation' in self.rule and 'summary_table_fields' in self.rule:
            text = self.rule.get('summary_prefix', '')
            summary_table_fields = self.rule['summary_table_fields']
            if not isinstance(summary_table_fields, list):
                summary_table_fields = [summary_table_fields]
            # Include a count aggregation so that we can see at a glance how many of each aggregation_key were encountered
            summary_table_fields_with_count = summary_table_fields + ['count']
            text += "Aggregation resulted in the following data for summary_table_fields ==> {0}:\n\n".format(
                summary_table_fields_with_count
            )
            text_table = Texttable(max_width=self.get_aggregation_summary_text__maximum_width())
            text_table.header(summary_table_fields_with_count)
            # Format all fields as 'text' to avoid long numbers being shown as scientific notation
            text_table.set_cols_dtype(['t' for i in summary_table_fields_with_count])
            match_aggregation = {}

            # Maintain an aggregate count for each unique key encountered in the aggregation period
            for match in matches:
                key_tuple = tuple([str(lookup_es_key(match, key)) for key in summary_table_fields])
                if key_tuple not in match_aggregation:
                    match_aggregation[key_tuple] = 1
                else:
                    match_aggregation[key_tuple] = match_aggregation[key_tuple] + 1
            for keys, count in match_aggregation.items():
                text_table.add_row([key for key in keys] + [count])
            text += text_table.draw() + '\n\n'
            text += self.rule.get('summary_prefix', '')
        return str(text)

    def create_default_title(self, matches):
        return self.rule['name']

    def get_account(self, account_file):
        """ Gets the username and password from an account file.

        :param account_file: Path to the file which contains user and password information.
        It can be either an absolute file path or one that is relative to the given rule.
        """
        if os.path.isabs(account_file):
            account_file_path = account_file
        else:
            account_file_path = os.path.join(os.path.dirname(self.rule['rule_file']), account_file)
        account_conf = yaml_loader(account_file_path)
        if 'user' not in account_conf or 'password' not in account_conf:
            raise EAException('Account file must have user and password fields')
        self.user = account_conf['user']
        self.password = account_conf['password']

class StompAlerter(Alerter):
    """ The stomp alerter publishes alerts via stomp to a broker. """
    required_options = frozenset(
        ['stomp_hostname', 'stomp_hostport', 'stomp_login', 'stomp_password'])

    def alert(self, matches):
        alerts = []

        qk = self.rule.get('query_key', None)

        fullmessage = {}
        for match in matches:
            if qk is not None:
                resmatch = lookup_es_key(match, qk)
            else:
                resmatch = None

            if resmatch is not None:
                elastalert_logger.info(
                    'Alert for %s, %s at %s:' % (self.rule['name'], resmatch, lookup_es_key(match, self.rule['timestamp_field'])))
                alerts.append(
                    'Alert for %s, %s at %s:' % (self.rule['name'], resmatch, lookup_es_key(
                        match, self.rule['timestamp_field']))
                )
                fullmessage['match'] = resmatch
            else:
                elastalert_logger.info('Rule %s generated an alert at %s:' % (
                    self.rule['name'], lookup_es_key(match, self.rule['timestamp_field'])))
                alerts.append(
                    'Rule %s generated an alert at %s:' % (self.rule['name'], lookup_es_key(
                        match, self.rule['timestamp_field']))
                )
                fullmessage['match'] = lookup_es_key(
                    match, self.rule['timestamp_field'])
            elastalert_logger.info(str(BasicMatchString(self.rule, match)))

        fullmessage['alerts'] = alerts
        fullmessage['rule'] = self.rule['name']
        fullmessage['rule_file'] = self.rule['rule_file']

        fullmessage['matching'] = str(BasicMatchString(self.rule, match))
        fullmessage['alertDate'] = datetime.datetime.now(
        ).strftime("%Y-%m-%d %H:%M:%S")
        fullmessage['body'] = self.create_alert_body(matches)

        fullmessage['matches'] = matches

        self.stomp_hostname = self.rule.get('stomp_hostname', 'localhost')
        self.stomp_hostport = self.rule.get('stomp_hostport', '61613')
        self.stomp_login = self.rule.get('stomp_login', 'admin')
        self.stomp_password = self.rule.get('stomp_password', 'admin')
        self.stomp_destination = self.rule.get(
            'stomp_destination', '/queue/ALERT')
        self.stomp_ssl = self.rule.get('stomp_ssl', False)

        conn = stomp.Connection([(self.stomp_hostname, self.stomp_hostport)], use_ssl=self.stomp_ssl)

        conn.start()
        conn.connect(self.stomp_login, self.stomp_password)
        # Ensures that the CONNECTED frame is received otherwise, the disconnect call will fail.
        time.sleep(1)
        conn.send(self.stomp_destination, json.dumps(fullmessage))
        conn.disconnect()

    def get_info(self):
        return {'type': 'stomp'}

class DebugAlerter(Alerter):
    """ The debug alerter uses a Python logger (by default, alerting to terminal). """

    def alert(self, matches):
        qk = self.rule.get('query_key', None)
        for match in matches:
            if qk in match:
                elastalert_logger.info(
                    'Alert for %s, %s at %s:' % (self.rule['name'], match[qk], lookup_es_key(match, self.rule['timestamp_field'])))
            else:
                elastalert_logger.info('Alert for %s at %s:' % (self.rule['name'], lookup_es_key(match, self.rule['timestamp_field'])))
            elastalert_logger.info(str(BasicMatchString(self.rule, match)))

    def get_info(self):
        return {'type': 'debug'}

class EmailAlerter(Alerter):
    """ Sends an email alert """
    required_options = frozenset(['email'])

    def __init__(self, *args):
        super(EmailAlerter, self).__init__(*args)

        self.smtp_host = self.rule.get('smtp_host', 'localhost')
        self.smtp_ssl = self.rule.get('smtp_ssl', False)
        self.from_addr = self.rule.get('from_addr', 'ElastAlert')
        self.smtp_port = self.rule.get('smtp_port')
        if self.rule.get('smtp_auth_file'):
            self.get_account(self.rule['smtp_auth_file'])
        self.smtp_key_file = self.rule.get('smtp_key_file')
        self.smtp_cert_file = self.rule.get('smtp_cert_file')
        # Convert email to a list if it isn't already
        if isinstance(self.rule['email'], str):
            self.rule['email'] = [self.rule['email']]
        # If there is a cc then also convert it a list if it isn't
        cc = self.rule.get('cc')
        if cc and isinstance(cc, str):
            self.rule['cc'] = [self.rule['cc']]
        # If there is a bcc then also convert it to a list if it isn't
        bcc = self.rule.get('bcc')
        if bcc and isinstance(bcc, str):
            self.rule['bcc'] = [self.rule['bcc']]
        add_suffix = self.rule.get('email_add_domain')
        if add_suffix and not add_suffix.startswith('@'):
            self.rule['email_add_domain'] = '@' + add_suffix

    def alert(self, matches):
        body = self.create_alert_body(matches)

        # Add JIRA ticket if it exists
        if self.pipeline is not None and 'jira_ticket' in self.pipeline:
            url = '%s/browse/%s' % (self.pipeline['jira_server'], self.pipeline['jira_ticket'])
            body += '\nJIRA ticket: %s' % (url)

        to_addr = self.rule['email']
        if 'email_from_field' in self.rule:
            recipient = lookup_es_key(matches[0], self.rule['email_from_field'])
            if isinstance(recipient, str):
                if '@' in recipient:
                    to_addr = [recipient]
                elif 'email_add_domain' in self.rule:
                    to_addr = [recipient + self.rule['email_add_domain']]
            elif isinstance(recipient, list):
                to_addr = recipient
                if 'email_add_domain' in self.rule:
                    to_addr = [name + self.rule['email_add_domain'] for name in to_addr]
        if self.rule.get('email_format') == 'html':
            email_msg = MIMEText(body, 'html', _charset='UTF-8')
        else:
            email_msg = MIMEText(body, _charset='UTF-8')
        email_msg['Subject'] = self.create_title(matches)
        email_msg['To'] = ', '.join(to_addr)
        email_msg['From'] = self.from_addr
        email_msg['Reply-To'] = self.rule.get('email_reply_to', email_msg['To'])
        email_msg['Date'] = formatdate()
        if self.rule.get('cc'):
            email_msg['CC'] = ','.join(self.rule['cc'])
            to_addr = to_addr + self.rule['cc']
        if self.rule.get('bcc'):
            to_addr = to_addr + self.rule['bcc']

        try:
            if self.smtp_ssl:
                if self.smtp_port:
                    self.smtp = SMTP_SSL(self.smtp_host, self.smtp_port, keyfile=self.smtp_key_file, certfile=self.smtp_cert_file)
                else:
                    self.smtp = SMTP_SSL(self.smtp_host, keyfile=self.smtp_key_file, certfile=self.smtp_cert_file)
            else:
                if self.smtp_port:
                    self.smtp = SMTP(self.smtp_host, self.smtp_port)
                else:
                    self.smtp = SMTP(self.smtp_host)
                self.smtp.ehlo()
                if self.smtp.has_extn('STARTTLS'):
                    self.smtp.starttls(keyfile=self.smtp_key_file, certfile=self.smtp_cert_file)
            if 'smtp_auth_file' in self.rule:
                self.smtp.login(self.user, self.password)
        except (SMTPException, error) as e:
            raise EAException("Error connecting to SMTP host: %s" % (e))
        except SMTPAuthenticationError as e:
            raise EAException("SMTP username/password rejected: %s" % (e))
        self.smtp.sendmail(self.from_addr, to_addr, email_msg.as_string())
        self.smtp.quit()

        elastalert_logger.info("Sent email to %s" % (to_addr))

    def create_default_title(self, matches):
        subject = 'ElastAlert: %s' % (self.rule['name'])

        # If the rule has a query_key, add that value plus timestamp to subject
        if 'query_key' in self.rule:
            qk = matches[0].get(self.rule['query_key'])
            if qk:
                subject += ' - %s' % (qk)

        return subject

    def get_info(self):
        return {'type': 'email',
                'recipients': self.rule['email']}

class JiraAlerter(Alerter):
    """ Creates a Jira ticket for each alert """
    required_options = frozenset(['jira_server', 'jira_account_file', 'jira_project', 'jira_issuetype'])

    # Maintain a static set of built-in fields that we explicitly know how to set
    # For anything else, we will do best-effort and try to set a string value
    known_field_list = [
        'jira_account_file',
        'jira_assignee',
        'jira_bump_after_inactivity',
        'jira_bump_in_statuses',
        'jira_bump_not_in_statuses',
        'jira_bump_only',
        'jira_bump_tickets',
        'jira_component',
        'jira_components',
        'jira_description',
        'jira_ignore_in_title',
        'jira_issuetype',
        'jira_label',
        'jira_labels',
        'jira_max_age',
        'jira_priority',
        'jira_project',
        'jira_server',
        'jira_transition_to',
        'jira_watchers',
    ]

    # Some built-in jira types that can be used as custom fields require special handling
    # Here is a sample of one of them:
    # {"id":"customfield_12807","name":"My Custom Field","custom":true,"orderable":true,"navigable":true,"searchable":true,
    # "clauseNames":["cf[12807]","My Custom Field"],"schema":{"type":"array","items":"string",
    # "custom":"com.atlassian.jira.plugin.system.customfieldtypes:multiselect","customId":12807}}
    # There are likely others that will need to be updated on a case-by-case basis
    custom_string_types_with_special_handling = [
        'com.atlassian.jira.plugin.system.customfieldtypes:multicheckboxes',
        'com.atlassian.jira.plugin.system.customfieldtypes:multiselect',
        'com.atlassian.jira.plugin.system.customfieldtypes:radiobuttons',
    ]

    def __init__(self, rule):
        super(JiraAlerter, self).__init__(rule)
        self.server = self.rule['jira_server']
        self.get_account(self.rule['jira_account_file'])
        self.project = self.rule['jira_project']
        self.issue_type = self.rule['jira_issuetype']

        # Deferred settings refer to values that can only be resolved when a match
        # is found and as such loading them will be delayed until we find a match
        self.deferred_settings = []

        # We used to support only a single component. This allows us to maintain backwards compatibility
        # while also giving the user-facing API a more representative name
        self.components = self.rule.get('jira_components', self.rule.get('jira_component'))

        # We used to support only a single label. This allows us to maintain backwards compatibility
        # while also giving the user-facing API a more representative name
        self.labels = self.rule.get('jira_labels', self.rule.get('jira_label'))

        self.description = self.rule.get('jira_description', '')
        self.assignee = self.rule.get('jira_assignee')
        self.max_age = self.rule.get('jira_max_age', 30)
        self.priority = self.rule.get('jira_priority')
        self.bump_tickets = self.rule.get('jira_bump_tickets', False)
        self.bump_not_in_statuses = self.rule.get('jira_bump_not_in_statuses')
        self.bump_in_statuses = self.rule.get('jira_bump_in_statuses')
        self.bump_after_inactivity = self.rule.get('jira_bump_after_inactivity', 0)
        self.bump_only = self.rule.get('jira_bump_only', False)
        self.transition = self.rule.get('jira_transition_to', False)
        self.watchers = self.rule.get('jira_watchers')
        self.client = None

        if self.bump_in_statuses and self.bump_not_in_statuses:
            msg = 'Both jira_bump_in_statuses (%s) and jira_bump_not_in_statuses (%s) are set.' % \
                  (','.join(self.bump_in_statuses), ','.join(self.bump_not_in_statuses))
            intersection = list(set(self.bump_in_statuses) & set(self.bump_in_statuses))
            if intersection:
                msg = '%s Both have common statuses of (%s). As such, no tickets will ever be found.' % (
                    msg, ','.join(intersection))
            msg += ' This should be simplified to use only one or the other.'
            logging.warning(msg)

        self.reset_jira_args()

        try:
            self.client = JIRA(self.server, basic_auth=(self.user, self.password))
            self.get_priorities()
            self.jira_fields = self.client.fields()
            self.get_arbitrary_fields()
        except JIRAError as e:
            # JIRAError may contain HTML, pass along only first 1024 chars
            raise EAException("Error connecting to JIRA: %s" % (str(e)[:1024])).with_traceback(sys.exc_info()[2])

        self.set_priority()

    def set_priority(self):
        try:
            if self.priority is not None and self.client is not None:
                self.jira_args['priority'] = {'id': self.priority_ids[self.priority]}
        except KeyError:
            logging.error("Priority %s not found. Valid priorities are %s" % (self.priority, list(self.priority_ids.keys())))

    def reset_jira_args(self):
        self.jira_args = {'project': {'key': self.project},
                          'issuetype': {'name': self.issue_type}}

        if self.components:
            # Support single component or list
            if type(self.components) != list:
                self.jira_args['components'] = [{'name': self.components}]
            else:
                self.jira_args['components'] = [{'name': component} for component in self.components]
        if self.labels:
            # Support single label or list
            if type(self.labels) != list:
                self.labels = [self.labels]
            self.jira_args['labels'] = self.labels
        if self.watchers:
            # Support single watcher or list
            if type(self.watchers) != list:
                self.watchers = [self.watchers]
        if self.assignee:
            self.jira_args['assignee'] = {'name': self.assignee}

        self.set_priority()

    def set_jira_arg(self, jira_field, value, fields):
        # Remove the jira_ part.  Convert underscores to spaces
        normalized_jira_field = jira_field[5:].replace('_', ' ').lower()
        # All jira fields should be found in the 'id' or the 'name' field. Therefore, try both just in case
        for identifier in ['name', 'id']:
            field = next((f for f in fields if normalized_jira_field == f[identifier].replace('_', ' ').lower()), None)
            if field:
                break
        if not field:
            # Log a warning to ElastAlert saying that we couldn't find that type?
            # OR raise and fail to load the alert entirely? Probably the latter...
            raise Exception("Could not find a definition for the jira field '{0}'".format(normalized_jira_field))
        arg_name = field['id']
        # Check the schema information to decide how to set the value correctly
        # If the schema information is not available, raise an exception since we don't know how to set it
        # Note this is only the case for two built-in types, id: issuekey and id: thumbnail
        if not ('schema' in field or 'type' in field['schema']):
            raise Exception("Could not determine schema information for the jira field '{0}'".format(normalized_jira_field))
        arg_type = field['schema']['type']

        # Handle arrays of simple types like strings or numbers
        if arg_type == 'array':
            # As a convenience, support the scenario wherein the user only provides
            # a single value for a multi-value field e.g. jira_labels: Only_One_Label
            if type(value) != list:
                value = [value]
            array_items = field['schema']['items']
            # Simple string types
            if array_items in ['string', 'date', 'datetime']:
                # Special case for multi-select custom types (the JIRA metadata says that these are strings, but
                # in reality, they are required to be provided as an object.
                if 'custom' in field['schema'] and field['schema']['custom'] in self.custom_string_types_with_special_handling:
                    self.jira_args[arg_name] = [{'value': v} for v in value]
                else:
                    self.jira_args[arg_name] = value
            elif array_items == 'number':
                self.jira_args[arg_name] = [int(v) for v in value]
            # Also attempt to handle arrays of complex types that have to be passed as objects with an identifier 'key'
            elif array_items == 'option':
                self.jira_args[arg_name] = [{'value': v} for v in value]
            else:
                # Try setting it as an object, using 'name' as the key
                # This may not work, as the key might actually be 'key', 'id', 'value', or something else
                # If it works, great!  If not, it will manifest itself as an API error that will bubble up
                self.jira_args[arg_name] = [{'name': v} for v in value]
        # Handle non-array types
        else:
            # Simple string types
            if arg_type in ['string', 'date', 'datetime']:
                # Special case for custom types (the JIRA metadata says that these are strings, but
                # in reality, they are required to be provided as an object.
                if 'custom' in field['schema'] and field['schema']['custom'] in self.custom_string_types_with_special_handling:
                    self.jira_args[arg_name] = {'value': value}
                else:
                    self.jira_args[arg_name] = value
            # Number type
            elif arg_type == 'number':
                self.jira_args[arg_name] = int(value)
            elif arg_type == 'option':
                self.jira_args[arg_name] = {'value': value}
            # Complex type
            else:
                self.jira_args[arg_name] = {'name': value}

    def get_arbitrary_fields(self):
        # Clear jira_args
        self.reset_jira_args()

        for jira_field, value in self.rule.items():
            # If we find a field that is not covered by the set that we are aware of, it means it is either:
            # 1. A built-in supported field in JIRA that we don't have on our radar
            # 2. A custom field that a JIRA admin has configured
            if jira_field.startswith('jira_') and jira_field not in self.known_field_list and str(value)[:1] != '#':
                self.set_jira_arg(jira_field, value, self.jira_fields)
            if jira_field.startswith('jira_') and jira_field not in self.known_field_list and str(value)[:1] == '#':
                self.deferred_settings.append(jira_field)

    def get_priorities(self):
        """ Creates a mapping of priority index to id. """
        priorities = self.client.priorities()
        self.priority_ids = {}
        for x in range(len(priorities)):
            self.priority_ids[x] = priorities[x].id

    def set_assignee(self, assignee):
        self.assignee = assignee
        if assignee:
            self.jira_args['assignee'] = {'name': assignee}
        elif 'assignee' in self.jira_args:
            self.jira_args.pop('assignee')

    def find_existing_ticket(self, matches):
        # Default title, get stripped search version
        if 'alert_subject' not in self.rule:
            title = self.create_default_title(matches, True)
        else:
            title = self.create_title(matches)

        if 'jira_ignore_in_title' in self.rule:
            title = title.replace(matches[0].get(self.rule['jira_ignore_in_title'], ''), '')

        # This is necessary for search to work. Other special characters and dashes
        # directly adjacent to words appear to be ok
        title = title.replace(' - ', ' ')
        title = title.replace('\\', '\\\\')

        date = (datetime.datetime.now() - datetime.timedelta(days=self.max_age)).strftime('%Y-%m-%d')
        jql = 'project=%s AND summary~"%s" and created >= "%s"' % (self.project, title, date)
        if self.bump_in_statuses:
            jql = '%s and status in (%s)' % (jql, ','.join(["\"%s\"" % status if ' ' in status else status for status
                                                            in self.bump_in_statuses]))
        if self.bump_not_in_statuses:
            jql = '%s and status not in (%s)' % (jql, ','.join(["\"%s\"" % status if ' ' in status else status
                                                                for status in self.bump_not_in_statuses]))
        try:
            issues = self.client.search_issues(jql)
        except JIRAError as e:
            logging.exception("Error while searching for JIRA ticket using jql '%s': %s" % (jql, e))
            return None

        if len(issues):
            return issues[0]

    def comment_on_ticket(self, ticket, match):
        text = str(JiraFormattedMatchString(self.rule, match))
        timestamp = pretty_ts(lookup_es_key(match, self.rule['timestamp_field']))
        comment = "This alert was triggered again at %s\n%s" % (timestamp, text)
        self.client.add_comment(ticket, comment)

    def transition_ticket(self, ticket):
        transitions = self.client.transitions(ticket)
        for t in transitions:
            if t['name'] == self.transition:
                self.client.transition_issue(ticket, t['id'])

    def alert(self, matches):
        # Reset arbitrary fields to pick up changes
        self.get_arbitrary_fields()
        if len(self.deferred_settings) > 0:
            fields = self.client.fields()
            for jira_field in self.deferred_settings:
                value = lookup_es_key(matches[0], self.rule[jira_field][1:])
                self.set_jira_arg(jira_field, value, fields)

        title = self.create_title(matches)

        if self.bump_tickets:
            ticket = self.find_existing_ticket(matches)
            if ticket:
                inactivity_datetime = ts_now() - datetime.timedelta(days=self.bump_after_inactivity)
                if ts_to_dt(ticket.fields.updated) >= inactivity_datetime:
                    if self.pipeline is not None:
                        self.pipeline['jira_ticket'] = None
                        self.pipeline['jira_server'] = self.server
                    return None
                elastalert_logger.info('Commenting on existing ticket %s' % (ticket.key))
                for match in matches:
                    try:
                        self.comment_on_ticket(ticket, match)
                    except JIRAError as e:
                        logging.exception("Error while commenting on ticket %s: %s" % (ticket, e))
                    if self.labels:
                        for label in self.labels:
                            try:
                                ticket.fields.labels.append(label)
                            except JIRAError as e:
                                logging.exception("Error while appending labels to ticket %s: %s" % (ticket, e))
                if self.transition:
                    elastalert_logger.info('Transitioning existing ticket %s' % (ticket.key))
                    try:
                        self.transition_ticket(ticket)
                    except JIRAError as e:
                        logging.exception("Error while transitioning ticket %s: %s" % (ticket, e))

                if self.pipeline is not None:
                    self.pipeline['jira_ticket'] = ticket
                    self.pipeline['jira_server'] = self.server
                return None
        if self.bump_only:
            return None

        self.jira_args['summary'] = title
        self.jira_args['description'] = self.create_alert_body(matches)

        try:
            self.issue = self.client.create_issue(**self.jira_args)

            # You cannot add watchers on initial creation, only as a follow-up action
            if self.watchers:
                for watcher in self.watchers:
                    try:
                        self.client.add_watcher(self.issue.key, watcher)
                    except Exception as ex:
                        # Re-raise the exception, preserve the stack-trace, and give some
                        # context as to which watcher failed to be added
                        raise Exception(
                            "Exception encountered when trying to add '{0}' as a watcher. Does the user exist?\n{1}" .format(
                                watcher,
                                ex
                            )).with_traceback(sys.exc_info()[2])

        except JIRAError as e:
            raise EAException("Error creating JIRA ticket using jira_args (%s): %s" % (self.jira_args, e))
        elastalert_logger.info("Opened Jira ticket: %s" % (self.issue))

        if self.pipeline is not None:
            self.pipeline['jira_ticket'] = self.issue
            self.pipeline['jira_server'] = self.server

    def create_alert_body(self, matches):
        body = self.description + '\n'
        body += self.get_aggregation_summary_text(matches)
        if self.rule.get('alert_text_type') != 'aggregation_summary_only':
            for match in matches:
                body += str(JiraFormattedMatchString(self.rule, match))
                if len(matches) > 1:
                    body += '\n----------------------------------------\n'
        return body

    def get_aggregation_summary_text(self, matches):
        text = super(JiraAlerter, self).get_aggregation_summary_text(matches)
        if text:
            text = '{{noformat}}{0}{{noformat}}'.format(text)
        return text

    def create_default_title(self, matches, for_search=False):
        # If there is a query_key, use that in the title

        if 'query_key' in self.rule and lookup_es_key(matches[0], self.rule['query_key']):
            title = 'ElastAlert: %s matched %s' % (lookup_es_key(matches[0], self.rule['query_key']), self.rule['name'])
        else:
            title = 'ElastAlert: %s' % (self.rule['name'])

        if for_search:
            return title

        title += ' - %s' % (pretty_ts(matches[0][self.rule['timestamp_field']], self.rule.get('use_local_time')))

        # Add count for spikes
        count = matches[0].get('spike_count')
        if count:
            title += ' - %s+ events' % (count)

        return title

    def get_info(self):
        return {'type': 'jira'}
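
# Hedged illustration, not part of ElastAlert: a minimal sketch of how the
# non-array branch of set_jira_arg() above shapes values before they reach the
# JIRA client.  The _sketch_* name is made up for this example; custom string
# types with special handling become {'value': value} in the real code.
def _sketch_shape_jira_value(arg_type, value):
    if arg_type in ('string', 'date', 'datetime'):
        return value
    if arg_type == 'number':
        return int(value)
    if arg_type == 'option':
        return {'value': value}
    # Unknown complex types fall back to an object keyed by 'name'
    return {'name': value}

# _sketch_shape_jira_value('option', 'High')  -> {'value': 'High'}
# _sketch_shape_jira_value('number', '3')     -> 3
# _sketch_shape_jira_value('group', 'devops') -> {'name': 'devops'}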

class CommandAlerter(Alerter):
    required_options = set(['command'])

    def __init__(self, *args):
        super(CommandAlerter, self).__init__(*args)

        self.last_command = []

        self.shell = False
        if isinstance(self.rule['command'], str):
            self.shell = True
            if '%' in self.rule['command']:
                logging.warning('Warning! You could be vulnerable to shell injection!')
            self.rule['command'] = [self.rule['command']]

        self.new_style_string_format = False
        if 'new_style_string_format' in self.rule and self.rule['new_style_string_format']:
            self.new_style_string_format = True

    def alert(self, matches):
        # Format the command and arguments
        try:
            command = [resolve_string(command_arg, matches[0]) for command_arg in self.rule['command']]
            self.last_command = command
        except KeyError as e:
            raise EAException("Error formatting command: %s" % (e))

        # Run command and pipe data
        try:
            subp = subprocess.Popen(command, stdin=subprocess.PIPE, shell=self.shell)

            if self.rule.get('pipe_match_json'):
                match_json = json.dumps(matches, cls=DateTimeEncoder) + '\n'
                stdout, stderr = subp.communicate(input=match_json.encode())
            elif self.rule.get('pipe_alert_text'):
                alert_text = self.create_alert_body(matches)
                stdout, stderr = subp.communicate(input=alert_text.encode())
            if self.rule.get("fail_on_non_zero_exit", False) and subp.wait():
                raise EAException("Non-zero exit code while running command %s" % (' '.join(command)))
        except OSError as e:
            raise EAException("Error while running command %s: %s" % (' '.join(command), e))

    def get_info(self):
        return {'type': 'command',
                'command': ' '.join(self.last_command)}
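
# Hedged illustration, not part of ElastAlert: roughly what the command
# formatting in CommandAlerter.alert() does.  The _sketch_* name is made up;
# the real code uses resolve_string(), which also supports new-style {field}
# placeholders and missing-value handling.
def _sketch_format_command(command, first_match):
    # Old-style %(field)s placeholders in each argument are filled from the first match
    return [arg % first_match for arg in command]

# _sketch_format_command(['/bin/send_alert', '--host', '%(hostname)s'],
#                        {'hostname': 'web-01'})
# -> ['/bin/send_alert', '--host', 'web-01']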

class SnsAlerter(Alerter):
    """ Send alert using AWS SNS service """
    required_options = frozenset(['sns_topic_arn'])

    def __init__(self, *args):
        super(SnsAlerter, self).__init__(*args)
        self.sns_topic_arn = self.rule.get('sns_topic_arn', '')
        self.sns_aws_access_key_id = self.rule.get('sns_aws_access_key_id')
        self.sns_aws_secret_access_key = self.rule.get('sns_aws_secret_access_key')
        self.sns_aws_region = self.rule.get('sns_aws_region', 'us-east-1')
        # sns_aws_profile is preferred; boto_profile is deprecated but kept as a fallback
        self.profile = self.rule.get('sns_aws_profile', self.rule.get('boto_profile', None))

    def create_default_title(self, matches):
        subject = 'ElastAlert: %s' % (self.rule['name'])
        return subject

    def alert(self, matches):
        body = self.create_alert_body(matches)

        if self.profile is None:
            session = boto3.Session(
                aws_access_key_id=self.sns_aws_access_key_id,
                aws_secret_access_key=self.sns_aws_secret_access_key,
                region_name=self.sns_aws_region
            )
        else:
            session = boto3.Session(profile_name=self.profile)

        sns_client = session.client('sns')
        sns_client.publish(
            TopicArn=self.sns_topic_arn,
            Message=body,
            Subject=self.create_title(matches)
        )
        elastalert_logger.info("Sent sns notification to %s" % (self.sns_topic_arn))

class HipChatAlerter(Alerter):
    """ Creates a HipChat room notification for each alert """
    required_options = frozenset(['hipchat_auth_token', 'hipchat_room_id'])

    def __init__(self, rule):
        super(HipChatAlerter, self).__init__(rule)
        self.hipchat_msg_color = self.rule.get('hipchat_msg_color', 'red')
        self.hipchat_message_format = self.rule.get('hipchat_message_format', 'html')
        self.hipchat_auth_token = self.rule['hipchat_auth_token']
        self.hipchat_room_id = self.rule['hipchat_room_id']
        self.hipchat_domain = self.rule.get('hipchat_domain', 'api.hipchat.com')
        self.hipchat_ignore_ssl_errors = self.rule.get('hipchat_ignore_ssl_errors', False)
        self.hipchat_notify = self.rule.get('hipchat_notify', True)
        self.hipchat_from = self.rule.get('hipchat_from', '')
        self.url = 'https://%s/v2/room/%s/notification?auth_token=%s' % (
            self.hipchat_domain, self.hipchat_room_id, self.hipchat_auth_token)
        self.hipchat_proxy = self.rule.get('hipchat_proxy', None)

    def create_alert_body(self, matches):
        body = super(HipChatAlerter, self).create_alert_body(matches)

        # HipChat sends 400 bad request on messages longer than 10000 characters
        if self.hipchat_message_format == 'html':
            # Use appropriate line ending for text/html
            br = '<br/>'
            body = body.replace('\n', br)

            truncated_message = '<br/> ...(truncated)'
            truncate_to = 10000 - len(truncated_message)
        else:
            truncated_message = '..(truncated)'
            truncate_to = 10000 - len(truncated_message)

        if len(body) > 9999:
            body = body[:truncate_to] + truncated_message

        return body

    def alert(self, matches):
        body = self.create_alert_body(matches)

        # Post to HipChat
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.hipchat_proxy} if self.hipchat_proxy else None
        payload = {
            'color': self.hipchat_msg_color,
            'message': body,
            'message_format': self.hipchat_message_format,
            'notify': self.hipchat_notify,
            'from': self.hipchat_from
        }

        try:
            if self.hipchat_ignore_ssl_errors:
                requests.packages.urllib3.disable_warnings()

            if self.rule.get('hipchat_mentions', []):
                ping_users = self.rule.get('hipchat_mentions', [])
                ping_msg = payload.copy()
                ping_msg['message'] = "ping {}".format(
                    ", ".join("@{}".format(user) for user in ping_users)
                )
                ping_msg['message_format'] = "text"

                response = requests.post(
                    self.url,
                    data=json.dumps(ping_msg, cls=DateTimeEncoder),
                    headers=headers,
                    verify=not self.hipchat_ignore_ssl_errors,
                    proxies=proxies)

            response = requests.post(self.url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers,
                                     verify=not self.hipchat_ignore_ssl_errors,
                                     proxies=proxies)
            warnings.resetwarnings()
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to HipChat: %s" % e)
        elastalert_logger.info("Alert sent to HipChat room %s" % self.hipchat_room_id)

    def get_info(self):
        return {'type': 'hipchat',
                'hipchat_room_id': self.hipchat_room_id}

class MsTeamsAlerter(Alerter):
    """ Creates a Microsoft Teams Conversation Message for each alert """
    required_options = frozenset(['ms_teams_webhook_url', 'ms_teams_alert_summary'])

    def __init__(self, rule):
        super(MsTeamsAlerter, self).__init__(rule)
        self.ms_teams_webhook_url = self.rule['ms_teams_webhook_url']
        if isinstance(self.ms_teams_webhook_url, str):
            self.ms_teams_webhook_url = [self.ms_teams_webhook_url]
        self.ms_teams_proxy = self.rule.get('ms_teams_proxy', None)
        self.ms_teams_alert_summary = self.rule.get('ms_teams_alert_summary', 'ElastAlert Message')
        self.ms_teams_alert_fixed_width = self.rule.get('ms_teams_alert_fixed_width', False)
        self.ms_teams_theme_color = self.rule.get('ms_teams_theme_color', '')

    def format_body(self, body):
        if self.ms_teams_alert_fixed_width:
            body = body.replace('`', "'")
            body = "```{0}```".format('```\n\n```'.join(x for x in body.split('\n'))).replace('\n``````', '')
        return body

    def alert(self, matches):
        body = self.create_alert_body(matches)

        body = self.format_body(body)
        # post to Teams
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.ms_teams_proxy} if self.ms_teams_proxy else None
        payload = {
            '@type': 'MessageCard',
            '@context': 'http://schema.org/extensions',
            'summary': self.ms_teams_alert_summary,
            'title': self.create_title(matches),
            'text': body
        }
        if self.ms_teams_theme_color != '':
            payload['themeColor'] = self.ms_teams_theme_color

        for url in self.ms_teams_webhook_url:
            try:
                response = requests.post(url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies)
                response.raise_for_status()
            except RequestException as e:
                raise EAException("Error posting to ms teams: %s" % e)
        elastalert_logger.info("Alert sent to MS Teams")

    def get_info(self):
        return {'type': 'ms_teams',
                'ms_teams_webhook_url': self.ms_teams_webhook_url}
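
# Hedged illustration, not part of ElastAlert: what ms_teams_alert_fixed_width
# does to the alert body in format_body() above.  Each line is wrapped in
# backticks so Teams renders it in a monospaced font.  The _sketch_* name is
# made up for this example.
def _sketch_teams_fixed_width(body):
    body = body.replace('`', "'")
    return "```{0}```".format('```\n\n```'.join(x for x in body.split('\n'))).replace('\n``````', '')

# _sketch_teams_fixed_width('error on host\ncount: 5')
# -> '```error on host```\n\n```count: 5```'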

class SlackAlerter(Alerter):
    """ Creates a Slack room message for each alert """
    required_options = frozenset(['slack_webhook_url'])

    def __init__(self, rule):
        super(SlackAlerter, self).__init__(rule)
        self.slack_webhook_url = self.rule['slack_webhook_url']
        if isinstance(self.slack_webhook_url, str):
            self.slack_webhook_url = [self.slack_webhook_url]
        self.slack_proxy = self.rule.get('slack_proxy', None)
        self.slack_username_override = self.rule.get('slack_username_override', 'elastalert')
        self.slack_channel_override = self.rule.get('slack_channel_override', '')
        if isinstance(self.slack_channel_override, str):
            self.slack_channel_override = [self.slack_channel_override]
        self.slack_title_link = self.rule.get('slack_title_link', '')
        self.slack_title = self.rule.get('slack_title', '')
        self.slack_emoji_override = self.rule.get('slack_emoji_override', ':ghost:')
        self.slack_icon_url_override = self.rule.get('slack_icon_url_override', '')
        self.slack_msg_color = self.rule.get('slack_msg_color', 'danger')
        self.slack_parse_override = self.rule.get('slack_parse_override', 'none')
        self.slack_text_string = self.rule.get('slack_text_string', '')
        self.slack_alert_fields = self.rule.get('slack_alert_fields', '')
        self.slack_ignore_ssl_errors = self.rule.get('slack_ignore_ssl_errors', False)
        self.slack_timeout = self.rule.get('slack_timeout', 10)
        self.slack_ca_certs = self.rule.get('slack_ca_certs')
        self.slack_attach_kibana_discover_url = self.rule.get('slack_attach_kibana_discover_url', False)
        self.slack_kibana_discover_color = self.rule.get('slack_kibana_discover_color', '#ec4b98')
        self.slack_kibana_discover_title = self.rule.get('slack_kibana_discover_title', 'Discover in Kibana')

    def format_body(self, body):
        # https://api.slack.com/docs/formatting
        return body

    def get_aggregation_summary_text__maximum_width(self):
        width = super(SlackAlerter, self).get_aggregation_summary_text__maximum_width()
        # Reduced maximum width for prettier Slack display.
        return min(width, 75)

    def get_aggregation_summary_text(self, matches):
        text = super(SlackAlerter, self).get_aggregation_summary_text(matches)
        if text:
            text = '```\n{0}```\n'.format(text)
        return text

    def populate_fields(self, matches):
        alert_fields = []
        for arg in self.slack_alert_fields:
            arg = copy.copy(arg)
            arg['value'] = lookup_es_key(matches[0], arg['value'])
            alert_fields.append(arg)
        return alert_fields

    def alert(self, matches):
        body = self.create_alert_body(matches)

        body = self.format_body(body)
        # post to slack
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.slack_proxy} if self.slack_proxy else None
        payload = {
            'username': self.slack_username_override,
            'parse': self.slack_parse_override,
            'text': self.slack_text_string,
            'attachments': [
                {
                    'color': self.slack_msg_color,
                    'title': self.create_title(matches),
                    'text': body,
                    'mrkdwn_in': ['text', 'pretext'],
                    'fields': []
                }
            ]
        }

        # if we have defined fields, populate notable fields for the alert
        if self.slack_alert_fields != '':
            payload['attachments'][0]['fields'] = self.populate_fields(matches)

        if self.slack_icon_url_override != '':
            payload['icon_url'] = self.slack_icon_url_override
        else:
            payload['icon_emoji'] = self.slack_emoji_override

        if self.slack_title != '':
            payload['attachments'][0]['title'] = self.slack_title

        if self.slack_title_link != '':
            payload['attachments'][0]['title_link'] = self.slack_title_link

        if self.slack_attach_kibana_discover_url:
            kibana_discover_url = lookup_es_key(matches[0], 'kibana_discover_url')
            if kibana_discover_url:
                payload['attachments'].append({
                    'color': self.slack_kibana_discover_color,
                    'title': self.slack_kibana_discover_title,
                    'title_link': kibana_discover_url
                })

        for url in self.slack_webhook_url:
            for channel_override in self.slack_channel_override:
                try:
                    if self.slack_ca_certs:
                        verify = self.slack_ca_certs
                    else:
                        verify = self.slack_ignore_ssl_errors
                    if self.slack_ignore_ssl_errors:
                        requests.packages.urllib3.disable_warnings()
                    payload['channel'] = channel_override
                    response = requests.post(
                        url, data=json.dumps(payload, cls=DateTimeEncoder),
                        headers=headers, verify=verify,
                        proxies=proxies,
                        timeout=self.slack_timeout)
                    warnings.resetwarnings()
                    response.raise_for_status()
                except RequestException as e:
                    raise EAException("Error posting to slack: %s" % e)
        elastalert_logger.info("Alert '%s' sent to Slack" % self.rule['name'])

    def get_info(self):
        return {'type': 'slack',
                'slack_username_override': self.slack_username_override}
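
# Hedged illustration, not part of ElastAlert: how slack_alert_fields entries
# are resolved by SlackAlerter.populate_fields() above.  Each configured field
# keeps its title but has its 'value' replaced with the matching value from the
# first match.  The _sketch_* name is made up; the real code uses
# lookup_es_key(), which also resolves nested keys.
def _sketch_slack_fields(fields, first_match):
    resolved = []
    for field in fields:
        field = dict(field)
        field['value'] = first_match.get(field['value'])
        resolved.append(field)
    return resolved

# _sketch_slack_fields([{'title': 'Host', 'value': 'hostname', 'short': True}],
#                      {'hostname': 'web-01'})
# -> [{'title': 'Host', 'value': 'web-01', 'short': True}]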

class MattermostAlerter(Alerter):
    """ Creates a Mattermsot post for each alert """
    required_options = frozenset(['mattermost_webhook_url'])

    def __init__(self, rule):
        super(MattermostAlerter, self).__init__(rule)

        # HTTP config
        self.mattermost_webhook_url = self.rule['mattermost_webhook_url']
        if isinstance(self.mattermost_webhook_url, str):
            self.mattermost_webhook_url = [self.mattermost_webhook_url]
        self.mattermost_proxy = self.rule.get('mattermost_proxy', None)
        self.mattermost_ignore_ssl_errors = self.rule.get('mattermost_ignore_ssl_errors', False)

        # Override webhook config
        self.mattermost_username_override = self.rule.get('mattermost_username_override', 'elastalert')
        self.mattermost_channel_override = self.rule.get('mattermost_channel_override', '')
        self.mattermost_icon_url_override = self.rule.get('mattermost_icon_url_override', '')

        # Message properties
        self.mattermost_msg_pretext = self.rule.get('mattermost_msg_pretext', '')
        self.mattermost_msg_color = self.rule.get('mattermost_msg_color', 'danger')
        self.mattermost_msg_fields = self.rule.get('mattermost_msg_fields', '')

    def get_aggregation_summary_text__maximum_width(self):
        width = super(MattermostAlerter, self).get_aggregation_summary_text__maximum_width()
        # Reduced maximum width for prettier Mattermost display.
        return min(width, 75)

    def get_aggregation_summary_text(self, matches):
        text = super(MattermostAlerter, self).get_aggregation_summary_text(matches)
        if text:
            text = '```\n{0}```\n'.format(text)
        return text

    def populate_fields(self, matches):
        alert_fields = []
        missing = self.rule.get('alert_missing_value', '<MISSING VALUE>')
        for field in self.mattermost_msg_fields:
            field = copy.copy(field)
            if 'args' in field:
                args_values = [lookup_es_key(matches[0], arg) or missing for arg in field['args']]
                if 'value' in field:
                    field['value'] = field['value'].format(*args_values)
                else:
                    field['value'] = "\n".join(str(arg) for arg in args_values)
                del(field['args'])
            alert_fields.append(field)
        return alert_fields

    def alert(self, matches):
        body = self.create_alert_body(matches)
        title = self.create_title(matches)

        # post to mattermost
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.mattermost_proxy} if self.mattermost_proxy else None
        payload = {
            'attachments': [
                {
                    'fallback': "{0}: {1}".format(title, self.mattermost_msg_pretext),
                    'color': self.mattermost_msg_color,
                    'title': title,
                    'pretext': self.mattermost_msg_pretext,
                    'fields': []
                }
            ]
        }

        if self.rule.get('alert_text_type') == 'alert_text_only':
            payload['attachments'][0]['text'] = body
        else:
            payload['text'] = body

        if self.mattermost_msg_fields != '':
            payload['attachments'][0]['fields'] = self.populate_fields(matches)

        if self.mattermost_icon_url_override != '':
            payload['icon_url'] = self.mattermost_icon_url_override

        if self.mattermost_username_override != '':
            payload['username'] = self.mattermost_username_override

        if self.mattermost_channel_override != '':
            payload['channel'] = self.mattermost_channel_override

        for url in self.mattermost_webhook_url:
            try:
                if self.mattermost_ignore_ssl_errors:
                    requests.urllib3.disable_warnings()

                response = requests.post(
                    url, data=json.dumps(payload, cls=DateTimeEncoder),
                    headers=headers, verify=not self.mattermost_ignore_ssl_errors,
                    proxies=proxies)

                warnings.resetwarnings()
                response.raise_for_status()
            except RequestException as e:
                raise EAException("Error posting to Mattermost: %s" % e)
        elastalert_logger.info("Alert sent to Mattermost")

    def get_info(self):
        return {'type': 'mattermost',
                'mattermost_username_override': self.mattermost_username_override,
                'mattermost_webhook_url': self.mattermost_webhook_url}
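
# Hedged illustration, not part of ElastAlert: how a mattermost_msg_fields entry
# with an 'args' list is rendered by MattermostAlerter.populate_fields() above.
# Looked-up values fill the 'value' template (or are joined with newlines when
# no template is given), and missing fields fall back to alert_missing_value.
# The _sketch_* name is made up; the real code uses lookup_es_key().
def _sketch_mattermost_field(field, first_match, missing='<MISSING VALUE>'):
    field = dict(field)
    if 'args' in field:
        args_values = [first_match.get(arg) or missing for arg in field['args']]
        if 'value' in field:
            field['value'] = field['value'].format(*args_values)
        else:
            field['value'] = "\n".join(str(arg) for arg in args_values)
        del field['args']
    return field

# _sketch_mattermost_field({'title': 'Where', 'args': ['host', 'dc'], 'value': '{0} in {1}'},
#                          {'host': 'web-01'})
# -> {'title': 'Where', 'value': 'web-01 in <MISSING VALUE>'}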

class PagerDutyAlerter(Alerter):
    """ Create an incident on PagerDuty for each alert """
    required_options = frozenset(['pagerduty_service_key', 'pagerduty_client_name'])

    def __init__(self, rule):
        super(PagerDutyAlerter, self).__init__(rule)
        self.pagerduty_service_key = self.rule['pagerduty_service_key']
        self.pagerduty_client_name = self.rule['pagerduty_client_name']
        self.pagerduty_incident_key = self.rule.get('pagerduty_incident_key', '')
        self.pagerduty_incident_key_args = self.rule.get('pagerduty_incident_key_args', None)
        self.pagerduty_event_type = self.rule.get('pagerduty_event_type', 'trigger')
        self.pagerduty_proxy = self.rule.get('pagerduty_proxy', None)

        self.pagerduty_api_version = self.rule.get('pagerduty_api_version', 'v1')
        self.pagerduty_v2_payload_class = self.rule.get('pagerduty_v2_payload_class', '')
        self.pagerduty_v2_payload_class_args = self.rule.get('pagerduty_v2_payload_class_args', None)
        self.pagerduty_v2_payload_component = self.rule.get('pagerduty_v2_payload_component', '')
        self.pagerduty_v2_payload_component_args = self.rule.get('pagerduty_v2_payload_component_args', None)
        self.pagerduty_v2_payload_group = self.rule.get('pagerduty_v2_payload_group', '')
        self.pagerduty_v2_payload_group_args = self.rule.get('pagerduty_v2_payload_group_args', None)
        self.pagerduty_v2_payload_severity = self.rule.get('pagerduty_v2_payload_severity', 'critical')
        self.pagerduty_v2_payload_source = self.rule.get('pagerduty_v2_payload_source', 'ElastAlert')
        self.pagerduty_v2_payload_source_args = self.rule.get('pagerduty_v2_payload_source_args', None)

        if self.pagerduty_api_version == 'v2':
            self.url = 'https://events.pagerduty.com/v2/enqueue'
        else:
            self.url = 'https://events.pagerduty.com/generic/2010-04-15/create_event.json'

    def alert(self, matches):
        body = self.create_alert_body(matches)

        # post to pagerduty
        headers = {'content-type': 'application/json'}
        if self.pagerduty_api_version == 'v2':
            payload = {
                'routing_key': self.pagerduty_service_key,
                'event_action': self.pagerduty_event_type,
                'dedup_key': self.get_incident_key(matches),
                'client': self.pagerduty_client_name,
                'payload': {
                    'class': self.resolve_formatted_key(self.pagerduty_v2_payload_class,
                                                        self.pagerduty_v2_payload_class_args,
                                                        matches),
                    'component': self.resolve_formatted_key(self.pagerduty_v2_payload_component,
                                                            self.pagerduty_v2_payload_component_args,
                                                            matches),
                    'group': self.resolve_formatted_key(self.pagerduty_v2_payload_group,
                                                        self.pagerduty_v2_payload_group_args,
                                                        matches),
                    'severity': self.pagerduty_v2_payload_severity,
                    'source': self.resolve_formatted_key(self.pagerduty_v2_payload_source,
                                                         self.pagerduty_v2_payload_source_args,
                                                         matches),
                    'summary': self.create_title(matches),
                    'custom_details': {
                        'information': body,
                    },
                },
            }
            match_timestamp = lookup_es_key(matches[0], self.rule.get('timestamp_field', '@timestamp'))
            if match_timestamp:
                payload['payload']['timestamp'] = match_timestamp
        else:
            payload = {
                'service_key': self.pagerduty_service_key,
                'description': self.create_title(matches),
                'event_type': self.pagerduty_event_type,
                'incident_key': self.get_incident_key(matches),
                'client': self.pagerduty_client_name,
                'details': {
                    "information": body,
                },
            }

        # set https proxy, if it was provided
        proxies = {'https': self.pagerduty_proxy} if self.pagerduty_proxy else None
        try:
            response = requests.post(
                self.url,
                data=json.dumps(payload, cls=DateTimeEncoder, ensure_ascii=False),
                headers=headers,
                proxies=proxies
            )
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to pagerduty: %s" % e)

        if self.pagerduty_event_type == 'trigger':
            elastalert_logger.info("Trigger sent to PagerDuty")
        elif self.pagerduty_event_type == 'resolve':
            elastalert_logger.info("Resolve sent to PagerDuty")
        elif self.pagerduty_event_type == 'acknowledge':
            elastalert_logger.info("acknowledge sent to PagerDuty")

    def resolve_formatted_key(self, key, args, matches):
        if args:
            key_values = [lookup_es_key(matches[0], arg) for arg in args]

            # Populate values with rule level properties too
            for i in range(len(key_values)):
                if key_values[i] is None:
                    key_value = self.rule.get(args[i])
                    if key_value:
                        key_values[i] = key_value

            missing = self.rule.get('alert_missing_value', '<MISSING VALUE>')
            key_values = [missing if val is None else val for val in key_values]
            return key.format(*key_values)
        else:
            return key

    def get_incident_key(self, matches):
        if self.pagerduty_incident_key_args:
            incident_key_values = [lookup_es_key(matches[0], arg) for arg in self.pagerduty_incident_key_args]

            # Populate values with rule level properties too
            for i in range(len(incident_key_values)):
                if incident_key_values[i] is None:
                    key_value = self.rule.get(self.pagerduty_incident_key_args[i])
                    if key_value:
                        incident_key_values[i] = key_value

            missing = self.rule.get('alert_missing_value', '<MISSING VALUE>')
            incident_key_values = [missing if val is None else val for val in incident_key_values]
            return self.pagerduty_incident_key.format(*incident_key_values)
        else:
            return self.pagerduty_incident_key

    def get_info(self):
        return {'type': 'pagerduty',
                'pagerduty_client_name': self.pagerduty_client_name}
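
# Hedged illustration, not part of ElastAlert: how pagerduty_incident_key_args
# fill the pagerduty_incident_key template in get_incident_key() above.  The
# _sketch_* name is made up; the real code also falls back to rule-level
# properties before using the alert_missing_value placeholder.
def _sketch_incident_key(template, args, first_match, missing='<MISSING VALUE>'):
    values = [first_match.get(arg) for arg in args]
    values = [missing if v is None else v for v in values]
    return template.format(*values)

# _sketch_incident_key('{0}-{1}', ['service', 'hostname'], {'service': 'api'})
# -> 'api-<MISSING VALUE>'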

class PagerTreeAlerter(Alerter):
    """ Creates a PagerTree Incident for each alert """
    required_options = frozenset(['pagertree_integration_url'])

    def __init__(self, rule):
        super(PagerTreeAlerter, self).__init__(rule)
        self.url = self.rule['pagertree_integration_url']
        self.pagertree_proxy = self.rule.get('pagertree_proxy', None)

    def alert(self, matches):
        # post to pagertree
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.pagertree_proxy} if self.pagertree_proxy else None
        payload = {
            "event_type": "create",
            "Id": str(uuid.uuid4()),
            "Title": self.create_title(matches),
            "Description": self.create_alert_body(matches)
        }

        try:
            response = requests.post(self.url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to PagerTree: %s" % e)
        elastalert_logger.info("Trigger sent to PagerTree")

    def get_info(self):
        return {'type': 'pagertree',
                'pagertree_integration_url': self.url}

class ExotelAlerter(Alerter):
    required_options = frozenset(['exotel_account_sid', 'exotel_auth_token', 'exotel_to_number', 'exotel_from_number'])

    def __init__(self, rule):
        super(ExotelAlerter, self).__init__(rule)
        self.exotel_account_sid = self.rule['exotel_account_sid']
        self.exotel_auth_token = self.rule['exotel_auth_token']
        self.exotel_to_number = self.rule['exotel_to_number']
        self.exotel_from_number = self.rule['exotel_from_number']
        self.sms_body = self.rule.get('exotel_message_body', '')

    def alert(self, matches):
        client = Exotel(self.exotel_account_sid, self.exotel_auth_token)

        try:
            message_body = self.rule['name'] + self.sms_body
            response = client.sms(self.rule['exotel_from_number'], self.rule['exotel_to_number'], message_body)
            if response != 200:
                raise EAException("Error posting to Exotel, response code is %s" % response)
        except RequestException:
            raise EAException("Error posting to Exotel").with_traceback(sys.exc_info()[2])
        elastalert_logger.info("Trigger sent to Exotel")

    def get_info(self):
        return {'type': 'exotel', 'exotel_account': self.exotel_account_sid}

class TwilioAlerter(Alerter):
    required_options = frozenset(['twilio_account_sid', 'twilio_auth_token', 'twilio_to_number', 'twilio_from_number'])

    def __init__(self, rule):
        super(TwilioAlerter, self).__init__(rule)
        self.twilio_account_sid = self.rule['twilio_account_sid']
        self.twilio_auth_token = self.rule['twilio_auth_token']
        self.twilio_to_number = self.rule['twilio_to_number']
        self.twilio_from_number = self.rule['twilio_from_number']

    def alert(self, matches):
        client = TwilioClient(self.twilio_account_sid, self.twilio_auth_token)

        try:
            client.messages.create(body=self.rule['name'],
                                   to=self.twilio_to_number,
                                   from_=self.twilio_from_number)

        except TwilioRestException as e:
            raise EAException("Error posting to twilio: %s" % e)

        elastalert_logger.info("Trigger sent to Twilio")

    def get_info(self):
        return {'type': 'twilio',
                'twilio_client_name': self.twilio_from_number}

class VictorOpsAlerter(Alerter):
    """ Creates a VictorOps Incident for each alert """
    required_options = frozenset(['victorops_api_key', 'victorops_routing_key', 'victorops_message_type'])

    def __init__(self, rule):
        super(VictorOpsAlerter, self).__init__(rule)
        self.victorops_api_key = self.rule['victorops_api_key']
        self.victorops_routing_key = self.rule['victorops_routing_key']
        self.victorops_message_type = self.rule['victorops_message_type']
        self.victorops_entity_id = self.rule.get('victorops_entity_id', None)
        self.victorops_entity_display_name = self.rule.get('victorops_entity_display_name', 'no entity display name')
        self.url = 'https://alert.victorops.com/integrations/generic/20131114/alert/%s/%s' % (
            self.victorops_api_key, self.victorops_routing_key)
        self.victorops_proxy = self.rule.get('victorops_proxy', None)

    def alert(self, matches):
        body = self.create_alert_body(matches)

        # post to victorops
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.victorops_proxy} if self.victorops_proxy else None
        payload = {
            "message_type": self.victorops_message_type,
            "entity_display_name": self.victorops_entity_display_name,
            "monitoring_tool": "ElastAlert",
            "state_message": body
        }
        if self.victorops_entity_id:
            payload["entity_id"] = self.victorops_entity_id

        try:
            response = requests.post(self.url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to VictorOps: %s" % e)
        elastalert_logger.info("Trigger sent to VictorOps")

    def get_info(self):
        return {'type': 'victorops',
                'victorops_routing_key': self.victorops_routing_key}

class TelegramAlerter(Alerter):
    """ Send a Telegram message via bot api for each alert """
    required_options = frozenset(['telegram_bot_token', 'telegram_room_id'])

    def __init__(self, rule):
        super(TelegramAlerter, self).__init__(rule)
        self.telegram_bot_token = self.rule['telegram_bot_token']
        self.telegram_room_id = self.rule['telegram_room_id']
        self.telegram_api_url = self.rule.get('telegram_api_url', 'api.telegram.org')
        self.url = 'https://%s/bot%s/%s' % (self.telegram_api_url, self.telegram_bot_token, "sendMessage")
        self.telegram_proxy = self.rule.get('telegram_proxy', None)
        self.telegram_proxy_login = self.rule.get('telegram_proxy_login', None)
        self.telegram_proxy_password = self.rule.get('telegram_proxy_pass', None)

    def alert(self, matches):
        body = '⚠ *%s* ⚠ ```\n' % (self.create_title(matches))
        for match in matches:
            body += str(BasicMatchString(self.rule, match))
            # Separate text of aggregated alerts with dashes
            if len(matches) > 1:
                body += '\n----------------------------------------\n'
        if len(body) > 4095:
            body = body[0:4000] + "\n⚠ *message was cropped according to telegram limits!* ⚠"
        body += ' ```'

        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.telegram_proxy} if self.telegram_proxy else None
        auth = HTTPProxyAuth(self.telegram_proxy_login, self.telegram_proxy_password) if self.telegram_proxy_login else None
        payload = {
            'chat_id': self.telegram_room_id,
            'text': body,
            'parse_mode': 'markdown',
            'disable_web_page_preview': True
        }

        try:
            response = requests.post(self.url, data=json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies, auth=auth)
            warnings.resetwarnings()
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to Telegram: %s. Details: %s" % (e, "" if e.response is None else e.response.text))

        elastalert_logger.info(
            "Alert sent to Telegram room %s" % self.telegram_room_id)

    def get_info(self):
        return {'type': 'telegram',
                'telegram_room_id': self.telegram_room_id}

class GoogleChatAlerter(Alerter):
    """ Send a notification via Google Chat webhooks """
    required_options = frozenset(['googlechat_webhook_url'])

    def __init__(self, rule):
        super(GoogleChatAlerter, self).__init__(rule)
        self.googlechat_webhook_url = self.rule['googlechat_webhook_url']
        if isinstance(self.googlechat_webhook_url, str):
            self.googlechat_webhook_url = [self.googlechat_webhook_url]
        self.googlechat_format = self.rule.get('googlechat_format', 'basic')
        self.googlechat_header_title = self.rule.get('googlechat_header_title', None)
        self.googlechat_header_subtitle = self.rule.get('googlechat_header_subtitle', None)
        self.googlechat_header_image = self.rule.get('googlechat_header_image', None)
        self.googlechat_footer_kibanalink = self.rule.get('googlechat_footer_kibanalink', None)

    def create_header(self):
        header = None
        if self.googlechat_header_title:
            header = {
                "title": self.googlechat_header_title,
                "subtitle": self.googlechat_header_subtitle,
                "imageUrl": self.googlechat_header_image
            }
        return header

    def create_footer(self):
        footer = None
        if self.googlechat_footer_kibanalink:
            footer = {"widgets": [{
                "buttons": [{
                    "textButton": {
                        "text": "VISIT KIBANA",
                        "onClick": {
                            "openLink": {
                                "url": self.googlechat_footer_kibanalink
                            }
                        }
                    }
                }]
            }]
            }
        return footer

    def create_card(self, matches):
        card = {"cards": [{
            "sections": [{
                "widgets": [
                    {"textParagraph": {"text": self.create_alert_body(matches)}}
                ]}
            ]}
        ]}

        # Add the optional header
        header = self.create_header()
        if header:
            card['cards'][0]['header'] = header

        # Add the optional footer
        footer = self.create_footer()
        if footer:
            card['cards'][0]['sections'].append(footer)
        return card

    def create_basic(self, matches):
        body = self.create_alert_body(matches)
        return {'text': body}

    def alert(self, matches):
        # Format message
        if self.googlechat_format == 'card':
            message = self.create_card(matches)
        else:
            message = self.create_basic(matches)

        # Post to webhook
        headers = {'content-type': 'application/json'}
        for url in self.googlechat_webhook_url:
            try:
                response = requests.post(url, data=json.dumps(message), headers=headers)
                response.raise_for_status()
            except RequestException as e:
                raise EAException("Error posting to google chat: {}".format(e))
        elastalert_logger.info("Alert sent to Google Chat!")

    def get_info(self):
        return {'type': 'googlechat',
                'googlechat_webhook_url': self.googlechat_webhook_url}

class GitterAlerter(Alerter):
    """ Creates a Gitter activity message for each alert """
    required_options = frozenset(['gitter_webhook_url'])

    def __init__(self, rule):
        super(GitterAlerter, self).__init__(rule)
        self.gitter_webhook_url = self.rule['gitter_webhook_url']
        self.gitter_proxy = self.rule.get('gitter_proxy', None)
        self.gitter_msg_level = self.rule.get('gitter_msg_level', 'error')

    def alert(self, matches):
        body = self.create_alert_body(matches)

        # post to Gitter
        headers = {'content-type': 'application/json'}
        # set https proxy, if it was provided
        proxies = {'https': self.gitter_proxy} if self.gitter_proxy else None
        payload = {
            'message': body,
            'level': self.gitter_msg_level
        }

        try:
            response = requests.post(self.gitter_webhook_url, json.dumps(payload, cls=DateTimeEncoder), headers=headers, proxies=proxies)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to Gitter: %s" % e)
        elastalert_logger.info("Alert sent to Gitter")

    def get_info(self):
        return {'type': 'gitter',
                'gitter_webhook_url': self.gitter_webhook_url}

class ServiceNowAlerter(Alerter):
    """ Creates a ServiceNow alert """
    required_options = set([
        'username',
        'password',
        'servicenow_rest_url',
        'short_description',
        'comments',
        'assignment_group',
        'category',
        'subcategory',
        'cmdb_ci',
        'caller_id'
    ])

    def __init__(self, rule):
        super(ServiceNowAlerter, self).__init__(rule)
        self.servicenow_rest_url = self.rule['servicenow_rest_url']
        self.servicenow_proxy = self.rule.get('servicenow_proxy', None)

    def alert(self, matches):
        for match in matches:
            # Parse everything into description.
            description = str(BasicMatchString(self.rule, match))

        # Set proper headers
        headers = {
            "Content-Type": "application/json",
            "Accept": "application/json;charset=utf-8"
        }
        proxies = {'https': self.servicenow_proxy} if self.servicenow_proxy else None
        payload = {
            "description": description,
            "short_description": self.rule['short_description'],
            "comments": self.rule['comments'],
            "assignment_group": self.rule['assignment_group'],
            "category": self.rule['category'],
            "subcategory": self.rule['subcategory'],
            "cmdb_ci": self.rule['cmdb_ci'],
            "caller_id": self.rule["caller_id"]
        }
        try:
            response = requests.post(
                self.servicenow_rest_url,
                auth=(self.rule['username'], self.rule['password']),
                headers=headers,
                data=json.dumps(payload, cls=DateTimeEncoder),
                proxies=proxies
            )
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to ServiceNow: %s" % e)
        elastalert_logger.info("Alert sent to ServiceNow")

    def get_info(self):
        return {'type': 'ServiceNow',
                'self.servicenow_rest_url': self.servicenow_rest_url}

class AlertaAlerter(Alerter):
    """ Creates an Alerta event for each alert """
    required_options = frozenset(['alerta_api_url'])

    def __init__(self, rule):
        super(AlertaAlerter, self).__init__(rule)

        # Set up default parameters
        self.url = self.rule.get('alerta_api_url', None)
        self.api_key = self.rule.get('alerta_api_key', None)
        self.timeout = self.rule.get('alerta_timeout', 86400)
        self.use_match_timestamp = self.rule.get('alerta_use_match_timestamp', False)
        self.use_qk_as_resource = self.rule.get('alerta_use_qk_as_resource', False)
        self.verify_ssl = not self.rule.get('alerta_api_skip_ssl', False)
        self.missing_text = self.rule.get('alert_missing_value', '<MISSING VALUE>')

        # Fill in default values for the API JSON payload
        self.severity = self.rule.get('alerta_severity', 'warning')
        self.resource = self.rule.get('alerta_resource', 'elastalert')
        self.environment = self.rule.get('alerta_environment', 'Production')
        self.origin = self.rule.get('alerta_origin', 'elastalert')
        self.service = self.rule.get('alerta_service', ['elastalert'])
        self.text = self.rule.get('alerta_text', 'elastalert')
        self.type = self.rule.get('alerta_type', 'elastalert')
        self.event = self.rule.get('alerta_event', 'elastalert')
        self.correlate = self.rule.get('alerta_correlate', [])
        self.tags = self.rule.get('alerta_tags', [])
        self.group = self.rule.get('alerta_group', '')
        self.attributes_keys = self.rule.get('alerta_attributes_keys', [])
        self.attributes_values = self.rule.get('alerta_attributes_values', [])
        self.value = self.rule.get('alerta_value', '')

    def alert(self, matches):
        # Override the resource if requested
        if self.use_qk_as_resource and 'query_key' in self.rule and lookup_es_key(matches[0], self.rule['query_key']):
            self.resource = lookup_es_key(matches[0], self.rule['query_key'])

        headers = {'content-type': 'application/json'}
        if self.api_key is not None:
            headers['Authorization'] = 'Key %s' % (self.rule['alerta_api_key'])
        alerta_payload = self.get_json_payload(matches[0])

        try:
            response = requests.post(self.url, data=alerta_payload, headers=headers, verify=self.verify_ssl)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to Alerta: %s" % e)
        elastalert_logger.info("Alert sent to Alerta")

    def create_default_title(self, matches):
        title = '%s' % (self.rule['name'])
        # If the rule has a query_key, add that value
        if 'query_key' in self.rule:
            qk = matches[0].get(self.rule['query_key'])
            if qk:
                title += '.%s' % (qk)
        return title

    def get_info(self):
        return {'type': 'alerta',
                'alerta_url': self.url}

    def get_json_payload(self, match):
        """
            Builds the API Create Alert body, as in
            http://alerta.readthedocs.io/en/latest/api/reference.html#create-an-alert

            For the values that could have references to fields on the match, resolve those references.

        """

        # Using default text and event title if not defined in rule
        alerta_text = self.rule['type'].get_match_str([match]) if self.text == '' else resolve_string(self.text, match, self.missing_text)
        alerta_event = self.create_default_title([match]) if self.event == '' else resolve_string(self.event, match, self.missing_text)

        match_timestamp = lookup_es_key(match, self.rule.get('timestamp_field', '@timestamp'))
        if match_timestamp is None:
            match_timestamp = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S.%fZ")
        if self.use_match_timestamp:
            createTime = ts_to_dt(match_timestamp).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
        else:
            createTime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S.%fZ")

        alerta_payload_dict = {
            'resource': resolve_string(self.resource, match, self.missing_text),
            'severity': self.severity,
            'timeout': self.timeout,
            'createTime': createTime,
            'type': self.type,
            'environment': resolve_string(self.environment, match, self.missing_text),
            'origin': resolve_string(self.origin, match, self.missing_text),
            'group': resolve_string(self.group, match, self.missing_text),
            'event': alerta_event,
            'text': alerta_text,
            'value': resolve_string(self.value, match, self.missing_text),
            'service': [resolve_string(a_service, match, self.missing_text) for a_service in self.service],
            'tags': [resolve_string(a_tag, match, self.missing_text) for a_tag in self.tags],
            'correlate': [resolve_string(an_event, match, self.missing_text) for an_event in self.correlate],
            'attributes': dict(list(zip(self.attributes_keys,
                                        [resolve_string(a_value, match, self.missing_text) for a_value in self.attributes_values]))),
            'rawData': self.create_alert_body([match]),
        }

        try:
            payload = json.dumps(alerta_payload_dict, cls=DateTimeEncoder)
        except Exception as e:
            raise Exception("Error building Alerta request: %s" % e)
        return payload
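
# Hedged illustration, not part of ElastAlert: how alerta_attributes_keys and
# alerta_attributes_values are combined into the 'attributes' object of the
# payload built above.  The _sketch_* name is made up; the real code also
# resolves match-field references in each value.
def _sketch_alerta_attributes(keys, values):
    return dict(zip(keys, values))

# _sketch_alerta_attributes(['region', 'team'], ['eu-west-1', 'sre'])
# -> {'region': 'eu-west-1', 'team': 'sre'}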

class HTTPPostAlerter(Alerter):
    """ Requested elasticsearch indices are sent by HTTP POST. Encoded with JSON. """

    def __init__(self, rule):
        super(HTTPPostAlerter, self).__init__(rule)
        post_url = self.rule.get('http_post_url')
        if isinstance(post_url, str):
            post_url = [post_url]
        self.post_url = post_url
        self.post_proxy = self.rule.get('http_post_proxy')
        self.post_payload = self.rule.get('http_post_payload', {})
        self.post_static_payload = self.rule.get('http_post_static_payload', {})
        self.post_all_values = self.rule.get('http_post_all_values', not self.post_payload)
        self.post_http_headers = self.rule.get('http_post_headers', {})
        self.timeout = self.rule.get('http_post_timeout', 10)

    def alert(self, matches):
        """ Each match will trigger a POST to the specified endpoint(s). """
        for match in matches:
            payload = match if self.post_all_values else {}
            payload.update(self.post_static_payload)
            for post_key, es_key in list(self.post_payload.items()):
                payload[post_key] = lookup_es_key(match, es_key)
            headers = {
                "Content-Type": "application/json",
                "Accept": "application/json;charset=utf-8"
            }
            headers.update(self.post_http_headers)
            proxies = {'https': self.post_proxy} if self.post_proxy else None
            for url in self.post_url:
                try:
                    response = requests.post(url, data=json.dumps(payload, cls=DateTimeEncoder),
                                             headers=headers, proxies=proxies, timeout=self.timeout)
                    response.raise_for_status()
                except RequestException as e:
                    raise EAException("Error posting HTTP Post alert: %s" % e)
            elastalert_logger.info("HTTP Post alert sent.")

    def get_info(self):
        return {'type': 'http_post',
                'http_post_webhook_url': self.post_url}
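
# Hedged illustration, not part of ElastAlert: how the POST payload above is
# assembled for a single match.  With http_post_all_values enabled the whole
# match is sent, http_post_static_payload is merged on top, and
# http_post_payload maps chosen body keys to fields on the match.  The
# _sketch_* name is made up; the real code uses lookup_es_key().
def _sketch_http_post_payload(match, static_payload, payload_map, post_all_values=True):
    payload = dict(match) if post_all_values else {}
    payload.update(static_payload)
    for post_key, es_key in payload_map.items():
        payload[post_key] = match.get(es_key)
    return payload

# _sketch_http_post_payload({'hostname': 'web-01', 'level': 'error'},
#                           {'source': 'elastalert'}, {'host': 'hostname'})
# -> {'hostname': 'web-01', 'level': 'error', 'source': 'elastalert', 'host': 'web-01'}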

class StrideHTMLParser(HTMLParser):
    """Parse html into stride's fabric structure"""

    def __init__(self):
        """
        Define a couple of markup placeholders.
        """
        self.content = []
        self.mark = None
        HTMLParser.__init__(self)

    def handle_starttag(self, tag, attrs):
        """Identify and verify starting tag is fabric compatible."""
        if tag == 'b' or tag == 'strong':
            self.mark = dict(type='strong')
        if tag == 'u':
            self.mark = dict(type='underline')
        if tag == 'a':
            self.mark = dict(type='link', attrs=dict(attrs))

    def handle_endtag(self, tag):
        """Clear mark on endtag."""
        self.mark = None

    def handle_data(self, data):
        """Construct data node for our data."""
        node = dict(type='text', text=data)
        if self.mark:
            node['marks'] = [self.mark]
        self.content.append(node)
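
# Hedged illustration, not part of ElastAlert: feeding a small HTML fragment
# through StrideHTMLParser yields the Atlassian document nodes used in the
# Stride payload below.  The _sketch_* name is made up for this example.
def _sketch_stride_parse(html_body):
    parser = StrideHTMLParser()
    parser.feed(html_body)
    return parser.content

# _sketch_stride_parse('<b>host</b> down')
# -> [{'type': 'text', 'text': 'host', 'marks': [{'type': 'strong'}]},
#     {'type': 'text', 'text': ' down'}]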

class StrideAlerter(Alerter):
    """ Creates a Stride conversation message for each alert """

    required_options = frozenset(
        ['stride_access_token', 'stride_cloud_id', 'stride_conversation_id'])

    def __init__(self, rule):
        super(StrideAlerter, self).__init__(rule)

        self.stride_access_token = self.rule['stride_access_token']
        self.stride_cloud_id = self.rule['stride_cloud_id']
        self.stride_conversation_id = self.rule['stride_conversation_id']
        self.stride_ignore_ssl_errors = self.rule.get('stride_ignore_ssl_errors', False)
        self.stride_proxy = self.rule.get('stride_proxy', None)
        self.url = 'https://api.atlassian.com/site/%s/conversation/%s/message' % (
            self.stride_cloud_id, self.stride_conversation_id)

    def alert(self, matches):
        body = self.create_alert_body(matches).strip()

        # parse body with StrideHTMLParser
        parser = StrideHTMLParser()
        parser.feed(body)

        # Post to Stride
        headers = {
            'content-type': 'application/json',
            'Authorization': 'Bearer {}'.format(self.stride_access_token)
        }

        # set https proxy, if it was provided
        proxies = {'https': self.stride_proxy} if self.stride_proxy else None

        # build stride json payload
        # https://developer.atlassian.com/cloud/stride/apis/document/structure/
        payload = {'body': {'version': 1, 'type': "doc", 'content': [
            {'type': "panel", 'attrs': {'panelType': "warning"}, 'content': [
                {'type': 'paragraph', 'content': parser.content}
            ]}
        ]}}

        try:
            if self.stride_ignore_ssl_errors:
                requests.packages.urllib3.disable_warnings()
            response = requests.post(
                self.url, data=json.dumps(payload, cls=DateTimeEncoder),
                headers=headers, verify=not self.stride_ignore_ssl_errors,
                proxies=proxies)
            warnings.resetwarnings()
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to Stride: %s" % e)
        elastalert_logger.info(
            "Alert sent to Stride conversation %s" % self.stride_conversation_id)

    def get_info(self):
        return {'type': 'stride',
                'stride_cloud_id': self.stride_cloud_id,
                'stride_conversation_id': self.stride_conversation_id}

class LineNotifyAlerter(Alerter):
    """ Created a Line Notify for each alert """
    required_option = frozenset(["linenotify_access_token"])

    def __init__(self, rule):
        super(LineNotifyAlerter, self).__init__(rule)
        self.linenotify_access_token = self.rule["linenotify_access_token"]

    def alert(self, matches):
        body = self.create_alert_body(matches)
        # post to Line Notify
        headers = {
            "Content-Type": "application/x-www-form-urlencoded",
            "Authorization": "Bearer {}".format(self.linenotify_access_token)
        }
        payload = {
            "message": body
        }
        try:
            response = requests.post("https://notify-api.line.me/api/notify", data=payload, headers=headers)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error posting to Line Notify: %s" % e)
        elastalert_logger.info("Alert sent to Line Notify")

    def get_info(self):
        return {"type": "linenotify", "linenotify_access_token": self.linenotify_access_token}

class HiveAlerter(Alerter):
    """
    Use matched data to create alerts containing observables in an instance of TheHive
    """

    required_options = set(['hive_connection', 'hive_alert_config'])

    def alert(self, matches):

        connection_details = self.rule['hive_connection']

        for match in matches:
            context = {'rule': self.rule, 'match': match}

            artifacts = []
            for mapping in self.rule.get('hive_observable_data_mapping', []):
                for observable_type, match_data_key in mapping.items():
                    try:
                        match_data_keys = re.findall(r'\{match\[([^\]]*)\]', match_data_key)
                        rule_data_keys = re.findall(r'\{rule\[([^\]]*)\]', match_data_key)
                        data_keys = match_data_keys + rule_data_keys
                        context_keys = list(context['match'].keys()) + list(context['rule'].keys())
                        if all([True if k in context_keys else False for k in data_keys]):
                            artifact = {'tlp': 2, 'tags': [], 'message': None, 'dataType': observable_type,
                                        'data': match_data_key.format(**context)}
                            artifacts.append(artifact)
                    except KeyError:
                        raise KeyError('\nformat string\n{}\nmatch data\n{}'.format(match_data_key, context))

            alert_config = {
                'artifacts': artifacts,
                'sourceRef': str(uuid.uuid4())[0:6],
                'customFields': {},
                'caseTemplate': None,
                'title': '{rule[index]}_{rule[name]}'.format(**context),
                'date': int(time.time()) * 1000
            }
            alert_config.update(self.rule.get('hive_alert_config', {}))
            custom_fields = {}
            for alert_config_field, alert_config_value in alert_config.items():
                if alert_config_field == 'customFields':
                    n = 0
                    for cf_key, cf_value in alert_config_value.items():
                        cf = {'order': n, cf_value['type']: cf_value['value'].format(**context)}
                        n += 1
                        custom_fields[cf_key] = cf
                elif isinstance(alert_config_value, str):
                    alert_config[alert_config_field] = alert_config_value.format(**context)
                elif isinstance(alert_config_value, (list, tuple)):
                    formatted_list = []
                    for element in alert_config_value:
                        try:
                            formatted_list.append(element.format(**context))
                        except (AttributeError, KeyError, IndexError):
                            formatted_list.append(element)
                    alert_config[alert_config_field] = formatted_list
            if custom_fields:
                alert_config['customFields'] = custom_fields

            alert_body = json.dumps(alert_config, indent=4, sort_keys=True)
            req = '{}:{}/api/alert'.format(connection_details['hive_host'], connection_details['hive_port'])
            headers = {'Content-Type': 'application/json', 'Authorization': 'Bearer {}'.format(connection_details.get('hive_apikey', ''))}
            proxies = connection_details.get('hive_proxies', {'http': '', 'https': ''})
            verify = connection_details.get('hive_verify', False)
            response = requests.post(req, headers=headers, data=alert_body, proxies=proxies, verify=verify)

            if response.status_code != 201:
                raise Exception('alert not successfully created in TheHive\n{}'.format(response.text))

    def get_info(self):

        return {
            'type': 'hivealerter',
            'hive_host': self.rule.get('hive_connection', {}).get('hive_host', '')
        }

class DingTalkAlerter(Alerter):

    required_options = frozenset(['dingtalk_webhook', 'dingtalk_msgtype'])

    def __init__(self, rule):
        super(DingTalkAlerter, self).__init__(rule)
        self.dingtalk_webhook_url = self.rule['dingtalk_webhook']
        self.dingtalk_msgtype = self.rule.get('dingtalk_msgtype', 'text')
        self.dingtalk_isAtAll = self.rule.get('dingtalk_isAtAll', False)
        self.dingtalk_title = self.rule.get('dingtalk_title', '')

    def format_body(self, body):
        return body.encode('utf8')

    def alert(self, matches):
        headers = {
            "Content-Type": "application/json",
            "Accept": "application/json;charset=utf-8"
        }
        body = self.create_alert_body(matches)
        payload = {
            "msgtype": self.dingtalk_msgtype,
            "text": {
                "content": body
            },
            "at": {
                "isAtAll":False
            }
        }
        try:
            response = requests.post(self.dingtalk_webhook_url, 
                        data=json.dumps(payload, cls=DateTimeEncoder),
                        headers=headers)
            response.raise_for_status()
        except RequestException as e:
            raise EAException("Error request to Dingtalk: {0}".format(str(e)))

    def get_info(self):
        return {
            "type": "dingtalk",
            "dingtalk_webhook": self.dingtalk_webhook_url
        }
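
Once the patched files are in place, a rule file along these lines should exercise the new alerter. This is an untested sketch: the index, filter, num_events and the webhook token are placeholders, and only the dingtalk_* keys are taken from the DingTalkAlerter class above.

# untested sketch - adjust name, index, filter and thresholds to your data
name: error-spike-dingtalk
type: frequency                 # FrequencyRule from rules_mapping
index: filebeat-*               # placeholder index pattern
num_events: 5
timeframe:
  minutes: 5
filter:
- query:
    query_string:
      query: "log.level: error"   # placeholder filter
alert:
- dingtalk                      # key added to alerts_mapping in loaders.py below
dingtalk_webhook: "https://oapi.dingtalk.com/robot/send?access_token=REPLACE_ME"
dingtalk_msgtype: "text"        # the patched alerter only builds a text payload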

./elastalert/patch/loaders.py

Add 'dingtalk': alerts.DingTalkAlerter to the alerts_mapping dict (see the full file below):

# -*- coding: utf-8 -*-
import copy
import datetime
import hashlib
import logging
import os
import sys

import jsonschema
import yaml
import yaml.scanner
from staticconf.loader import yaml_loader

from . import alerts
from . import enhancements
from . import ruletypes
from .opsgenie import OpsGenieAlerter
from .util import dt_to_ts
from .util import dt_to_ts_with_format
from .util import dt_to_unix
from .util import dt_to_unixms
from .util import EAException
from .util import get_module
from .util import ts_to_dt
from .util import ts_to_dt_with_format
from .util import unix_to_dt
from .util import unixms_to_dt
from .zabbix import ZabbixAlerter

class RulesLoader(object):
    # import rule dependency
    import_rules = {}

    # Required global (config.yaml) configuration options for the loader
    required_globals = frozenset([])

    # Required local (rule.yaml) configuration options
    required_locals = frozenset(['alert', 'type', 'name', 'index'])

    # Used to map the names of rules to their classes
    rules_mapping = {
        'frequency': ruletypes.FrequencyRule,
        'any': ruletypes.AnyRule,
        'spike': ruletypes.SpikeRule,
        'blacklist': ruletypes.BlacklistRule,
        'whitelist': ruletypes.WhitelistRule,
        'change': ruletypes.ChangeRule,
        'flatline': ruletypes.FlatlineRule,
        'new_term': ruletypes.NewTermsRule,
        'cardinality': ruletypes.CardinalityRule,
        'metric_aggregation': ruletypes.MetricAggregationRule,
        'percentage_match': ruletypes.PercentageMatchRule,
        'spike_aggregation': ruletypes.SpikeMetricAggregationRule,
    }

    # Used to map names of alerts to their classes
    alerts_mapping = {
        'email': alerts.EmailAlerter,
        'jira': alerts.JiraAlerter,
        'opsgenie': OpsGenieAlerter,
        'stomp': alerts.StompAlerter,
        'debug': alerts.DebugAlerter,
        'command': alerts.CommandAlerter,
        'sns': alerts.SnsAlerter,
        'hipchat': alerts.HipChatAlerter,
        'stride': alerts.StrideAlerter,
        'ms_teams': alerts.MsTeamsAlerter,
        'slack': alerts.SlackAlerter,
        'mattermost': alerts.MattermostAlerter,
        'pagerduty': alerts.PagerDutyAlerter,
        'exotel': alerts.ExotelAlerter,
        'twilio': alerts.TwilioAlerter,
        'victorops': alerts.VictorOpsAlerter,
        'telegram': alerts.TelegramAlerter,
        'googlechat': alerts.GoogleChatAlerter,
        'gitter': alerts.GitterAlerter,
        'servicenow': alerts.ServiceNowAlerter,
        'alerta': alerts.AlertaAlerter,
        'post': alerts.HTTPPostAlerter,
        'hivealerter': alerts.HiveAlerter,
        'linenotify': alerts.LineNotifyAlerter,
        'zabbix': ZabbixAlerter,
        'dingtalk': alerts.DingTalkAlerter
    }

    # A partial ordering of alert types. Relative order will be preserved in the resulting alerts list
    # For example, jira goes before email so the ticket # will be added to the resulting email.
    alerts_order = {
        'jira': 0,
        'email': 1
    }

    base_config = {}

    def __init__(self, conf):
        # schema for rule yaml
        self.rule_schema = jsonschema.Draft7Validator(
            yaml.load(open(os.path.join(os.path.dirname(__file__), 'schema.yaml')), Loader=yaml.FullLoader))

        self.base_config = copy.deepcopy(conf)

    def load(self, conf, args=None):
        """
        Discover and load all the rules as defined in the conf and args.
        :param dict conf: Configuration dict
        :param dict args: Arguments dict
        :return: List of rules
        :rtype: list
        """
        names = []
        use_rule = None if args is None else args.rule

        # Load each rule configuration file
        rules = []
        rule_files = self.get_names(conf, use_rule)
        for rule_file in rule_files:
            try:
                rule = self.load_configuration(rule_file, conf, args)
                # A rule failed to load, don't try to process it
                if not rule:
                    logging.error('Invalid rule file skipped: %s' % rule_file)
                    continue
                # By setting "is_enabled: False" in rule file, a rule is easily disabled
                if 'is_enabled' in rule and not rule['is_enabled']:
                    continue
                if rule['name'] in names:
                    raise EAException('Duplicate rule named %s' % (rule['name']))
            except EAException as e:
                raise EAException('Error loading file %s: %s' % (rule_file, e))

            rules.append(rule)
            names.append(rule['name'])

        return rules

    def get_names(self, conf, use_rule=None):
        """
        Return a list of rule names that can be passed to `get_yaml` to retrieve.
        :param dict conf: Configuration dict
        :param str use_rule: Limit to only specified rule
        :return: A list of rule names
        :rtype: list
        """
        raise NotImplementedError()

    def get_hashes(self, conf, use_rule=None):
        """
        Discover and get the hashes of all the rules as defined in the conf.
        :param dict conf: Configuration
        :param str use_rule: Limit to only specified rule
        :return: Dict of rule name to hash
        :rtype: dict
        """
        raise NotImplementedError()

    def get_yaml(self, filename):
        """
        Get and parse the yaml of the specified rule.
        :param str filename: Rule to get the yaml
        :return: Rule YAML dict
        :rtype: dict
        """
        raise NotImplementedError()

    def get_import_rule(self, rule):
        """
        Retrieve the name of the rule to import.
        :param dict rule: Rule dict
        :return: rule name that can be passed to `get_yaml` to retrieve the yaml of the rule
        :rtype: str
        """
        return rule['import']

    def load_configuration(self, filename, conf, args=None):
        """ Load a yaml rule file and fill in the relevant fields with objects.

        :param str filename: The name of a rule configuration file.
        :param dict conf: The global configuration dictionary, used for populating defaults.
        :param dict args: Arguments
        :return: The rule configuration, a dictionary.
        """
        rule = self.load_yaml(filename)
        self.load_options(rule, conf, filename, args)
        self.load_modules(rule, args)
        return rule

    def load_yaml(self, filename):
        """
        Load the rule including all dependency rules.
        :param str filename: Rule to load
        :return: Loaded rule dict
        :rtype: dict
        """
        rule = {
            'rule_file': filename,
        }

        self.import_rules.pop(filename, None)  # clear `filename` dependency
        while True:
            loaded = self.get_yaml(filename)

            # Special case for merging filters - if both files specify a filter merge (AND) them
            if 'filter' in rule and 'filter' in loaded:
                rule['filter'] = loaded['filter'] + rule['filter']

            loaded.update(rule)
            rule = loaded
            if 'import' in rule:
                # Find the path of the next file.
                import_filename = self.get_import_rule(rule)
                # set dependencies
                rules = self.import_rules.get(filename, [])
                rules.append(import_filename)
                self.import_rules[filename] = rules
                filename = import_filename
                del (rule['import'])  # or we could go on forever!
            else:
                break

        return rule

    def load_options(self, rule, conf, filename, args=None):
        """ Converts time objects, sets defaults, and validates some settings.

        :param rule: A dictionary of parsed YAML from a rule config file.
        :param conf: The global configuration dictionary, used for populating defaults.
        :param filename: Name of the rule
        :param args: Arguments
        """
        self.adjust_deprecated_values(rule)

        try:
            self.rule_schema.validate(rule)
        except jsonschema.ValidationError as e:
            raise EAException("Invalid Rule file: %s\n%s" % (filename, e))

        try:
            # Set all time based parameters
            if 'timeframe' in rule:
                rule['timeframe'] = datetime.timedelta(**rule['timeframe'])
            if 'realert' in rule:
                rule['realert'] = datetime.timedelta(**rule['realert'])
            else:
                if 'aggregation' in rule:
                    rule['realert'] = datetime.timedelta(minutes=0)
                else:
                    rule['realert'] = datetime.timedelta(minutes=1)
            if 'aggregation' in rule and not rule['aggregation'].get('schedule'):
                rule['aggregation'] = datetime.timedelta(**rule['aggregation'])
            if 'query_delay' in rule:
                rule['query_delay'] = datetime.timedelta(**rule['query_delay'])
            if 'buffer_time' in rule:
                rule['buffer_time'] = datetime.timedelta(**rule['buffer_time'])
            if 'run_every' in rule:
                rule['run_every'] = datetime.timedelta(**rule['run_every'])
            if 'bucket_interval' in rule:
                rule['bucket_interval_timedelta'] = datetime.timedelta(**rule['bucket_interval'])
            if 'exponential_realert' in rule:
                rule['exponential_realert'] = datetime.timedelta(**rule['exponential_realert'])
            if 'kibana4_start_timedelta' in rule:
                rule['kibana4_start_timedelta'] = datetime.timedelta(**rule['kibana4_start_timedelta'])
            if 'kibana4_end_timedelta' in rule:
                rule['kibana4_end_timedelta'] = datetime.timedelta(**rule['kibana4_end_timedelta'])
            if 'kibana_discover_from_timedelta' in rule:
                rule['kibana_discover_from_timedelta'] = datetime.timedelta(**rule['kibana_discover_from_timedelta'])
            if 'kibana_discover_to_timedelta' in rule:
                rule['kibana_discover_to_timedelta'] = datetime.timedelta(**rule['kibana_discover_to_timedelta'])
        except (KeyError, TypeError) as e:
            raise EAException('Invalid time format used: %s' % e)

        # Set defaults, copy defaults from config.yaml
        for key, val in list(self.base_config.items()):
            rule.setdefault(key, val)
        rule.setdefault('name', os.path.splitext(filename)[0])
        rule.setdefault('realert', datetime.timedelta(seconds=0))
        rule.setdefault('aggregation', datetime.timedelta(seconds=0))
        rule.setdefault('query_delay', datetime.timedelta(seconds=0))
        rule.setdefault('timestamp_field', '@timestamp')
        rule.setdefault('filter', [])
        rule.setdefault('timestamp_type', 'iso')
        rule.setdefault('timestamp_format', '%Y-%m-%dT%H:%M:%SZ')
        rule.setdefault('_source_enabled', True)
        rule.setdefault('use_local_time', True)
        rule.setdefault('description', "")

        # Set timestamp_type conversion function, used when generating queries and processing hits
        rule['timestamp_type'] = rule['timestamp_type'].strip().lower()
        if rule['timestamp_type'] == 'iso':
            rule['ts_to_dt'] = ts_to_dt
            rule['dt_to_ts'] = dt_to_ts
        elif rule['timestamp_type'] == 'unix':
            rule['ts_to_dt'] = unix_to_dt
            rule['dt_to_ts'] = dt_to_unix
        elif rule['timestamp_type'] == 'unix_ms':
            rule['ts_to_dt'] = unixms_to_dt
            rule['dt_to_ts'] = dt_to_unixms
        elif rule['timestamp_type'] == 'custom':
            def _ts_to_dt_with_format(ts):
                return ts_to_dt_with_format(ts, ts_format=rule['timestamp_format'])

            def _dt_to_ts_with_format(dt):
                ts = dt_to_ts_with_format(dt, ts_format=rule['timestamp_format'])
                if 'timestamp_format_expr' in rule:
                    # eval expression passing 'ts' and 'dt'
                    return eval(rule['timestamp_format_expr'], {'ts': ts, 'dt': dt})
                else:
                    return ts

            rule['ts_to_dt'] = _ts_to_dt_with_format
            rule['dt_to_ts'] = _dt_to_ts_with_format
        else:
            raise EAException('timestamp_type must be one of iso, unix, or unix_ms')

        # Add support for client ssl certificate auth
        if 'verify_certs' in conf:
            rule.setdefault('verify_certs', conf.get('verify_certs'))
            rule.setdefault('ca_certs', conf.get('ca_certs'))
            rule.setdefault('client_cert', conf.get('client_cert'))
            rule.setdefault('client_key', conf.get('client_key'))

        # Set HipChat options from global config
        rule.setdefault('hipchat_msg_color', 'red')
        rule.setdefault('hipchat_domain', 'api.hipchat.com')
        rule.setdefault('hipchat_notify', True)
        rule.setdefault('hipchat_from', '')
        rule.setdefault('hipchat_ignore_ssl_errors', False)

        # Make sure we have required options
        if self.required_locals - frozenset(list(rule.keys())):
            raise EAException('Missing required option(s): %s' % (', '.join(self.required_locals - frozenset(list(rule.keys())))))

        if 'include' in rule and type(rule['include']) != list:
            raise EAException('include option must be a list')

        raw_query_key = rule.get('query_key')
        if isinstance(raw_query_key, list):
            if len(raw_query_key) > 1:
                rule['compound_query_key'] = raw_query_key
                rule['query_key'] = ','.join(raw_query_key)
            elif len(raw_query_key) == 1:
                rule['query_key'] = raw_query_key[0]
            else:
                del(rule['query_key'])

        if isinstance(rule.get('aggregation_key'), list):
            rule['compound_aggregation_key'] = rule['aggregation_key']
            rule['aggregation_key'] = ','.join(rule['aggregation_key'])

        if isinstance(rule.get('compare_key'), list):
            rule['compound_compare_key'] = rule['compare_key']
            rule['compare_key'] = ','.join(rule['compare_key'])
        elif 'compare_key' in rule:
            rule['compound_compare_key'] = [rule['compare_key']]
        # Add QK, CK and timestamp to include
        include = rule.get('include', ['*'])
        if 'query_key' in rule:
            include.append(rule['query_key'])
        if 'compound_query_key' in rule:
            include += rule['compound_query_key']
        if 'compound_aggregation_key' in rule:
            include += rule['compound_aggregation_key']
        if 'compare_key' in rule:
            include.append(rule['compare_key'])
        if 'compound_compare_key' in rule:
            include += rule['compound_compare_key']
        if 'top_count_keys' in rule:
            include += rule['top_count_keys']
        include.append(rule['timestamp_field'])
        rule['include'] = list(set(include))

        # Check that generate_kibana_url is compatible with the filters
        if rule.get('generate_kibana_link'):
            for es_filter in rule.get('filter'):
                if es_filter:
                    if 'not' in es_filter:
                        es_filter = es_filter['not']
                    if 'query' in es_filter:
                        es_filter = es_filter['query']
                    if list(es_filter.keys())[0] not in ('term', 'query_string', 'range'):
                        raise EAException(
                            'generate_kibana_link is incompatible with filters other than term, query_string and range. '
                            'Consider creating a dashboard and using use_kibana_dashboard instead.')

        # Check that doc_type is provided if use_count/terms_query
        if rule.get('use_count_query') or rule.get('use_terms_query'):
            if 'doc_type' not in rule:
                raise EAException('doc_type must be specified.')

        # Check that query_key is set if use_terms_query
        if rule.get('use_terms_query'):
            if 'query_key' not in rule:
                raise EAException('query_key must be specified with use_terms_query')

        # Warn if use_strf_index is used with %y, %M or %D
        # (%y = short year, %M = minutes, %D = full date)
        if rule.get('use_strftime_index'):
            for token in ['%y', '%M', '%D']:
                if token in rule.get('index'):
                    logging.warning('Did you mean to use %s in the index? '
                                    'The index will be formatted like %s' % (token,
                                                                             datetime.datetime.now().strftime(
                                                                                 rule.get('index'))))

        if rule.get('scan_entire_timeframe') and not rule.get('timeframe'):
            raise EAException('scan_entire_timeframe can only be used if there is a timeframe specified')

    def load_modules(self, rule, args=None):
        """ Loads things that could be modules. Enhancements, alerts and rule type. """
        # Set match enhancements
        match_enhancements = []
        for enhancement_name in rule.get('match_enhancements', []):
            if enhancement_name in dir(enhancements):
                enhancement = getattr(enhancements, enhancement_name)
            else:
                enhancement = get_module(enhancement_name)
            if not issubclass(enhancement, enhancements.BaseEnhancement):
                raise EAException("Enhancement module %s not a subclass of BaseEnhancement" % enhancement_name)
            match_enhancements.append(enhancement(rule))
        rule['match_enhancements'] = match_enhancements

        # Convert rule type into RuleType object
        if rule['type'] in self.rules_mapping:
            rule['type'] = self.rules_mapping[rule['type']]
        else:
            rule['type'] = get_module(rule['type'])
            if not issubclass(rule['type'], ruletypes.RuleType):
                raise EAException('Rule module %s is not a subclass of RuleType' % (rule['type']))

        # Make sure we have required alert and type options
        reqs = rule['type'].required_options

        if reqs - frozenset(list(rule.keys())):
            raise EAException('Missing required option(s): %s' % (', '.join(reqs - frozenset(list(rule.keys())))))
        # Instantiate rule
        try:
            rule['type'] = rule['type'](rule, args)
        except (KeyError, EAException) as e:
            raise EAException('Error initializing rule %s: %s' % (rule['name'], e)).with_traceback(sys.exc_info()[2])
        # Instantiate alerts only if we're not in debug mode
        # In debug mode alerts are not actually sent so don't bother instantiating them
        if not args or not args.debug:
            rule['alert'] = self.load_alerts(rule, alert_field=rule['alert'])

    def load_alerts(self, rule, alert_field):
        def normalize_config(alert):
            """Alert config entries are either "alertType" or {"alertType": {"key": "data"}}.
            This function normalizes them both to the latter format. """
            if isinstance(alert, str):
                return alert, rule
            elif isinstance(alert, dict):
                name, config = next(iter(list(alert.items())))
                config_copy = copy.copy(rule)
                config_copy.update(config)  # warning, this (intentionally) mutates the rule dict
                return name, config_copy
            else:
                raise EAException()

        def create_alert(alert, alert_config):
            alert_class = self.alerts_mapping.get(alert) or get_module(alert)
            if not issubclass(alert_class, alerts.Alerter):
                raise EAException('Alert module %s is not a subclass of Alerter' % alert)
            missing_options = (rule['type'].required_options | alert_class.required_options) - frozenset(
                alert_config or [])
            if missing_options:
                raise EAException('Missing required option(s): %s' % (', '.join(missing_options)))
            return alert_class(alert_config)

        try:
            if type(alert_field) != list:
                alert_field = [alert_field]

            alert_field = [normalize_config(x) for x in alert_field]
            alert_field = sorted(alert_field, key=lambda a_b: self.alerts_order.get(a_b[0], 1))
            # Convert all alerts into Alerter objects
            alert_field = [create_alert(a, b) for a, b in alert_field]

        except (KeyError, EAException) as e:
            raise EAException('Error initiating alert %s: %s' % (rule['alert'], e)).with_traceback(sys.exc_info()[2])

        return alert_field

    @staticmethod
    def adjust_deprecated_values(rule):
        # From rename of simple HTTP alerter
        if rule.get('type') == 'simple':
            rule['type'] = 'post'
            if 'simple_proxy' in rule:
                rule['http_post_proxy'] = rule['simple_proxy']
            if 'simple_webhook_url' in rule:
                rule['http_post_url'] = rule['simple_webhook_url']
            logging.warning(
                '"simple" alerter has been renamed "post" and comptability may be removed in a future release.')

class FileRulesLoader(RulesLoader):

    # Required global (config.yaml) configuration options for the loader
    required_globals = frozenset(['rules_folder'])

    def get_names(self, conf, use_rule=None):
        # Passing a filename directly can bypass rules_folder and .yaml checks
        if use_rule and os.path.isfile(use_rule):
            return [use_rule]
        rule_folder = conf['rules_folder']
        rule_files = []
        if 'scan_subdirectories' in conf and conf['scan_subdirectories']:
            for root, folders, files in os.walk(rule_folder):
                for filename in files:
                    if use_rule and use_rule != filename:
                        continue
                    if self.is_yaml(filename):
                        rule_files.append(os.path.join(root, filename))
        else:
            for filename in os.listdir(rule_folder):
                fullpath = os.path.join(rule_folder, filename)
                if os.path.isfile(fullpath) and self.is_yaml(filename):
                    rule_files.append(fullpath)
        return rule_files

    def get_hashes(self, conf, use_rule=None):
        rule_files = self.get_names(conf, use_rule)
        rule_mod_times = {}
        for rule_file in rule_files:
            rule_mod_times[rule_file] = self.get_rule_file_hash(rule_file)
        return rule_mod_times

    def get_yaml(self, filename):
        try:
            return yaml_loader(filename)
        except yaml.scanner.ScannerError as e:
            raise EAException('Could not parse file %s: %s' % (filename, e))

    def get_import_rule(self, rule):
        """
        Allow for relative paths to the import rule.
        :param dict rule:
        :return: Path to the import rule
        :rtype: str
        """
        if os.path.isabs(rule['import']):
            return rule['import']
        else:
            return os.path.join(os.path.dirname(rule['rule_file']), rule['import'])

    def get_rule_file_hash(self, rule_file):
        rule_file_hash = ''
        if os.path.exists(rule_file):
            with open(rule_file, 'rb') as fh:
                rule_file_hash = hashlib.sha1(fh.read()).digest()
            for import_rule_file in self.import_rules.get(rule_file, []):
                rule_file_hash += self.get_rule_file_hash(import_rule_file)
        return rule_file_hash

    @staticmethod
    def is_yaml(filename):
        return filename.endswith('.yaml') or filename.endswith('.yml')
nsano-rururu commented 4 years ago

@kakaNo1

I added the elastalert-dingtalk-plugin code to ElastAlert's alerts.py and loaders.py, mounted the patched files over the originals via docker-compose at startup, and started the ElastAlert Docker container. I confirmed that alerts reach DingTalk.
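
For anyone reproducing this, the essential part is just mounting the two patched files over the stock ones in the ElastAlert container. A minimal sketch of the volume overrides (the local file names are an assumption; the in-container paths match the praecoapp/elastalert-server image, as in the compose file posted below):

services:
  elastalert:
    image: 'praecoapp/elastalert-server'
    volumes:
      # patched copies containing the DingTalkAlerter and the loaders.py mapping entry
      - ./patched/alerts.py:/opt/elastalert/elastalert/alerts.py
      - ./patched/loaders.py:/opt/elastalert/elastalert/loaders.py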

kakaNo1 commented 4 years ago

Thank you for your contribution. I just tested it and it starts normally, but I can't find a DingTalk option under Destination. alerts.py and loaders.py are configured according to what you posted above, with only small differences. Yours: [screenshot]

Mine: [screenshot] Is it configured this way?

This is the page that comes up: [screenshot]

kakaNo1 commented 4 years ago

An error occurred while I was saving the rule:

elastalert_1  | 05:28:22.991Z ERROR elastalert-server:
elastalert_1  |     Routes:  Request for '/rules/:id' failed with error:
elastalert_1  |
elastalert_1  |      ReferenceError: reject is not defined

[screenshot of the error]

kakaNo1 commented 4 years ago

My docker-compose.yml:

version: '3'

services:
  elastalert:
    image: 'praecoapp/elastalert-server'
    ports:
      - 3030:3030
      - 3333:3333
    volumes:
      - ./config/elastalert.yaml:/opt/elastalert/config.yaml
      - ./config/api.config.json:/opt/elastalert-server/config/config.json
      - ./rules:/opt/elastalert/rules
      - ./rule_templates:/opt/elastalert/rule_templates
      - ./ding2.py:/opt/elastalert/elastalert/alerts.py
      - ./loaders2.py:/opt/elastalert/elastalert/loaders.py
    extra_hosts:
      - 'elasticsearch:172.16.3.188'

  webapp:
    image: 'praecoapp/praeco'
    ports:
      - 8080:8080
    volumes:
      - ./public/praeco.config.json:/var/www/html/praeco.config.json
      - ./nginx_config/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx_config/default.conf:/etc/nginx/conf.d/default.conf

ding2.py is your alerts.py, and loaders2.py is your loaders.py.

nsano-rururu commented 4 years ago

I added the elastalert-dingtalk-plugin code to ElastAlert's alerts.py and loaders.py, mounted the patched files over the originals via docker-compose at startup, and started the ElastAlert Docker container. I confirmed that alerts reach DingTalk.

I added the contents of elastalert-dingtalk-plugin as-is, so msgtype only supports text.

nsano-rururu commented 4 years ago

@kakaNo1

There is no more information I can provide, and I can't promise more even if I try my best.

nsano-rururu commented 4 years ago

Thank you for your contribution. I just tested it and it starts normally, but I can't find a DingTalk option under Destination. alerts.py and loaders.py are configured according to what you posted above, with only small differences. Yours: [screenshot]

Mine: [screenshot] Is it configured this way?

This is the page that comes up: [screenshot]

"-Slack" is unnecessary. It is a work mistake due to lack of sleep. sorry

nsano-rururu commented 4 years ago

I'm sleepy, so I'm going to sleep.

kakaNo1 commented 4 years ago

Thank you very much for your help. Please get some rest!

nsano-rururu commented 4 years ago

We will consider supporting DingTalk once it is incorporated into ElastAlert itself, so this issue will be closed for now.