nin9s / elk-hole

elasticsearch, logstash and kibana configuration for pi-hole visualization
MIT License

vis errors (elk 7.1.1 related?) #12

Closed auricom closed 5 years ago

auricom commented 5 years ago

I installed a brand new Elastic stack on Ubuntu 18.04 in order to use your elk-hole. First of all, thank you very much for sharing these resources.

I installed Filebeat, Logstash, Kibana v7.1.1.

Using the readme and some tutorials on the internet, I managed to populate indices with my pihole logs, but I encountered errors when viewing the majority of vis objects.

Some fields are not loaded correctly because they are not found in the index pattern.

Example : "DNS top domains - pihole" => I had to rename the field "domain_request" into "domain_request.keyword".

After that, the vis works and I can view it using the provided dashboard.

I would be happy to open a pull request to share my working 7.1.1 code; however, there is one vis that I cannot manage to correct.

"DNS top piholed domains - pihole" => Field "blocked_domain" is not found. I understand that this field is not aggregatable as the others *.keywords that I found for the others VIS.

How would it be possible to populate the Elasticsearch index with a new blocked_domain.keyword? I would think it should go in the logstash configuration, but I did not find how to do it.

Thanks

nin9s commented 5 years ago

It's not 7.1.1 related, as I have 7.1.1 running in my lab. I guess it's related to an incorrect mapping in your index template. Could you please verify that the template is loaded properly by going to the Dev Tools in Kibana: GET /_template/logstash-syslog-dns

Important: if the template is applied and it's still not working, note that you have to reindex (or delete) the present index, because index templates are only applied during creation of the index.
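
For example, in Kibana Dev Tools, deleting the existing index forces it to be recreated with the template mapping (the index name here is just an example; use whatever your actual index is called):

DELETE /logstash-syslog-dns-2019.06

Or, to keep the data, reindex into a fresh index whose name also matches the template pattern:

POST /_reindex
{
  "source": { "index": "logstash-syslog-dns-2019.06" },
  "dest": { "index": "logstash-syslog-dns-2019.06-reindexed" }
}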

aviationfan commented 5 years ago

I am having the same issue as auricom. Here is the output of the command in dev tools. What should I see there?

GET /_template/logstash-syslog-dns

  "logstash-syslog-dns" : {
    "order" : 0,
    "index_patterns" : [
      "logstash-syslog-dns*"
    ],
    "settings" : { },
    "mappings" : {
      "dynamic" : "true",
      "properties" : {
        "date" : {
          "format" : "MMM  d HH:mm:ss||MMM dd HH:mm:ss",
          "type" : "date"
        },
        "request_from" : {
          "type" : "ip"
        },
        "ip_response" : {
          "type" : "ip"
        },
        "source_host" : {
          "type" : "ip"
        },
        "blocked_domain" : {
          "type" : "text"
        },
        "source_port" : {
          "type" : "integer"
        },
        "ip_request" : {
          "type" : "ip"
        },
        "dns_forward_to" : {
          "type" : "ip",
          "fields" : {
            "keyword" : {
              "ignore_above" : 256,
              "type" : "keyword"
            }
          }
        },
        "pid" : {
          "type" : "integer"
        },
        "logrow" : {
          "type" : "integer"
        },
        "pihole" : {
          "type" : "ip"
        },
        "tags" : {
          "type" : "keyword",
          "fields" : {
            "keyword" : {
              "ignore_above" : 256,
              "type" : "keyword"
            }
          }
        }
      }
    },
    "aliases" : { }
  }
}
nin9s commented 5 years ago

It looks like the field is created as plain text only, instead of a multi-field mapping that would include both text and keyword. Did you double-check that the index template was loaded properly by recreating the index?

You may also have to refresh the field list in Kibana afterwards! Did you do both of those?
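
For reference, what the template is supposed to produce is a text field with a keyword sub-field, roughly like this (a sketch of the idea, not the exact template from the repo):

"blocked_domain" : {
  "type" : "text",
  "fields" : {
    "keyword" : {
      "type" : "keyword",
      "ignore_above" : 256
    }
  }
}

The keyword sub-field is what Kibana can then aggregate on as blocked_domain.keyword.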

aviationfan commented 5 years ago

Remove all indices and the data they contain:

bitnami@elk7vm:/opt/bitnami/elasticsearch$ curl -X DELETE 'http://localhost:9200/_all'
{"acknowledged":true}

Make sure the template is present: GET /_template/logstash-syslog-dns

Output:

  "logstash-syslog-dns" : {
    "order" : 0,
    "index_patterns" : [
      "logstash-syslog-dns*"
    ],
    "settings" : { },
    "mappings" : {
      "dynamic" : "true",
      "properties" : {
        "date" : {
          "format" : "MMM  d HH:mm:ss||MMM dd HH:mm:ss",
          "type" : "date"
        },
        "request_from" : {
          "type" : "ip"
        },
        "ip_response" : {
          "type" : "ip"
        },
        "source_host" : {
          "type" : "ip"
        },
        "blocked_domain" : {
          "type" : "text"
        },
        "source_port" : {
          "type" : "integer"
        },
        "ip_request" : {
          "type" : "ip"
        },
        "dns_forward_to" : {
          "type" : "ip",
          "fields" : {
            "keyword" : {
              "ignore_above" : 256,
              "type" : "keyword"
            }
          }
        },
        "pid" : {
          "type" : "integer"
        },
        "logrow" : {
          "type" : "integer"
        },
        "pihole" : {
          "type" : "ip"
        },
        "tags" : {
          "type" : "keyword",
          "fields" : {
            "keyword" : {
              "ignore_above" : 256,
              "type" : "keyword"
            }
          }
        }
      }
    },
    "aliases" : { }
  }
}

Use Filebeat to push all the raw logs back into Elasticsearch: filebeat > logstash > elasticsearch

Created the index pattern for logstash-syslog-dns*. I see 67 fields after doing that; however, blocked_domain.keyword does not show up:

@timestamp               date
@version                 string
_id                      string
_index                   string
_score                   number
_source                  _source
_type                    string
beat.hostname            string
beat.hostname.keyword    string
beat.name                string
beat.name.keyword        string
beat.version             string
beat.version.keyword     string
blocked_domain           string
date                     date

Now I try re-importing all the saved objects from elk-hole on GitHub, starting with the visualizations.

It gets confusing as to which JSON is the right one... I chose Filebeat 7.1, imported the vis first, then the dash.

"DNS top piholed domains - pihole" does not work out of the box

Trying to select the field manually shows that there is no 'blocked_domain.keyword'.

Is the template not correct? I made sure the template was the very first thing applied before any data was imported or index patterns were created.

nin9s commented 5 years ago

Thanks, at least this is the way I would recommend. Let me dig into this please... first I need to revive my ESX somehow :(

nin9s commented 5 years ago

@auricom & @aviationfan could you please paste the output of:

GET /logstash-syslog-dns-2019.06?pretty

auricom commented 5 years ago

Hello, thank you for making time to help us. Like @aviationfan, I deleted the previous index after importing the template into Elasticsearch.

GET /logstash-syslog-dns-2019.06?pretty

  "logstash-syslog-dns-2019.06" : {
    "aliases" : { },
    "mappings" : {
      "dynamic" : "true",
      "dynamic_templates" : [
        {
          "message_field" : {
            "path_match" : "message",
            "match_mapping_type" : "string",
            "mapping" : {
              "norms" : false,
              "type" : "text"
            }
          }
        },
        {
          "string_fields" : {
            "match" : "*",
            "match_mapping_type" : "string",
            "mapping" : {
              "fields" : {
                "keyword" : {
                  "ignore_above" : 256,
                  "type" : "keyword"
                }
              },
              "norms" : false,
              "type" : "text"
            }
          }
        }
      ],
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "agent" : {
          "properties" : {
            "ephemeral_id" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "hostname" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "id" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "name" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "type" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "version" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "blocked_domain" : {
          "type" : "text"
        },
        "date" : {
          "type" : "date",
          "format" : "MMM  d HH:mm:ss||MMM dd HH:mm:ss"
        },
        "dns_forward_to" : {
          "type" : "ip",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "domain_request" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "domain_response" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "ecs" : {
          "properties" : {
            "version" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "geoip" : {
          "dynamic" : "true",
          "properties" : {
            "city_name" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "continent_code" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_code2" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_code3" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_name" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "dma_code" : {
              "type" : "long"
            },
            "ip" : {
              "type" : "ip"
            },
            "latitude" : {
              "type" : "half_float"
            },
            "location" : {
              "type" : "geo_point"
            },
            "longitude" : {
              "type" : "half_float"
            },
            "postal_code" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "region_code" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "region_name" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "timezone" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "host" : {
          "properties" : {
            "name" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "input" : {
          "properties" : {
            "type" : {
              "type" : "text",
              "norms" : false,
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "ip_request" : {
          "type" : "ip"
        },
        "ip_response" : {
          "type" : "ip"
        },
        "log" : {
          "properties" : {
            "file" : {
              "properties" : {
                "path" : {
                  "type" : "text",
                  "norms" : false,
                  "fields" : {
                    "keyword" : {
                      "type" : "keyword",
                      "ignore_above" : 256
                    }
                  }
                }
              }
            },
            "offset" : {
              "type" : "long"
            }
          }
        },
        "logrow" : {
          "type" : "integer"
        },
        "message" : {
          "type" : "text",
          "norms" : false
        },
        "pid" : {
          "type" : "integer"
        },
        "pihole" : {
          "type" : "ip"
        },
        "program" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "query_type" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "request_from" : {
          "type" : "ip"
        },
        "source_fqdn" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "source_host" : {
          "type" : "ip"
        },
        "source_port" : {
          "type" : "integer"
        },
        "tags" : {
          "type" : "keyword",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "type" : {
          "type" : "text",
          "norms" : false,
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        }
      }
    },
    "settings" : {
      "index" : {
        "lifecycle" : {
          "name" : "logstash-policy",
          "rollover_alias" : "logstash"
        },
        "refresh_interval" : "5s",
        "number_of_shards" : "1",
        "provided_name" : "logstash-syslog-dns-2019.06",
        "creation_date" : "1561414634004",
        "number_of_replicas" : "1",
        "uuid" : "f014sL1vQw2Rs4dM3UxxUw",
        "version" : {
          "created" : "7010199"
        }
      }
    }
  }
}

It seems that blocked_domain is mapped as a plain text field, with no keyword sub-field.

I let elk-hole run for the previous 24 hours and saw that ~20% of messages have the tag _grokparsefailure. Could it be that, because of this, Elasticsearch does not get the messages that would create the right fields? I will do my best to understand how the grok patterns work.
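
To look at the raw lines that failed, a query along these lines in Dev Tools should work (assuming the tags field is a keyword, as in the mapping above):

GET /logstash-syslog-dns-*/_search
{
  "size": 5,
  "_source": ["message"],
  "query": {
    "term": { "tags": "_grokparsefailure" }
  }
}

The size and _source settings just limit the output to a handful of raw messages.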

nin9s commented 5 years ago

happy to help out :)

Please delete the present index template (DELETE /_template/logstash-syslog*), then import this one: https://github.com/nin9s/elk-hole/blob/master/json/logstash-syslog-dns-index.template_ELK7.x_dev.json and recreate the index afterwards.
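
Roughly, with curl (localhost:9200 is a placeholder for your Elasticsearch host, and the file is the dev template downloaded from the link above):

# remove the old template
curl -X DELETE 'http://localhost:9200/_template/logstash-syslog-dns'

# load the dev template
curl -X PUT 'http://localhost:9200/_template/logstash-syslog-dns' \
     -H 'Content-Type: application/json' \
     -d @logstash-syslog-dns-index.template_ELK7.x_dev.json

# delete the current index so it gets recreated with the new mapping
curl -X DELETE 'http://localhost:9200/logstash-syslog-dns-*'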

Could you also have a look at the *logstash.log file so we can get a clue what's causing the _grokparsefailure - there should be none!

Are there more fields you had to manually change to/from .keyword besides blocked_domain and domain_request?

aviationfan commented 5 years ago

I suggest we open a new issue for grok parse failures because I get a LOT of them. What I did with logstash is have it write the failed events out to a file so I can look at them and figure out which patterns might be missing. Like this:

output {
  # events that grok could not parse go to a CSV for inspection
  if "_grokparsefailure" in [tags] {
    csv {
      fields => ["message"]
      path => "/Users/jjwelch/grokparsefailures.csv"
    }
  }
  # everything tagged "pihole" is sent to elasticsearch as usual
  else if "pihole" in [tags] {
    elasticsearch {
      hosts => ["mac-pro.house:9200"]
      user => "elastic"
      password => "Service.1"
      manage_template => false
      index => "logstash-syslog-dns-%{+YYYY.MM}"
    }
  }
}

I would be glad to submit the contents of that file so we can figure out the grok patterns that need to be added. I have added some myself in my conf file as well.

nin9s commented 5 years ago

Please show me what logstash is complaining about. I doubt there is something grok-related missing, as I have it working without _grokparsefailures using the GitHub logstash.conf.

auricom commented 5 years ago

Very nice! Using the newly provided template, the field "blocked_domain.keyword" now appears, and after replacing "blocked_domain" with "blocked_domain.keyword" inside the "top piholed domains" vis, it works.

The whole dashboard seems to be 100% usable. By the way, my whole Elastic stack has been updated to 7.2.0 and I do not see any regression.

Nice job !

nin9s commented 5 years ago

Do you still have any _grokparsefailures which you want to get parsed correctly, @auricom?

auricom commented 5 years ago

Unfortunately I still have those _grokparsefailures. I checked /var/log/logstash* but did not find anything relevant. I will modify the logstash conf files to export _grokparsefailures to CSV files, like @aviationfan did, and let elk-hole run for some hours to get relevant traces. But we could continue in the other issue, as this one can now be closed :)