This script imports various AWS log types into an Elasticsearch cluster running locally on your computer in Docker containers.
The script configures everything that is needed in the ELK stack, in both Elasticsearch and Kibana.
Install Docker for Windows or Docker for Mac
Clone this git repository:
git clone https://github.com/mike-mosher/aws-la.git && cd aws-la
Install requirements:
pip install -r ./requirements.txt
Bring the docker environment up:
docker-compose up -d
Verify that the containers are running:
docker ps
Verify that Elasticsearch is running:
curl -XGET localhost:9200/_cluster/health?pretty
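If you want to script this check rather than eyeball the curl output, the same health endpoint can be polled from Python until the cluster is usable. This is a sketch, not part of the script itself; it assumes the default endpoint shown above, and treats "yellow" as acceptable because single-node clusters commonly report yellow:

```python
# Sketch: wait until the local Elasticsearch cluster reports a usable
# status before importing logs (endpoint matches the curl check above).
import json
import time
import urllib.request


def parse_status(health_json: str) -> str:
    """Extract the cluster status ("green", "yellow", or "red")."""
    return json.loads(health_json)["status"]


def wait_for_cluster(url="http://localhost:9200/_cluster/health", timeout=60):
    """Poll the health endpoint until the cluster is yellow or green."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url) as resp:
                status = parse_status(resp.read().decode())
            if status in ("yellow", "green"):  # yellow is normal for one node
                return status
        except OSError:
            pass  # container may still be starting
        time.sleep(2)
    raise TimeoutError("Elasticsearch did not become ready")
```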
To run the script, specify the log type and the directory containing the logs. For example, to import ELB access logs:
python importLogs.py --logtype elb --logdir ~/logs/elblogs/
Valid log types are listed by running the script with the --help argument. Currently, the following log types are supported:
elb # ELB access logs
alb # ALB access logs
vpc # VPC flow logs
r53 # Route53 query logs
apache # apache access log ('access_log')
apache_archives # apache access logs (gunzip compressed with logrotate)
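The command-line surface implied by the examples can be sketched with argparse. This is an illustration, not the script's actual implementation; the flag names come from the commands shown here, and the choices list mirrors the log types documented above:

```python
# Sketch of a CLI matching the documented flags (not the script's real code).
import argparse

VALID_LOGTYPES = ["elb", "alb", "vpc", "r53", "apache", "apache_archives"]


def make_parser():
    parser = argparse.ArgumentParser(
        description="Import AWS logs into a local Elasticsearch cluster")
    parser.add_argument("--logtype", required=True, choices=VALID_LOGTYPES,
                        help="type of log to import")
    parser.add_argument("--logdir", required=True,
                        help="directory containing the log files")
    return parser


args = make_parser().parse_args(["--logtype", "elb", "--logdir", "~/logs/elblogs/"])
print(args.logtype, args.logdir)
```

Because --logtype uses choices, an unsupported log type fails fast with a usage message instead of a confusing error later in the import.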
Browse to the link provided in the output (cmd + double-click in most terminals), or go directly to the default Kibana page:
http://localhost:5601
You can import multiple log types in the same ELK cluster. Just run the command again with the new log type and log directory:
python importLogs.py --logtype vpc --logdir ~/logs/vpc-flowlogs/
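Repeated imports like this can be batched from a small wrapper. The sketch below only builds and prints each command; the log directories are example placeholders, and the subprocess call is left commented out so nothing runs by accident:

```python
# Sketch: batch-import several log types by invoking the script once per
# (logtype, logdir) pair. Paths below are example placeholders.
import shlex
import subprocess

IMPORTS = [
    ("elb", "~/logs/elblogs/"),
    ("vpc", "~/logs/vpc-flowlogs/"),
]


def build_command(logtype, logdir):
    """Build the importLogs.py invocation for one log type."""
    return ["python", "importLogs.py", "--logtype", logtype, "--logdir", logdir]


for logtype, logdir in IMPORTS:
    cmd = build_command(logtype, logdir)
    print(shlex.join(cmd))  # preview the command line
    # subprocess.run(cmd, check=True)  # uncomment to actually run each import
```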
When done, you can shut down the containers:
docker-compose down -v
Python output:
Searching for traffic initiated by RFC1918 (private) IP addresses:
source_ip_address:"10.0.0.0/8" OR source_ip_address:"172.16.0.0/12" OR source_ip_address:"192.168.0.0/16"
Searching for traffic initiated by public (non-RFC1918) IP addresses:
NOT (source_ip_address:"10.0.0.0/8" OR source_ip_address:"172.16.0.0/12" OR source_ip_address:"192.168.0.0/16")
Searching for traffic to or from a specific ENI on a specific port:
interface-id:<eni-name> AND (source_port:<port> OR dest_port:<port>)
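The searches above are Lucene query strings, so the same filters can be sent straight to Elasticsearch as a query_string query, bypassing Kibana. This is a sketch; the index name "vpc" in the comment is an assumption, not something taken from the script:

```python
# Sketch: build the RFC1918 source-address filter shown above as an
# Elasticsearch query_string request body.
import json

RFC1918 = ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]


def private_source_query(negate=False):
    """Return an ES request body matching (or excluding) private sources."""
    clause = " OR ".join(f'source_ip_address:"{cidr}"' for cidr in RFC1918)
    if negate:
        clause = f"NOT ({clause})"
    return {"query": {"query_string": {"query": clause}}}


print(json.dumps(private_source_query(), indent=2))
# POST this body to http://localhost:9200/vpc/_search (index name assumed)
# to run the same search outside Kibana.
```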
Dashboard imported for VPC Flow Logs:
Dashboard imported for ALB Access Logs: