michaelkedey / aws_beanstalk

This repo contains code to deploy an ASP.NET application on AWS Elastic Beanstalk. The infrastructure is defined in Terraform (HCL).

Manulife Insurance AWS Infrastructure with Elastic Beanstalk, Lambda, Route 53 and S3.

Overview

This repository contains Terraform configurations to provision AWS infrastructure consisting of an Elastic Beanstalk environment, a Lambda function for automatic deployments, Route 53 for DNS management, and an S3 bucket for object storage.
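
The root module under `src/infrastracture` wires these pieces together through the modules listed in the Directory Structure section below. The following is a minimal sketch of what that wiring could look like; the module arguments and output names are illustrative assumptions, not the repository's exact interface.

    # Illustrative root-module wiring only; argument and output names are assumptions.
    module "vpc" {
      source = "./modules/vpc"
    }

    module "beanstalk" {
      source     = "./modules/beanstalk/prod"
      vpc_id     = module.vpc.vpc_id      # hypothetical VPC module output
      subnet_ids = module.vpc.subnet_ids  # hypothetical VPC module output
    }

    module "route53" {
      source            = "./modules/route53"
      environment_cname = module.beanstalk.environment_cname  # hypothetical output
    }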

To run this infrastructure locally, configure the prerequisites below:

Prerequisites

Getting Started

  1. Create a new working directory and change into it:

    mkdir <working_dir> && cd <working_dir>
  2. Clone this repository:

    git clone https://github.com/michaelkedey/aws_beanstalk.git
  3. Change into the infrastructure directory of the cloned repo:

    cd aws_beanstalk/src/infrastracture
  4. Run the format script to format all Terraform files:

    ./format_validate_all.sh
  5. Review and customize the variables in the `.terraform.tfvars` file inside each `src/infrastracture/env/` directory (`dev`, `staging`, `prod`) with your specific configuration details.

  6. Review and customize the backend settings in the `backend.tfvars` file inside each `src/infrastracture/env/` directory with your specific configuration details.

  7. Initialize Terraform, replacing `<env>` in this and the following commands with the target environment directory (`dev`, `staging` or `prod`):

    terraform init -backend-config="./env/<env>/backend.tfvars"
  8. Plan the Terraform changes:

    terraform plan -var-file="./env/<env>/.terraform.tfvars"
    
  9. Apply the Terraform configuration:

    terraform apply -var-file="./env/<env>/.terraform.tfvars" -auto-approve

    Because -auto-approve is set, Terraform applies without prompting; remove the flag if you prefer to review and confirm the changes interactively.

    Note: if the first apply fails, it may be necessary to run the apply command again.

  10. Destroy the resources after testing:

    terraform destroy -var-file="./env/<env>/.terraform.tfvars" -auto-approve
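
Steps 6 and 7 rely on Terraform's partial backend configuration: the backend block in the root module stays mostly empty, and the per-environment settings are supplied at init time from `env/<env>/backend.tfvars`. The sketch below assumes an S3 backend (this README does not state the backend type) and uses illustrative values only.

    # Root module: partial backend block, completed via -backend-config at init time.
    # The "s3" backend type is an assumption of this sketch.
    terraform {
      backend "s3" {}
    }

    # env/dev/backend.tfvars (illustrative values only)
    bucket = "example-terraform-state-bucket"
    key    = "aws_beanstalk/dev/terraform.tfstate"
    region = "us-east-1"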

Infrastructure Components

Infra-Design Architecture

VPC

S3 Bucket

Elastic Beanstalk Environment

Lambda Function

Elastic Load Balancer

An Elastic Load Balancer, together with a security group, target groups and listeners for the various services and ports, is provided. The load balancer distributes incoming traffic across multiple instances of your web server, such as the EC2 instances defined in the Beanstalk environment. Depending on your use case, you can associate this load balancer with the Beanstalk environment, or let Elastic Beanstalk create its own load balancer, as is currently configured.
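
The choice between the two options is typically made through option settings on the Beanstalk environment. A minimal sketch follows; the resource names, solution stack variable and the commented shared-load-balancer settings are assumptions for illustration, not the repository's actual configuration.

    # Fragment of an aws_elastic_beanstalk_environment resource; names are hypothetical.
    resource "aws_elastic_beanstalk_environment" "this" {
      name                = "example-env"                               # hypothetical
      application         = aws_elastic_beanstalk_application.this.name # hypothetical
      solution_stack_name = var.solution_stack_name                     # hypothetical

      # As currently configured: Beanstalk provisions and manages its own ALB.
      setting {
        namespace = "aws:elasticbeanstalk:environment"
        name      = "EnvironmentType"
        value     = "LoadBalanced"
      }
      setting {
        namespace = "aws:elasticbeanstalk:environment"
        name      = "LoadBalancerType"
        value     = "application"
      }

      # Alternative: associate the externally created load balancer instead.
      # setting {
      #   namespace = "aws:elasticbeanstalk:environment"
      #   name      = "LoadBalancerIsShared"
      #   value     = "true"
      # }
      # setting {
      #   namespace = "aws:elbv2:loadbalancer"
      #   name      = "SharedLoadBalancer"
      #   value     = aws_lb.this.arn  # hypothetical aws_lb resource
      # }
    }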

Route 53
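
This README does not spell out the record the route53 module creates; a common pattern, sketched here with hypothetical names, is a record in the hosted zone that points the application's domain at the Beanstalk environment's CNAME.

    # Illustrative sketch; zone, record name and environment reference are assumptions.
    resource "aws_route53_record" "app" {
      zone_id = var.hosted_zone_id  # hypothetical variable
      name    = "app.example.com"   # hypothetical domain
      type    = "CNAME"
      ttl     = 300
      records = [aws_elastic_beanstalk_environment.this.cname]
    }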

This Infrastructure Includes

Usage

CI/CD - Infrastructure Pipeline

The CI/CD pipeline uses GitHub Actions workflows defined in YAML files in the .github/workflows directory. There are three YAML files for the infrastructure CI/CD, one per deployment stage: development (dev_actions.yaml), staging (staging_actions.yaml) and production (prod_actions.yaml). The dev stage is triggered by a commit to the main branch that touches paths under 'src/infrastructure/**'. The staging deployment is triggered by a successful completion of the dev_actions.yaml workflow. The production stage is triggered manually and requires manual approval, which ensures production only runs after development and staging meet expectations.

CI/CD - Application Pipeline

The application CI/CD pipeline also uses GitHub Actions workflows defined in YAML files in the .github/workflows directory. There are three YAML files for the application CI/CD, one per deployment stage: development (dev_s3.yaml), staging (staging_s3.yaml) and production (prod_s3.yaml). The dev stage is triggered by a commit to the s3_auto branch that touches paths under 'src/s3_auto_uploads/**'. The staging deployment is triggered by a successful completion of the dev_s3.yaml workflow. The production stage is triggered manually and requires manual approval, which ensures production only runs after development and staging meet expectations.

The YAML files contain commands that sync the s3_auto_uploads directory with the respective buckets for each deployment stage.

The sync uploads objects into the S3 buckets, which triggers the Lambda function whenever the uploaded objects satisfy the configured prefix and suffix criteria. The triggered Lambda copies the application update, then creates and deploys a new application version on Elastic Beanstalk.
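
In Terraform, the wiring that makes an S3 upload invoke the Lambda only for matching keys is typically an aws_s3_bucket_notification with prefix/suffix filters plus an aws_lambda_permission. The sketch below uses hypothetical resource names and filter values; the actual ones live in this repository's modules.

    # Illustrative sketch; resource names, prefix and suffix are assumptions.
    resource "aws_lambda_permission" "allow_s3" {
      statement_id  = "AllowExecutionFromS3"
      action        = "lambda:InvokeFunction"
      function_name = aws_lambda_function.deployer.function_name  # hypothetical Lambda
      principal     = "s3.amazonaws.com"
      source_arn    = aws_s3_bucket.uploads.arn                   # hypothetical bucket
    }

    resource "aws_s3_bucket_notification" "app_uploads" {
      bucket = aws_s3_bucket.uploads.id

      lambda_function {
        lambda_function_arn = aws_lambda_function.deployer.arn
        events              = ["s3:ObjectCreated:*"]
        filter_prefix       = "s3_auto_uploads/"  # assumed prefix criterion
        filter_suffix       = ".zip"              # assumed suffix criterion
      }

      depends_on = [aws_lambda_permission.allow_s3]
    }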

Cleanup

Three destroy YAML files are contained in the .github/workflows directory: dev_destroy.yaml, staging_destroy.yaml and prod_destroy.yaml. Each requires manual approval to trigger the destroy workflow for its deployment stage.

Directory Structure

$ tree
.
|-- Manulife_Auto_IaC_Design_Architecture.png
|-- README.md
`-- src
    |-- app_versions
    |   `-- LambdaWebApp2.zip
    `-- infrastracture       
        |-- env
        |   |-- dev
        |   |   `-- backend.tfvars
        |   |-- prod
        |   |   `-- backend.tfvars
        |   `-- staging
        |       `-- backend.tfvars
        |-- format_validate_all.sh
        |-- locals.tf
        |-- main.tf
        |-- modules
        |   |-- app
        |   |   |-- app.tf
        |   |   |-- outputs.tf
        |   |   `-- variables.tf
        |   |-- app_version
        |   |   |-- app_version.tf
        |   |   |-- outputs.tf
        |   |   |-- store.tf
        |   |   `-- variables.tf
        |   |-- beanstalk
        |   |   `-- prod
        |   |       |-- beanstalk-ec2-policy.json
        |   |       |-- beanstalk-service-policy.json
        |   |       |-- beanstalk.tf
        |   |       |-- outputs.tf
        |   |       `-- variables.tf
        |   |-- dir_upload
        |   |   |-- uploads.tf
        |   |   `-- variables.tf
        |   |-- file_upload
        |   |   |-- outputs.tf
        |   |   |-- uploads.tf
        |   |   `-- variables.tf
        |   |-- route53
        |   |   |-- output.tf
        |   |   |-- route.tf
        |   |   `-- variables.tf
        |   `-- vpc
        |       |-- outputs.tf
        |       |-- providers.tf
        |       |-- store.tf
        |       |-- variables.tf
        |       `-- vpc.tf
        |-- outputs.tf
        |-- providers.tf
        |-- s3_uploads
        |   |-- dotnet-linux.zip
        |   `-- dotnet-linux2.zip
        `-- variables.tf

17 directories, 39 files

Notes

Contributing

Feel free to contribute to this project by opening issues or creating pull requests.

Contributors