cloudposse / terraform-aws-efs-backup

https://cloudposse.com/accelerate
Apache License 2.0


Terraform module designed to easily back up EFS filesystems to S3 using DataPipeline.

The workflow is simple:

  1. DataPipeline launches an EC2 instance on the schedule you define
  2. The instance mounts the EFS filesystem and syncs its contents to a versioned S3 bucket
  3. Activity logs are written to a separate S3 bucket
  4. Success or failure of the backup is published to an SNS topic
  5. Old backup versions are expired by an S3 lifecycle rule (`noncurrent_version_expiration_days`)

[!TIP]

πŸ‘½ Use Atmos with Terraform

Cloud Posse uses atmos to easily orchestrate multiple environments using Terraform.
Works with GitHub Actions, Atlantis, or Spacelift.

Watch demo of using Atmos with Terraform
Example of running atmos to manage infrastructure from our Quick Start tutorial.

Usage

Include this module in your existing Terraform code:

module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                               = "${var.name}"
  stage                              = "${var.stage}"
  namespace                          = "${var.namespace}"
  vpc_id                             = "${var.vpc_id}"
  efs_mount_target_id                = "${var.efs_mount_target_id}"
  use_ip_address                     = "false"
  noncurrent_version_expiration_days = "${var.noncurrent_version_expiration_days}"
  ssh_key_pair                       = "${var.ssh_key_pair}"
  datapipeline_config                = "${var.datapipeline_config}"
  modify_security_group              = "true"
}

output "efs_backup_security_group" {
  value = "${module.efs_backup.security_group_id}"
}

Integration with EFS

To enable connectivity between the DataPipeline instances and the EFS, use one of the following methods to configure Security Groups:

  1. Explicitly add the DataPipeline SG (the `security_group_id` output of this module) to the list of ingress rules of the EFS SG. For example:
module "elastic_beanstalk_environment" {
  source     = "git::https://github.com/cloudposse/terraform-aws-elastic-beanstalk-environment.git?ref=master"
  namespace  = "${var.namespace}"
  name       = "${var.name}"
  stage      = "${var.stage}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("eb-env")))}"]
  tags       = "${var.tags}"

  # ..............................
}

module "efs" {
  source     = "git::https://github.com/cloudposse/terraform-aws-efs.git?ref=tmaster"
  namespace  = "${var.namespace}"
  name       = "${var.name}"
  stage      = "${var.stage}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("efs")))}"]
  tags       = "${var.tags}"

  # Allow EB/EC2 instances and DataPipeline instances to connect to the EFS
  security_groups = ["${module.elastic_beanstalk_environment.security_group_id}", "${module.efs_backup.security_group_id}"]
}

module "efs_backup" {
  source     = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"
  name       = "${var.name}"
  stage      = "${var.stage}"
  namespace  = "${var.namespace}"
  delimiter  = "${var.delimiter}"
  attributes = ["${compact(concat(var.attributes, list("efs-backup")))}"]
  tags       = "${var.tags}"

  # Important to set it to `false` since we added the `DataPipeline` SG (output of the `efs_backup` module) to the `security_groups` of the `efs` module
  # See NOTE below for more information
  modify_security_group = "false"

  # ..............................
}
  2. Set the `modify_security_group` attribute to `true` so the module will modify the EFS SG itself to allow the DataPipeline instances to connect to the EFS, as shown in the sketch below
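
For the second method, a minimal sketch looks like the Usage example above with `modify_security_group` enabled (the variable names are the same placeholders used there):

```hcl
module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  name                = "${var.name}"
  stage               = "${var.stage}"
  namespace           = "${var.namespace}"
  vpc_id              = "${var.vpc_id}"
  efs_mount_target_id = "${var.efs_mount_target_id}"
  ssh_key_pair        = "${var.ssh_key_pair}"

  # Let the module add an ingress rule for its DataPipeline SG to the EFS security group
  modify_security_group = "true"
}
```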

NOTE: Do not mix these two methods together. Terraform does not support using a Security Group with in-line rules in conjunction with any Security Group Rule resources. https://www.terraform.io/docs/providers/aws/r/security_group_rule.html

NOTE on Security Groups and Security Group Rules: Terraform currently provides both a standalone Security Group Rule resource (a single ingress or egress rule), and a Security Group resource with ingress and egress rules defined in-line. At this time you cannot use a Security Group with in-line rules in conjunction with any Security Group Rule resources. Doing so will cause a conflict of rule settings and will overwrite rules.
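
To illustrate the conflict described above, here is a hypothetical sketch (not part of this module) where an in-line rule and a standalone rule resource both manage the same Security Group and will overwrite each other:

```hcl
# Anti-pattern: an SG with in-line rules combined with a standalone rule resource
resource "aws_security_group" "efs" {
  name   = "efs"
  vpc_id = "${var.vpc_id}"

  # In-line NFS ingress rule managed by the security group itself
  ingress {
    from_port   = 2049
    to_port     = 2049
    protocol    = "tcp"
    cidr_blocks = ["10.0.0.0/16"]
  }
}

# Standalone rule targeting the same group -- Terraform will fight over the rule set
resource "aws_security_group_rule" "efs_from_datapipeline" {
  type                     = "ingress"
  from_port                = 2049
  to_port                  = 2049
  protocol                 = "tcp"
  security_group_id        = "${aws_security_group.efs.id}"
  source_security_group_id = "${module.efs_backup.security_group_id}"
}
```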

[!IMPORTANT] In Cloud Posse's examples, we avoid pinning modules to specific versions to prevent discrepancies between the documentation and the latest released versions. However, for your own projects, we strongly advise pinning each module to the exact version you're using. This practice ensures the stability of your infrastructure. Additionally, we recommend implementing a systematic approach for updating versions to avoid unexpected changes.
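
For example, instead of `ref=master` you would pin the module source to an exact release tag (the tag below is a placeholder for whichever version you have tested):

```hcl
module "efs_backup" {
  # Pin to a specific release tag rather than a moving branch
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=tags/x.y.z"

  # ..............................
}
```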

Makefile Targets

Available targets:

  help                                Help screen
  help/all                            Display help for all targets
  help/short                          This help short screen
  lint                                Lint terraform code

Requirements

No requirements.

Providers

| Name | Version |
|------|---------|
| aws  | n/a     |

Modules

| Name | Source | Version |
|------|--------|---------|
| backups_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| datapipeline_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| logs_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| resource_role_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| role_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |
| sns_label | git::https://github.com/cloudposse/terraform-null-label.git | tags/0.3.1 |

Resources

| Name | Type |
|------|------|
| aws_cloudformation_stack.datapipeline | resource |
| aws_cloudformation_stack.sns | resource |
| aws_iam_instance_profile.resource_role | resource |
| aws_iam_role.resource_role | resource |
| aws_iam_role.role | resource |
| aws_iam_role_policy_attachment.resource_role | resource |
| aws_iam_role_policy_attachment.role | resource |
| aws_s3_bucket.backups | resource |
| aws_s3_bucket.logs | resource |
| aws_security_group.datapipeline | resource |
| aws_security_group_rule.datapipeline_efs_ingress | resource |
| aws_ami.amazon_linux | data source |
| aws_efs_mount_target.default | data source |
| aws_iam_policy_document.resource_role | data source |
| aws_iam_policy_document.role | data source |
| aws_region.default | data source |
| aws_subnet_ids.default | data source |
| aws_vpc.default | data source |

Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| attributes | Additional attributes (e.g. `efs-backup`) | `list(string)` | `[]` | no |
| datapipeline_config | DataPipeline configuration options | `map(string)` | `{"email": "", "instance_type": "t2.micro", "period": "24 hours", "timeout": "60 Minutes"}` | no |
| datapipeline_security_group | Optionally specify a security group to use for the datapipeline instances | `string` | `""` | no |
| delimiter | Delimiter to be used between `name`, `namespace`, `stage`, etc. | `string` | `"-"` | no |
| efs_mount_target_id | EFS Mount Target ID (e.g. `fsmt-279bfc62`) | `string` | n/a | yes |
| modify_security_group | Should the module modify the EFS security group | `string` | `"false"` | no |
| name | The Name of the application or solution (e.g. `bastion` or `portal`) | `any` | n/a | yes |
| namespace | Namespace (e.g. `cp` or `cloudposse`) | `any` | n/a | yes |
| noncurrent_version_expiration_days | S3 object versions expiration period (days) | `string` | `"35"` | no |
| region | (Optional) AWS Region. If not specified, will be derived from the `aws_region` data source | `string` | `""` | no |
| ssh_key_pair | SSH key that will be deployed on DataPipeline's instance | `string` | n/a | yes |
| stage | Stage (e.g. `prod`, `dev`, `staging`) | `any` | n/a | yes |
| subnet_id | Optionally specify the subnet to use | `string` | `""` | no |
| tags | Additional tags (e.g. `map("BusinessUnit", "XYZ")`) | `map(string)` | `{}` | no |
| use_ip_address | If set to `true`, will use IP address instead of DNS name to connect to the EFS | `string` | `"false"` | no |
| vpc_id | VPC ID | `string` | `""` | no |
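
For example, to change the backup schedule and receive notifications, you might override `datapipeline_config` with a map like the following (the email address is a placeholder):

```hcl
module "efs_backup" {
  source = "git::https://github.com/cloudposse/terraform-aws-efs-backup.git?ref=master"

  # ..............................

  datapipeline_config = {
    instance_type = "t2.micro"
    email         = "ops@example.com"
    period        = "24 hours"
    timeout       = "60 Minutes"
  }
}
```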

Outputs

| Name | Description |
|------|-------------|
| backups_bucket_name | Backups bucket name |
| datapipeline_ids | DataPipeline IDs |
| logs_bucket_name | Logs bucket name |
| security_group_id | Security group ID |
| sns_topic_arn | Backup notification SNS topic ARN |

Related Projects

Check out these related projects.

References

For additional context, refer to some of these links.

[!TIP]

Use Terraform Reference Architectures for AWS

Use Cloud Posse's ready-to-go terraform architecture blueprints for AWS to get up and running quickly.

βœ… We build it together with your team.
βœ… Your team owns everything.
βœ… 100% Open Source and backed by fanatical support.

Request Quote

πŸ“š Learn More
Cloud Posse is the leading [**DevOps Accelerator**](https://cpco.io/commercial-support?utm_source=github&utm_medium=readme&utm_campaign=cloudposse/terraform-aws-efs-backup&utm_content=commercial_support) for funded startups and enterprises. *Your team can operate like a pro today.* Ensure that your team succeeds by using Cloud Posse's proven process and turnkey blueprints. Plus, we stick around until you succeed.

#### Day-0: Your Foundation for Success

- **Reference Architecture.** You'll get everything you need from the ground up built using 100% infrastructure as code.
- **Deployment Strategy.** Adopt a proven deployment strategy with GitHub Actions, enabling automated, repeatable, and reliable software releases.
- **Site Reliability Engineering.** Gain total visibility into your applications and services with Datadog, ensuring high availability and performance.
- **Security Baseline.** Establish a secure environment from the start, with built-in governance, accountability, and comprehensive audit logs, safeguarding your operations.
- **GitOps.** Empower your team to manage infrastructure changes confidently and efficiently through Pull Requests, leveraging the full power of GitHub Actions.

Request Quote

#### Day-2: Your Operational Mastery

- **Training.** Equip your team with the knowledge and skills to confidently manage the infrastructure, ensuring long-term success and self-sufficiency.
- **Support.** Benefit from seamless communication over Slack with our experts, ensuring you have the support you need, whenever you need it.
- **Troubleshooting.** Access expert assistance to quickly resolve any operational challenges, minimizing downtime and maintaining business continuity.
- **Code Reviews.** Enhance your team's code quality with our expert feedback, fostering continuous improvement and collaboration.
- **Bug Fixes.** Rely on our team to troubleshoot and resolve any issues, ensuring your systems run smoothly.
- **Migration Assistance.** Accelerate your migration process with our dedicated support, minimizing disruption and speeding up time-to-value.
- **Customer Workshops.** Engage with our team in weekly workshops, gaining insights and strategies to continuously improve and innovate.

Request Quote

✨ Contributing

This project is under active development, and we encourage contributions from our community.

Many thanks to our outstanding contributors:

For πŸ› bug reports & feature requests, please use the issue tracker.

In general, PRs are welcome. We follow the typical "fork-and-pull" Git workflow.

  1. Review our Code of Conduct and Contributor Guidelines.
  2. Fork the repo on GitHub
  3. Clone the project to your own machine
  4. Commit changes to your own branch
  5. Push your work back up to your fork
  6. Submit a Pull Request so that we can review your changes

NOTE: Be sure to merge the latest changes from "upstream" before making a pull request!

🌎 Slack Community

Join our Open Source Community on Slack. It's FREE for everyone! Our "SweetOps" community is where you get to talk with others who share a similar vision for how to roll out and manage infrastructure. This is the best place to talk shop, ask questions, solicit feedback, and work together as a community to build totally sweet infrastructure.

πŸ“° Newsletter

Sign up for our newsletter and join 3,000+ DevOps engineers, CTOs, and founders who get insider access to the latest DevOps trends, so you can always stay in the know. Dropped straight into your Inbox every week β€” and usually a 5-minute read.

πŸ“† Office Hours

Join us every Wednesday via Zoom for your weekly dose of insider DevOps trends, AWS news and Terraform insights, all sourced from our SweetOps community, plus a live Q&A that you can’t find anywhere else. It's FREE for everyone!

License


Preamble to the Apache License, Version 2.0

Complete license is available in the [`LICENSE`](LICENSE) file.

```text
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements.  See the NOTICE file
distributed with this work for additional information
regarding copyright ownership.  The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License.  You may obtain a copy of the License at

  https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied.  See the License for the
specific language governing permissions and limitations
under the License.
```

Trademarks

All other trademarks referenced herein are the property of their respective owners.


Copyright Β© 2017-2024 Cloud Posse, LLC
