ansible-collections / cloud.terraform

The collection automates the management and provisioning of infrastructure as code (IaC) using the Terraform CLI tool within Ansible playbooks and Execution Environment runtimes.
GNU General Public License v3.0

feat request: Ansible Playbook constructs should equal Terraform Template constructs #26

Open sean-freeman opened 1 year ago

sean-freeman commented 1 year ago
SUMMARY

At present, the Ansible Module terraform should be renamed terraform_template_apply, which more appropriately describes its current capabilities. The Ansible Module within this Ansible Collection is a static execution of a pre-defined Terraform Template; it does not allow for any dynamic control by Ansible.


ISSUE TYPE
ADDITIONAL INFORMATION

By having each Ansible Task act as a construct for a Terraform Module, the Ansible Playbook achieves parity with a Terraform Template. This allows Ansible to programmatically/dynamically alter the desired infrastructure state being provisioned by Terraform.

By way of example, an Ansible Task might request a count of a Terraform Module to provision, OR might loop to re-use the same Terraform Module, executing it each time with different inputs.

It is not currently possible to treat each Ansible Task (within the Ansible Playbook) as if it were a Terraform Module Block call (within the Terraform Template).
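A hedged sketch of the loop case using the module that exists today: one Terraform configuration re-used with different inputs per iteration, with state isolated per item via Terraform workspaces. The directory name, variable name, and loop items here are hypothetical.

```yaml
# Sketch: re-use one Terraform configuration with different inputs per
# iteration. Each loop item gets its own state via a Terraform workspace.
- name: Provision the same Terraform configuration with different inputs
  cloud.terraform.terraform:
    project_path: "{{ playbook_dir }}/tf_template"
    workspace: "{{ item.name }}"  # isolate state per iteration
    state: present
    force_init: true
    variables:
      terraform_module_variable1: "{{ item.value }}"
  loop:
    - { name: "env-a", value: "input-a" }
    - { name: "env-b", value: "input-b" }
```

This approximates the "loop" half of the request, but not true Template parity: Terraform still sees each iteration as an unrelated root configuration.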

See the pseudocode example below.

Terraform Template high-level design:

# Terraform Providers declaration
terraform {
  required_version = ">= 1.0"
  required_providers {
    aws = {
      source  = "hashicorp/aws" # Obtain the Terraform Provider from the Terraform Registry
      version = "~> 4.0"
    }
  }
}

# Terraform Providers configuration
provider "aws" {
  region = "us-east-1"
  access_key = var.tf_template_input_var_aws_credential_key_access
  secret_key = var.tf_template_input_var_aws_credential_key_secret
}

module "run_aws_vpc_create" {
  source  = "terraform-aws-modules/vpc/aws" # Obtain the Terraform Module from the Terraform Registry
#  source = "github.com/terraform-aws-modules/terraform-aws-vpc?ref=main" # From GitHub repository
  terraform_module_variable1 = "static"
}

module "run_aws_ec2_instance_create" {
  source  = "terraform-aws-modules/ec2-instance/aws" # Obtain the Terraform Module from the Terraform Registry
  terraform_module_variable2 = module.run_aws_vpc_create.expected_output_from_module1
}

variable "tf_template_input_var_aws_credential_key_access" {
  description = "Text"
}

variable "tf_template_input_var_aws_credential_key_secret" {
  description = "Text"
}

Ansible Playbook for executing Terraform Modules high-level proposed design:

---
- name: "Ansible Playbook to provision infrastructure using Terraform"
  hosts: localhost
  gather_facts: false

  vars_prompt:
    - name: terraform_template_input1
      prompt: Please enter value
      private: no
    - name: aws_credential_key_access
      prompt: Please enter value
      private: no
    - name: aws_credential_key_secret
      prompt: Please enter value
      private: no

  tasks:

  - name: Terraform Template generate
    cloud.terraform.terraform_template_generate:
      project_path: "{{ playbook_dir }}/tf_template"
      terraform_version: ">= 1.0"
      provider_upgrade: true # terraform init will use `-upgrade` flag
      providers:
        - name: hashicorp/aws
          version: "~> 4.0"
          arguments:
            - region: "us-east-1"
            - access_key: "var.tf_template_input_var_aws_credential_key_access"
            - secret_key: "var.tf_template_input_var_aws_credential_key_secret"
      modules:
        - title: "run_aws_vpc_create"
          source: "terraform-aws-modules/vpc/aws" # Obtain the Terraform Module from the Terraform Registry
          arguments:
            - terraform_module_variable1: "{{ terraform_template_input1 }}"
        - title: "run_aws_ec2_instance_create"
          source: "terraform-aws-modules/ec2-instance/aws" # Obtain the Terraform Module from the Terraform Registry
          arguments:
            - terraform_module_variable2: "module.run_aws_vpc_create.expected_output_from_module1"
      input_variables:
        - name: tf_template_input_var_aws_credential_key_access
          desc: "Text"
        - name: tf_template_input_var_aws_credential_key_secret
          desc: "Text"
      state: present

  - name: Terraform init and Terraform apply - Module 1
    cloud.terraform.terraform_template_apply:
      project_path: "{{ playbook_dir }}/tf_template"
      state: present
      input_variables:
        tf_template_input_var_aws_credential_key_access: "{{ aws_credential_key_access }}"
        tf_template_input_var_aws_credential_key_secret: "{{ aws_credential_key_secret }}"
sean-freeman commented 1 year ago

Workaround: download Terraform Modules from GitHub, add a Terraform Provider declaration file, and execute

This requires downloading the Terraform Module from the GitHub source code instead of the versions released via the Terraform Registry; the code is identical, but this does not follow Terraform best practices.

Thereafter each Terraform Module is executed in its own Ansible Task, and the JSON output from the Ansible Task for Terraform Module 1 is passed by Ansible to the Ansible Task for Terraform Module 2. This results in individual Terraform state files, so Terraform is unaware of the relationship between the Terraform Modules and unable to resolve dependencies between them in the future.
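A sketch of that hand-off, assuming the first configuration declares a Terraform output named vpc_id (the output name, paths, and second variable name are hypothetical):

```yaml
- name: Terraform Module 1 - Terraform init and Terraform apply
  cloud.terraform.terraform:
    project_path: "{{ playbook_dir }}/tmp/terraform-aws-vpc"
    state: present
    force_init: true
  register: module1_result

# The module returns Terraform outputs under the `outputs` key of its
# result; Ansible, not Terraform, carries the value to the next Task.
- name: Terraform Module 2 - consume Module 1 output via Ansible
  cloud.terraform.terraform:
    project_path: "{{ playbook_dir }}/tmp/terraform-aws-ec2-instance"
    state: present
    force_init: true
    variables:
      vpc_id: "{{ module1_result.outputs.vpc_id.value }}"
```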

NOTE: Use a static region definition. The workaround does not work with variable insertion into AWS Availability Zones, because the Ansible Module for Terraform performs incorrect parsing, see Issue #27. For example, azs: '["{{ ansible_var_aws_region }}a", "{{ ansible_var_aws_region }}b", "{{ ansible_var_aws_region }}c"]' results in -var 'azs=['\"'\"'us-west-2a'\"'\"', '\"'\"'us-west-2b'\"'\"', '\"'\"'us-west-2c'\"'\"']' instead of -var 'azs=[\"us-west-2a\", \"us-west-2b\", \"us-west-2c\"]'

Workaround code:

---

- name: "Ansible Playbook"
  hosts: localhost
  gather_facts: false

  vars_prompt:
    - name: ansible_var_aws_access_key
      prompt: Please enter AWS Access Key
      private: no
    - name: ansible_var_aws_secret_access_key
      prompt: Please enter AWS Secret Access Key
      private: yes
    - name: ansible_var_aws_region
      prompt: Please enter AWS Region
      private: no
    - name: ansible_var_aws_resources_prefix
      prompt: Please enter prefix for AWS resources
      private: no

  tasks:

  - name: Terraform Module for AWS VPC - Git clone
    ansible.builtin.git:
      repo: 'https://github.com/terraform-aws-modules/terraform-aws-vpc.git'
      dest: "{{ playbook_dir }}/tmp/terraform-aws-vpc"
      version: master

  # Providers within Terraform Modules - https://developer.hashicorp.com/terraform/language/modules/develop/providers
  - name: Terraform Module for AWS VPC - Create Terraform providers information inside of each Terraform Module
    ansible.builtin.copy:
      dest: "{{ playbook_dir }}/tmp/terraform-aws-vpc/tf_provider.tf"
      content: |
        # Terraform Provider declaration
        provider "aws" {
        }

  - name: Terraform Module for AWS VPC - Terraform init and Terraform apply
    register: terraform_result
    environment:
      AWS_ACCESS_KEY_ID: "{{ ansible_var_aws_access_key }}"
      AWS_SECRET_ACCESS_KEY: "{{ ansible_var_aws_secret_access_key }}"
      AWS_REGION: "{{ ansible_var_aws_region }}"
    cloud.terraform.terraform:
      project_path: "{{ playbook_dir }}/tmp/terraform-aws-vpc"
      state: present
      force_init: true
      variables:
        name: "{{ ansible_var_aws_resources_prefix }}"
        cidr: "10.0.0.0/16"
        azs: '["us-west-2a", "us-west-2b", "us-west-2c"]'
        private_subnets: '["10.0.1.0/24", "10.0.2.0/24", "10.0.3.0/24"]'
        public_subnets: '["10.0.11.0/24", "10.0.12.0/24", "10.0.13.0/24"]'
        database_subnets: '["10.0.21.0/24", "10.0.22.0/24", "10.0.23.0/24"]'
        elasticache_subnets: '["10.0.31.0/24", "10.0.32.0/24", "10.0.33.0/24"]'
        redshift_subnets: '["10.0.41.0/24", "10.0.42.0/24", "10.0.43.0/24"]'
        intra_subnets: '["10.0.51.0/24", "10.0.52.0/24", "10.0.53.0/24"]'
        private_subnet_names: '["Private Subnet One", "Private Subnet Two"]'
        database_subnet_names: '["DB Subnet One"]'
        elasticache_subnet_names: '["Elasticache Subnet One", "Elasticache Subnet Two"]'
        redshift_subnet_names: '["Redshift Subnet One", "Redshift Subnet Two", "Redshift Subnet Three"]'
        intra_subnet_names: []
        create_database_subnet_group: 'false'
        manage_default_network_acl: 'true'
        default_network_acl_tags: '{ Name = "{{ ansible_var_aws_resources_prefix }}-default" }'
        manage_default_route_table: 'true'
        default_route_table_tags: '{ Name = "{{ ansible_var_aws_resources_prefix }}-default" }'
        manage_default_security_group: 'true'
        default_security_group_tags: '{ Name = "{{ ansible_var_aws_resources_prefix }}-default" }'
        enable_dns_hostnames: 'true'
        enable_dns_support: 'true'
        enable_nat_gateway: 'true'
        single_nat_gateway: 'true'
        customer_gateways: |
          {
            IP1 = {
              bgp_asn     = 65112
              ip_address  = "1.2.3.4"
              device_name = "some_name"
            },
            IP2 = {
              bgp_asn    = 65112
              ip_address = "5.6.7.8"
            }
          }
        enable_vpn_gateway: 'true'
        enable_dhcp_options: 'true'
        dhcp_options_domain_name: "service.consul"
        dhcp_options_domain_name_servers: '["127.0.0.1", "10.10.0.2"]'
        enable_flow_log: 'true'
        create_flow_log_cloudwatch_log_group: 'true'
        create_flow_log_cloudwatch_iam_role: 'true'
        flow_log_max_aggregation_interval: 60

Workaround addendum: Terraform Modules with sensitive outputs

By design, any Terraform Module will include output declarations to provide data from itself up to the "parent runtime" root Terraform Template. These outputs may contain sensitive data; creating SSH keys with the Terraform Resource tls_private_key is a single example.

For more information, see documentation: https://developer.hashicorp.com/terraform/language/values/outputs#sensitive-suppressing-values-in-cli-output

When an Ansible Task attempts to execute a Terraform Module with one of these sensitive outputs, it will fail with the following (expected, by design) Terraform error:

{
    "changed": false,
    "msg": "
Terraform plan could not be created
STDERR: 
Error: Output refers to sensitive values

  on outputs.tf line 35:
  35: output \"output_goes_here\" {

To reduce the risk of accidentally exporting sensitive data that was intended
to be only internal, Terraform requires that any root module output
containing sensitive data be explicitly marked as sensitive, to confirm your
intent.

If you do intend to export this data, annotate the output value as sensitive
by adding the following argument:
    sensitive = true

COMMAND: /usr/local/bin/terraform plan -lock=true -input=false
-no-color -detailed-exitcode
....
....
"
}

The only crude workaround is to alter, on the fly, all output declarations for the known Terraform Resource that causes the error, for example:


  tasks:

  ....
  ....

  - name: Terraform Module for AWS VPC - Find any subdirectory
    ansible.builtin.find:
      paths: "{{ playbook_dir }}/tmp/terraform-aws-vpc"
      file_type: directory
    register: searched_aws_vpc_module_directories

  - name: Terraform Module for AWS VPC - Find all output files
    ansible.builtin.find:
      paths: "{{ item.path }}/"
      patterns: 'outputs.tf'
      use_regex: true
      file_type: file
    register: searched_aws_vpc_module_files_outputs
    loop: "{{ searched_aws_vpc_module_directories.files }}"

  # Workaround for...
  # "To reduce the risk of accidentally exporting sensitive data that was intended to be only internal,
  #  Terraform requires that any root module output containing sensitive data be explicitly marked as sensitive, to confirm your intent."
  - name: Terraform Module for AWS VPC - Workaround for Terraform Outputs that are sensitive
    ansible.builtin.shell: awk '$0 ~ /private_key_pem/ { print "  " $1 " " $2 " nonsensitive(" $3 ")";getline}1' {{ item.files[0].path }} > $PWD/tmp.tf && mv $PWD/tmp.tf {{ item.files[0].path }}
#    ansible.builtin.shell: awk '$0 ~ /private_key_pem/ { print $1;$0="  sensitive = false\n}" }1' {{ item.files[0].path }} > $PWD/tmp.tf && mv $PWD/tmp.tf {{ item.files[0].path }}
    loop: "{{ searched_aws_vpc_module_files_outputs.results }}"
    when: item.files | length > 0
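An alternative to the awk rewrite, following Terraform's own suggested fix (sensitive = true) rather than unwrapping with nonsensitive(): if the offending output name is known, mark it sensitive in place before running Terraform. Here "output_goes_here" is a placeholder for the failing output's name and the path is assumed from the tasks above.

```yaml
# Sketch: inject `sensitive = true` into a known output block so that
# `terraform plan` accepts it. "output_goes_here" is a placeholder.
- name: Mark the offending Terraform output as sensitive
  ansible.builtin.replace:
    path: "{{ playbook_dir }}/tmp/terraform-aws-vpc/outputs.tf"
    regexp: '(output "output_goes_here" \{)'
    replace: '\1\n  sensitive = true'
```

The trade-off: the value is then redacted in CLI output, but it remains retrievable by Ansible via the module's registered outputs.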
tima commented 1 year ago

Thanks for your effort on this @sean-freeman. What you are describing is something we considered while scoping out this collection. In the end we deemed it out of scope and stand by that decision. There were a number of reasons contributing to our decision.

The focus of this offering is to support users who have existing Terraform plans they want to integrate into their Ansible workflows. We modeled this solution to be consistent with how we approach other cloud provisioning tools such as AWS CloudFormation.

For dynamic cloud provisioning we recommend using Ansible itself, starting with collections such as amazon.aws, rather than trying to dynamically generate a valid plan for Terraform and hoping it works as intended. We are not retiring native cloud provisioning and management in Ansible with the publication of this collection. In fact, we are committed to doing more to enhance and improve what is offered.

If you really want to do something dynamic with native Terraform functions, an easier and safer means of Ansible-driven dynamism with Terraform is to dynamically generate Terraform variables using the template module and then apply your plan. I realize there are limits to that approach, which leads me to another consideration.
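That variables-first approach could be sketched like this, assuming a Jinja2 template named terraform.tfvars.json.j2 (hypothetical) alongside the playbook; Terraform auto-loads terraform.tfvars.json at plan/apply time:

```yaml
# Sketch: render Terraform input variables from Ansible vars, then apply
# a static, pre-written Terraform plan. The template name is hypothetical.
- name: Render Terraform input variables from Ansible vars
  ansible.builtin.template:
    src: terraform.tfvars.json.j2
    dest: "{{ playbook_dir }}/tf_template/terraform.tfvars.json"

- name: Apply the pre-written Terraform plan with the rendered variables
  cloud.terraform.terraform:
    project_path: "{{ playbook_dir }}/tf_template"
    state: present
    force_init: true
```

The Terraform configuration itself stays static and reviewable; only its inputs vary per run.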

What I would recommend avoiding is complicated integration and interactions in whatever you dynamically generate for Terraform. In our experience, the stability and maintenance of an implementation like that will suffer without providing adequate value for the burden and risk.

This is not something we intend to put into the project roadmap, but I'm going to leave this issue open for now.