radekg / terraform-provisioner-ansible

Ansible with Terraform 0.14.x

provisioner not working on GCP #149

Closed: eric-aops closed this issue 4 years ago

eric-aops commented 4 years ago

Steps to reproduce

I use terraform-provisioner-ansible in combination with AWS and it works flawlessly. But we also use GCP as a provider, so I wanted to use it there too. Even though I can run 'remote-exec' commands on the 'private' node through the bastion connection, Ansible playbooks can't seem to connect.

Expected behavior

Run Ansible playbooks the same way they work perfectly on AWS ...

Actual behavior

With the same configuration in place, this is the error that gets thrown:

PLAY [kubernetes] **************************************************************

TASK [Gathering Facts] *********************************************************
fatal: [10.80.8.13]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Warning: Permanently added '10.80.8.13' (ECDSA) to the list of known hosts.\r\neric_activeops_io@10.80.8.13: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).", "unreachable": true}

PLAY RECAP *********************************************************************
10.80.8.13                 : ok=0    changed=0    unreachable=1    failed=0    skipped=0    rescued=0    ignored=0   

...

Configuration

Terraform version: 0.12.18
terraform-provisioner-ansible version/SHA: v2.3.3
Terraform file / provisioner configuration:

  connection {
    host                = self.network_interface.0.network_ip
    type                = "ssh"
    user                = var.ansible_user
    private_key         = file("~/.ssh/google_compute_engine")
    bastion_host        = google_compute_instance.manager.network_interface.0.access_config.0.nat_ip
    bastion_user        = var.ansible_user
    bastion_private_key = file("~/.ssh/google_compute_engine")
  }

  provisioner "remote-exec" {
    inline = [
      "ls -la",
    ]
  }

provisioner "ansible" {
  plays {
    playbook {
      file_path      = "${path.module}/../../aops-ansible/playbooks/gcp/k8s-nodes.yml"
      roles_path     = ["${path.module}/../../aops-ansible/roles"]
      force_handlers = false
    }
    groups        = ["kubernetes"]
    become        = true
    become_method = "sudo"
    become_user   = "root"
  }
  ansible_ssh_settings {
    insecure_no_strict_host_key_checking = var.insecure_no_strict_host_key_checking
  }
}

So with the above connection settings, remote-exec works as expected, but the Ansible provisioner throws the error.

Terraform run log:

eric-aops commented 4 years ago

Hi, I just found the solution to my problem: I needed to set ForwardAgent=yes in my ansible.cfg. But in doing so I'm seriously slowing down the configuration you are using.
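In ansible.cfg terms it looks roughly like this (the [ssh_connection] section and ssh_args are standard Ansible, ForwardAgent is a plain OpenSSH option; repeating the default ControlMaster/ControlPersist options next to it is just my assumption about what you would want to keep):

[ssh_connection]
# Forward the local ssh agent so the same key can authenticate the bastion -> node hop.
# Overriding ssh_args replaces Ansible's defaults, hence the repeated control options (assumed).
ssh_args = -o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s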

Can you add a key to ansible_ssh_settings.go to be able to set this from within the provisioner?

Kind regards,

Eric

PS: I just remembered I can override this with an inventory variable :-). Back to normal speed, and it all works.
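For example, scoped to the provisioned group instead of set globally in ansible.cfg (a sketch only; ansible_ssh_common_args is a standard Ansible connection variable, and the plain INI inventory shown here is illustrative, since the provisioner generates its own inventory):

[kubernetes:vars]
# Agent forwarding only for this group; other hosts keep the default ssh options.
ansible_ssh_common_args=-o ForwardAgent=yes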