ansible / terraform-provider-ansible

community terraform provider for ansible
https://registry.terraform.io/providers/ansible/ansible/latest
GNU General Public License v3.0

[BUG] ansible_playbook can't handle extra_vars changing #98

Open irishgordo opened 8 months ago

irishgordo commented 8 months ago

Hi all, this project is awesome! Thank you so much for the recent v1.2.0 release! I have been running into a bit of an issue on the v1.1.0 release and haven't updated yet, but I believe the issue may still persist.

Description When provisioning an ansible_playbook TF resource with replayable set to true, and the underlying extra_vars change, the ansible_playbook fails to re-run.


╷
│ Error: Provider produced inconsistent final plan
│ 
│ When expanding the plan for ansible_playbook.fileserver-vm-ansible-playbook to include new values learned so far during apply, provider
│ "registry.terraform.io/ansible/ansible" produced an invalid new value for .extra_vars["fileserver_server_ip"]: was cty.StringVal("192.168.104.62"),
│ but now cty.StringVal("fe80::ff:fe94:52c3").
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.
╵
╷
│ Error: Provider produced inconsistent final plan
│ 
│ When expanding the plan for ansible_playbook.fileserver-vm-ansible-playbook to include new values learned so far during apply, provider
│ "registry.terraform.io/ansible/ansible" produced an invalid new value for .name: inconsistent values for sensitive attribute.
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.

Reproduce

Have some extra_vars defined. Provision with replayable set to true. Change those extra_vars. Re-provision with replayable set to true.
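A minimal config along these lines (the resource, variable, and playbook names here are made up for illustration, not from my actual setup) would be:

```hcl
variable "server_ip" {
  type = string
}

resource "ansible_playbook" "repro" {
  playbook   = "ansible/site.yaml"
  name       = "repro-host"
  replayable = true

  extra_vars = {
    # Change this value between applies to trigger the error
    server_ip = var.server_ip
  }
}
```

Apply once with, e.g., -var='server_ip=192.0.2.1', then again with a different value.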

Workaround

I've found that re-running it after it fails fixes it, but it always seems to fail the first time.
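In script form, the workaround amounts to something like this (assuming -auto-approve is acceptable in your workflow):

```shell
# First apply may fail with "inconsistent final plan";
# the immediate re-run succeeds.
terraform apply -auto-approve || terraform apply -auto-approve
```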

irishgordo commented 8 months ago

Noting that it has happened on v1.2.0 too:

2024-03-08T19:29:16.226-0800 [DEBUG] provider: plugin process exited: path=.terraform/providers/registry.terraform.io/ansible/ansible/1.2.0/linux_amd64/terraform-provider-ansible_v1.2.0 pid=1233802
gravesm commented 7 months ago

@irishgordo Can you provide a minimal terraform config to reproduce this? I am able to change extra_vars on subsequent applies without error.

irishgordo commented 4 months ago

@gravesm so sorry for the late reply! This absolutely slipped through the cracks.

On the extra_vars portion, they're shifting, but that's also because the playbook itself is shifting.

╷
│ Error: Provider produced inconsistent final plan
│ 
│ When expanding the plan for ansible_playbook.special-vm-ansible-playbook to include new values learned so far during apply, provider "registry.terraform.io/ansible/ansible" produced an invalid
│ new value for .name: inconsistent values for sensitive attribute.
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.
╵
╷
│ Error: Provider produced inconsistent final plan
│ 
│ When expanding the plan for ansible_playbook.special-vm-ansible-playbook to include new values learned so far during apply, provider "registry.terraform.io/ansible/ansible" produced an invalid
│ new value for .extra_vars["vm_server_ip"]: was cty.StringVal("A.B.C.D"), but now cty.StringVal("A.B.C.E").
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.
╵
2024-07-17T17:55:10.414-0700 [DEBUG] provider.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = error reading from server: EOF"
2024-07-17T17:55:10.415-0700 [INFO]  provider: plugin process exited: plugin=.terraform/providers/registry.terraform.io/ansible/ansible/1.3.0/linux_amd64/terraform-provider-ansible_v1.3.0 id=899186

(I've slightly modified the message.)

I think the gist is (on v1.3.0) that we're leveraging a ternary-based local variable:


locals {
  playbook-to-use = var.PRE_AIRGAP == true ? "ansible/pre-airgap.yaml" : "ansible/airgap.yaml"
}

resource "ansible_playbook" "vm-ansible-playbook" {
  playbook = local.playbook-to-use
  # Inventory configuration
  name = "${var.VM_NAME} ansible_password=${var.VM_PASSWORD} ansible_host=${var.IP_ADDRESS} ansible_sudo_pass=${var.VM_PASSWORD} ansible_ssh_user=ubuntu ansible_ssh_common_args='-o ControlMaster=auto -o ControlPersist=60s -o UserKnownHostsFile=/dev/null' ansible_ssh_extra_args='-o StrictHostKeyChecking=no'"

  check_mode              = false
  diff_mode               = false
  ignore_playbook_failure = false

  replayable = true
  verbosity  = 6
}

And then we call Terraform to build out the resource like:

TF_LOG=TRACE terraform apply -var='PRE_AIRGAP=true' -auto-approve

Then the Ansible playbook, in this use case "pre-airgap", is applied, and all network-dependent items/configuration are set up with Terraform and Ansible.

After all that is set up, the VM's IP address will have changed (though values like var.VM_PASSWORD can still be interpolated), and then we just call:

TF_LOG=TRACE terraform apply -var='PRE_AIRGAP=false' -auto-approve

This changes the underlying "playbook" property in the ansible_playbook resource, since it now points to the "airgap" playbook we want to run.

When we run that, what I notice is that the first "iteration" fails and throws the "inconsistent plan" output.

But then re-running it:

TF_LOG=TRACE terraform apply -var='PRE_AIRGAP=false' -auto-approve

It does succeed; it just fails the first time we change the underlying resource's playbook.

So currently, I'm working to implement a solution where we'll have something like a retry count, catching the failure the first time and retrying; if it fails a second time, that's a real issue, but it "should" :tm: :laughing: -- succeed.
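A shell-level sketch of that retry (this wrapper is my own, not part of the provider; the terraform invocation in the comment is just an example):

```shell
#!/bin/sh
# Run a command, retrying once if the first attempt fails, since the
# inconsistent-plan error clears on the second apply. Fails for real
# if both attempts fail.
apply_with_retry() {
  max_attempts=2
  attempt=1
  while [ "$attempt" -le "$max_attempts" ]; do
    if "$@"; then
      return 0
    fi
    echo "attempt $attempt of $max_attempts failed: $*" >&2
    attempt=$((attempt + 1))
  done
  return 1
}

# Usage (e.g. in CI):
#   apply_with_retry terraform apply -var='PRE_AIRGAP=false' -auto-approve
```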

Maybe this is an anti-pattern in Terraform with the Ansible resource, but I just couldn't think of another way to accomplish what we were trying to do across the overall series of ~10 integrations that needed to be pre-airgapped, then airgapped.

Again this is a phenomenal tool and thank you a ton for your help on the project! The v1.3.0 release was awesome to see too :+1: :smile: !