thousandeyes / terraform-provider-thousandeyes

ThousandEyes Terraform Provider

Error: Provider produced inconsistent final plan #138

Closed Jamie-Leon closed 10 months ago

Jamie-Leon commented 1 year ago

We receive the error below when our ThousandEyes module has a dependency on another module which creates an EC2 instance with the agent set up in UserData.

When setting up our tests, we reference a data call to find the agents.

`│ Error: Provider produced inconsistent final plan
│ 
│ When expanding the plan for module.thousandeyes-tests.thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-PRD-IRE[0] to include new values learned so far during apply, provider
│ "registry.terraform.io/thousandeyes/thousandeyes" produced an invalid new value for .agents: planned set element cty.ObjectVal(map[string]cty.Value{"agent_id":cty.UnknownVal(cty.Number),
│ "agent_name":cty.StringVal(""), "agent_state":cty.StringVal(""), "agent_type":cty.UnknownVal(cty.String),
│ "cluster_members":cty.ListValEmpty(cty.Object(map[string]cty.Type{"agent_state":cty.String, "ip_addresses":cty.List(cty.String), "last_seen":cty.String, "member_id":cty.Number,
│ "name":cty.String, "network":cty.String, "prefix":cty.String, "public_ip_addresses":cty.List(cty.String), "target_for_tests":cty.String, "utilization":cty.Number})),
│ "country_id":cty.StringVal(""), "created_date":cty.StringVal(""), "enabled":cty.NullVal(cty.Bool), "error_details":cty.ListValEmpty(cty.Object(map[string]cty.Type{"code":cty.String,
│ "description":cty.String})), "groups":cty.SetValEmpty(cty.Object(map[string]cty.Type{"builtin":cty.Bool, "group_id":cty.Number, "name":cty.String, "type":cty.String})),
│ "hostname":cty.StringVal(""), "ip_addresses":cty.UnknownVal(cty.List(cty.String)), "ipv6_policy":cty.StringVal(""), "keep_browser_cache":cty.NullVal(cty.Bool), "last_seen":cty.StringVal(""),
│ "location":cty.StringVal(""), "network":cty.StringVal(""), "prefix":cty.StringVal(""), "target_for_tests":cty.StringVal(""), "utilization":cty.NullVal(cty.Number),
│ "verify_ssl_certificate":cty.NullVal(cty.Bool)}) does not correlate with any element in actual.
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.`
Jamie-Leon commented 1 year ago

@pedro-te / @adchella-te / @tduzan-te / @te-ak / @pdx-te / @vutnguye-te / @nelson-te / @raul-te / @gaston-te

Is this something you have come across before, or do you have a method of working around it? It is currently forcing us to comment out the module, which deletes the tests, which in turn breaks our ingestion into Elastic.

Jamie-Leon commented 1 year ago

@JTBlanchard @william20111 @johntdyer adding you as well, since you are mentioned in the README.

sfreitas-te commented 1 year ago

Hi @Jamie-Leon, thanks for reporting. We will verify and give you an update as soon as possible.

Jamie-Leon commented 1 year ago

> Hi @Jamie-Leon, thanks for reporting. We will verify and give you an update as soon as possible.

@sfreitas-te is there anything I can provide to make this easier to solve?

Jamie-Leon commented 1 year ago

@sfreitas-te is it possible to get an ETA for this so I can update my team?

sfreitas-te commented 1 year ago

@Jamie-Leon we will provide an ETA early next week

pedro-te commented 1 year ago

Hi @Jamie-Leon ,

I'm sorry to read that you're facing some issues with our terraform provider. In order to better understand what's going on, we're going to need more details. Can you share with us the terraform code that you're using to reproduce this? We'll be able to provide an ETA once we have a better understanding of the issue.

Thank you, Pedro

Jamie-Leon commented 1 year ago

@pedro-te thanks for coming back to me. Here is as much of the configuration as I think is relevant.

The modules/thousandeyes-agent module is just an EC2 instance spinning up and registering with ThousandEyes via user data.
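
Roughly, the agent module boils down to something like the sketch below (variable declarations for the arguments passed in are omitted, and the AMI lookup, instance type, and user data are illustrative placeholders rather than the real values):

data "aws_ami" "ubuntu" {
  most_recent = true
  owners      = ["099720109477"] # Canonical
  filter {
    name   = "name"
    values = ["ubuntu/images/hvm-ssd/ubuntu-jammy-22.04-amd64-server-*"]
  }
}

resource "aws_instance" "te_agent" {
  ami           = data.aws_ami.ubuntu.id
  instance_type = "t3.medium"
  tags          = var.tags

  # Placeholder user data: the real script installs the ThousandEyes
  # Enterprise Agent and registers it with the account group token.
  user_data = <<-EOT
    #!/bin/bash
    echo "install and register the ThousandEyes Enterprise Agent here"
  EOT
}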

Modules are as below

module "thousandeyes-agent-ew-1" {
  count             = var.feature-eu-west-1 > 0 ? 1 : 0
  source            = "../../modules/thousandeyes-agent"
  account-id        = var.account-id
  account-name      = var.account-name
  feature-tea-agent = var.feature-tea-agent
  region            = "eu-west-1"
  tags              = local.common-tags

  providers = {
    aws = aws.eu-west-1
  }
}

module "thousandeyes-agent-ew-2" {
  source            = "../../modules/thousandeyes-agent"
  account-id        = var.account-id
  account-name      = var.account-name
  feature-tea-agent = var.feature-tea-agent
  region            = "eu-west-2"
  tags              = local.common-tags

  providers = {
    aws = aws
  }
}

module "thousandeyes-tests" {
  source       = "../../modules/thousandeyes-tests"
  account-name = var.account-name

  depends_on = [module.thousandeyes-agent-ew-1, module.thousandeyes-agent-ew-2]

  providers = {
    thousandeyes = thousandeyes
  }
}

Tests are configured as below

# DEV TESTS

resource "thousandeyes_label" "Dev" {
  name  = "Cloud - AWS Dev"
  type  = "tests"
  count = var.account-name == "nbs-shared-dev" ? 1 : 0

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-NDC-1M-DEV-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-NDC-1M-DEV-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-STC-1M-DEV-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-STC-1M-DEV-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-DEV-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-DEV-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-STC-AWS-1M-DEV-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-STC-AWS-1M-DEV-IRE[count.index].test_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-NDC-1M-DEV-IRE" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-AWS-NDC-1M-DEV-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.dev-euw1.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-NDC-1M-DEV-LDN" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-AWS-NDC-1M-DEV-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.dev-euw2.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-STC-1M-DEV-IRE" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-AWS-STC-1M-DEV-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.dev-euw1.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-STC-1M-DEV-LDN" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-AWS-STC-1M-DEV-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.dev-euw2.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-NDC-AWS-1M-DEV-IRE" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-NDC-AWS-1M-DEV-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-NDC-AWS-1M-DEV-LDN" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-NDC-AWS-1M-DEV-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-STC-AWS-1M-DEV-IRE" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-STC-AWS-1M-DEV-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-stc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-STC-AWS-1M-DEV-LDN" {
  count                  = var.account-name == "nbs-shared-dev" ? 1 : 0
  test_name              = "N-A2S-STC-AWS-1M-DEV-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-stc-001.agent_id
  }

  provider = thousandeyes
}

# PROD TESTS

resource "thousandeyes_label" "Prod" {
  name  = "Cloud - AWS Prod"
  type  = "tests"
  count = var.account-name == "nbs-shared-prod" ? 1 : 0

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-NDC-1M-PRD-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-NDC-1M-PRD-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-STC-1M-PRD-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-AWS-STC-1M-PRD-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-PRD-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-PRD-IRE[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-STC-AWS-1M-PRD-LDN[count.index].test_id
  }

  tests {
    test_id = thousandeyes_agent_to_server.N-A2S-STC-AWS-1M-PRD-IRE[count.index].test_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-NDC-1M-PRD-IRE" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-AWS-NDC-1M-PRD-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.prod-euw1.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-NDC-1M-PRD-LDN" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-AWS-NDC-1M-PRD-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.prod-euw2.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-STC-1M-PRD-IRE" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-AWS-STC-1M-PRD-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.prod-euw1.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-AWS-STC-1M-PRD-LDN" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-AWS-STC-1M-PRD-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.prod-euw2.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-NDC-AWS-1M-PRD-IRE" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-NDC-AWS-1M-PRD-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-NDC-AWS-1M-PRD-LDN" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-NDC-AWS-1M-PRD-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-STC-AWS-1M-PRD-IRE" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-STC-AWS-1M-PRD-IRE"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-stc-001.agent_id
  }

  provider = thousandeyes
}

resource "thousandeyes_agent_to_server" "N-A2S-STC-AWS-1M-PRD-LDN" {
  count                  = var.account-name == "nbs-shared-prod" ? 1 : 0
  test_name              = "N-A2S-STC-AWS-1M-PRD-LDN"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "XXXXXX"
  port   = 49153

  agents {
    agent_id = data.thousandeyes_agent.nbs-h-tea-stc-001.agent_id
  }

  provider = thousandeyes
}

resource "time_sleep" "wait_2m" {
  create_duration = "2m"
  triggers = {
    always_run = "${timestamp()}"
  }
}
pedro-te commented 1 year ago

Hi @Jamie-Leon ,

Thank you for getting back to us with the Terraform code. Could you also share the thousandeyes_agent data source definitions that these tests reference?

I'm still not sure what's going on, but at first glance it seems that data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id contains an unexpected value that prevents that specific resource from producing a valid plan. The error message references .agents specifically, so that's the only clue for now.

Thank you for your patience and for working with us on debugging this.

Cheers, Pedro

Jamie-Leon commented 1 year ago

@pedro-te Here are all the data resources

data "thousandeyes_agent" "dev-euw1" {
  agent_name = "nbs-shared-dev-eu-west-1-aws-thousandeyes-agent"

  depends_on = [time_sleep.wait_2m]
}

data "thousandeyes_agent" "dev-euw2" {
  agent_name = "nbs-shared-dev-eu-west-2-aws-thousandeyes-agent"

  depends_on = [time_sleep.wait_2m]
}

data "thousandeyes_agent" "nbs-h-tea-ndc-001" {
  agent_name = "nbs-h-tea-ndc-001"

  depends_on = [time_sleep.wait_2m]
}

data "thousandeyes_agent" "nbs-h-tea-stc-001" {
  agent_name = "nbs-h-tea-stc-001"

  depends_on = [time_sleep.wait_2m]
}

data "thousandeyes_agent" "prod-euw1" {
  agent_name = "nbs-shared-prod-eu-west-1-aws-thousandeyes-agent"

  depends_on = [time_sleep.wait_2m]
}

data "thousandeyes_agent" "prod-euw2" {
  agent_name = "nbs-shared-prod-eu-west-2-aws-thousandeyes-agent"

  depends_on = [time_sleep.wait_2m]
}

The agents take a little while to register, which is the reason I have put a 2-minute sleep in.

If I remove thousandeyes_agent_to_server.N-A2S-NDC-AWS-1M-PRD-IRE[0] from the tests module, the same error is then reported for the next test.
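
For what it's worth, one quick way to see what the data source actually resolves to at apply time is to surface it as an output (the output name here is just illustrative):

output "ndc_agent_id_debug" {
  value = data.thousandeyes_agent.nbs-h-tea-ndc-001.agent_id
}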

pedro-te commented 1 year ago

Hi @Jamie-Leon ,

Thank you for your patience. I wasn't able to reproduce the issue locally. I don't have the contents of the thousandeyes-agent-ew-1 module to reproduce exactly what you did with the creation of the ThousandEyes agent in an EC2 instance, so I used a simple docker container.

This is a very basic example that tries to emulate what you're doing, I think:

main.tf

terraform {
  required_providers {
    thousandeyes = {
      source = "thousandeyes/thousandeyes"
      version = "~>2.0.1"
    }
  }
}

provider "thousandeyes" {
  token            = var.token
  account_group_id = var.account_group_id
}
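
# For completeness: the provider block above references two input variables
# that were not shown in the original comment. Hypothetical declarations
# (the types are assumptions):
variable "token" {
  type      = string
  sensitive = true
}

variable "account_group_id" {
  type = string
}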

module "thousandeyes-agent" {
  source = "./modules/agent"
}

module "thousandeyes-tests" {
  source = "./modules/tests"
  depends_on = [ module.thousandeyes-agent ]
}

./modules/agent:

terraform {
  required_providers {
    docker = {
      source = "kreuzwerker/docker"
      version = "~> 3.0.1"
    }
  }
}

resource "docker_image" "agent" {
  name         = "thousandeyes/enterprise-agent:latest"
  keep_locally = true
}

resource "docker_container" "agent" {
  image = docker_image.agent.image_id
  name  = "test-dynamic-agent"
  hostname = "test-dynamic-agent"
  memory = "2048"
  memory_swap = "4096"
  attach = false
  tty = true
  shm_size = 512
  env = ["TEAGENT_ACCOUNT_TOKEN=<token>", "TEAGENT_INET="]
  capabilities {
    add = ["NET_ADMIN", "SYS_ADMIN"]
  }
  command = ["/sbin/my_init"]
}

./modules/tests:

terraform {
  required_providers {
    thousandeyes = {
      source = "thousandeyes/thousandeyes"
      version = "~>2.0.1"
    }
  }
}

resource "time_sleep" "wait_2m" {
  create_duration = "2m"
  triggers = {
    always_run = "${timestamp()}"
  }
}

data "thousandeyes_agent" "test-dynamic-agent" {
  agent_name = "test-dynamic-agent"

  depends_on = [ time_sleep.wait_2m ]
}

resource "thousandeyes_agent_to_server" "github-issue-138" {
  count                  = 1
  test_name              = "test github issue 138"
  interval               = 60
  alerts_enabled         = false
  use_public_bgp         = false
  bandwidth_measurements = true
  network_measurements   = true

  server = "www.thousandeyes.com"
  port   = 443

  agents {
    agent_id = data.thousandeyes_agent.test-dynamic-agent.agent_id
  }
}

The Docker container with the agent is created first and registers with ThousandEyes successfully while the 2-minute timer runs. After the timer finishes, the data resource reads the agent correctly and the test is created:

Terraform will perform the following actions:

  # module.thousandeyes-agent.docker_container.agent will be created
  + resource "docker_container" "agent" {
      + attach                                      = false
      + bridge                                      = (known after apply)
      + command                                     = [
          + "/sbin/my_init",
        ]
      + container_logs                              = (known after apply)
      + container_read_refresh_timeout_milliseconds = 15000
      + entrypoint                                  = (known after apply)
      + env                                         = [
          + "TEAGENT_ACCOUNT_TOKEN=<token>",
          + "TEAGENT_INET=",
        ]
      + exit_code                                   = (known after apply)
      + hostname                                    = "test-dynamic-agent"
      + id                                          = (known after apply)
      + image                                       = (known after apply)
      + init                                        = (known after apply)
      + ipc_mode                                    = (known after apply)
      + log_driver                                  = (known after apply)
      + logs                                        = false
      + memory                                      = 2048
      + memory_swap                                 = 4096
      + must_run                                    = true
      + name                                        = "test-dynamic-agent"
      + network_data                                = (known after apply)
      + read_only                                   = false
      + remove_volumes                              = true
      + restart                                     = "no"
      + rm                                          = false
      + runtime                                     = (known after apply)
      + security_opts                               = (known after apply)
      + shm_size                                    = 512
      + start                                       = true
      + stdin_open                                  = false
      + stop_signal                                 = (known after apply)
      + stop_timeout                                = (known after apply)
      + tty                                         = true
      + wait                                        = false
      + wait_timeout                                = 60

      + capabilities {
          + add  = [
              + "NET_ADMIN",
              + "SYS_ADMIN",
            ]
          + drop = []
        }
    }

  # module.thousandeyes-agent.docker_image.agent will be created
  + resource "docker_image" "agent" {
      + id           = (known after apply)
      + image_id     = (known after apply)
      + keep_locally = true
      + name         = "thousandeyes/enterprise-agent:latest"
      + repo_digest  = (known after apply)
    }

  # module.thousandeyes-tests.data.thousandeyes_agent.test-dynamic-agent will be read during apply
  # (depends on a resource or a module with changes pending)
 <= data "thousandeyes_agent" "test-dynamic-agent" {
      + agent_id   = (known after apply)
      + agent_name = "test-dynamic-agent"
      + agent_type = (known after apply)
      + id         = (known after apply)
    }

  # module.thousandeyes-tests.thousandeyes_agent_to_server.github-issue-138[0] will be created
  + resource "thousandeyes_agent_to_server" "github-issue-138" {
      + alerts_enabled         = false
      + api_links              = (known after apply)
      + bandwidth_measurements = true
      + created_by             = (known after apply)
      + created_date           = (known after apply)
      + enabled                = true
      + groups                 = (known after apply)
      + id                     = (known after apply)
      + interval               = 60
      + live_share             = (known after apply)
      + modified_by            = (known after apply)
      + modified_date          = (known after apply)
      + network_measurements   = true
      + path_trace_mode        = "classic"
      + port                   = 443
      + probe_mode             = "AUTO"
      + protocol               = "TCP"
      + saved_event            = (known after apply)
      + server                 = "www.thousandeyes.com"
      + test_id                = (known after apply)
      + test_name              = "test github issue 138"
      + type                   = (known after apply)
      + use_public_bgp         = false

      + agents {
          + agent_id     = (known after apply)
          + agent_type   = (known after apply)
          + ip_addresses = (known after apply)
        }
    }

  # module.thousandeyes-tests.time_sleep.wait_2m will be created
  + resource "time_sleep" "wait_2m" {
      + create_duration = "2m"
      + id              = (known after apply)
      + triggers        = {
          + "always_run" = (known after apply)
        }
    }

Plan: 4 to add, 0 to change, 0 to destroy.

Do you want to perform these actions?
  Terraform will perform the actions described above.
  Only 'yes' will be accepted to approve.

  Enter a value: yes

module.thousandeyes-agent.docker_image.agent: Creating...
module.thousandeyes-agent.docker_image.agent: Creation complete after 0s [id=sha256:70eb238ec6a10487a4ad6c244607557ddd8bf91afdac5ce9d8d5cdabf64582dathousandeyes/enterprise-agent:latest]
module.thousandeyes-agent.docker_container.agent: Creating...
module.thousandeyes-agent.docker_container.agent: Creation complete after 0s [id=32d04ec73272a0aedf2ac25325665a68dd0568770992d3d67ad19043462e5ac7]
module.thousandeyes-tests.time_sleep.wait_2m: Creating...
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [10s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [20s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [30s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [40s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [50s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m0s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m10s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m20s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m30s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m40s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [1m50s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Still creating... [2m0s elapsed]
module.thousandeyes-tests.time_sleep.wait_2m: Creation complete after 2m0s [id=2023-07-03T13:22:52Z]
module.thousandeyes-tests.data.thousandeyes_agent.test-dynamic-agent: Reading...
module.thousandeyes-tests.data.thousandeyes_agent.test-dynamic-agent: Read complete after 2s [id=0x14000115e00]
module.thousandeyes-tests.thousandeyes_agent_to_server.github-issue-138[0]: Creating...
module.thousandeyes-tests.thousandeyes_agent_to_server.github-issue-138[0]: Creation complete after 1s [id=3832827]

What are the Terraform and provider versions that you're using? Also, can you put together a simple end-to-end example that triggers the issue you're seeing and share it with us so that we can reproduce it?
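
As a side note, the version details can usually be gathered with the standard CLI commands below; terraform version prints the Terraform release (and, after init, the installed provider versions), while terraform providers lists the provider requirements of the configuration:

terraform version
terraform providers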

Thank you, Pedro

Jamie-Leon commented 1 year ago

@pedro-te is it possible to have a call with you at some point? I have an official case open with ThousandEyes through our account: S-CS-0713875.

It would most likely be easier to walk you through our code and share it that way.

ZahwaarHussain commented 1 year ago

Hi @pedro-te / @adchella-te / @tduzan-te / @te-ak / @pdx-te / @vutnguye-te / @nelson-te / @raul-te / @gaston-te

Are you able to provide an ETA for this matter now? Thanks

ZahwaarHussain commented 1 year ago

@sfreitas-te Is there an ETA for this at all? Thanks!

sfreitas-te commented 1 year ago

@ZahwaarHussain @Jamie-Leon I will schedule a call to see if we can fully understand the issue

pedro-te commented 1 year ago

Hi @ZahwaarHussain ,

I apologise for how long it took us to get back to you and for any inconvenience this issue might have caused you. We have released a new version of the provider which should address the issue you reported.
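
For anyone picking this up, moving to the new release generally means bumping the provider constraint and re-initializing; a sketch (the version constraint below is a placeholder, since the exact release number is not recorded in this thread):

terraform {
  required_providers {
    thousandeyes = {
      source  = "thousandeyes/thousandeyes"
      version = ">= 2.0.2" # placeholder; use the actual released version
    }
  }
}

Then run terraform init -upgrade to fetch the newer provider.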

Can you give this new version a try and let us know, please?

Thank you, Pedro

ZahwaarHussain commented 1 year ago

@pedro-te Thank you very much for looking into this. I've tested this afternoon and no longer face the issue. You can close this one off. Thanks!

pedro-te commented 1 year ago

> @pedro-te Thank you very much for looking into this. I've tested this afternoon and no longer face the issue. You can close this one off. Thanks!

Awesome! Glad to know. Thanks for the feedback @ZahwaarHussain

Cheers, Pedro