scottwinkler / terraform-provider-shell

Terraform provider for executing shell commands and saving output to state file
Mozilla Public License 2.0
279 stars 61 forks

JSON string detection picks nested structures, so state is missing data #32

Closed lawrencegripper closed 4 years ago

lawrencegripper commented 4 years ago

Hi,

First up, awesome work on the provider it's super useful!

I've got a scenario where the read is returning a complex object like the one below. It looks like the JSON string detection picks the last valid JSON string, so you end up with just `{"spark.speculation": "true"}` in the state file.

Not sure if this is me doing something wrong; I'll carry on playing.

{
  "last_state_loss_time": 0,
  "spark_version": "5.3.x-scala2.11",
  "azure_attributes": {},
  "state": "PENDING",
  "enable_elastic_disk": true,
  "init_scripts_safe_mode": false,
  "num_workers": 1,
  "driver_node_type_id": "Standard_D3_v2",
  "default_tags": {
    "Creator": "lagripp@microsoft.com",
    "ClusterName": "my-cluster",
    "ClusterId": "0327-174802-howdy690",
    "Vendor": "Databricks"
  },
  "creator_user_name": "lagripp@microsoft.com",
  "cluster_id": "0327-174802-howdy690",
  "cluster_name": "my-cluster",
  "node_type_id": "Standard_D3_v2",
  "state_message": "Finding instances for new nodes, acquiring more instances if necessary",
  "enable_local_disk_encryption": false,
  "autotermination_minutes": 0,
  "cluster_source": "API",
  "start_time": 1585331282783,
  "spark_conf": {
    "spark.speculation": "true"
  }
}

Full logs

[DEBUG] shell script going to execute: /bin/sh -c
2020-03-27T17:48:04.198Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:04    pwsh ./scripts/read.ps1
2020-03-27T17:48:04.198Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:04 -------------------------
2020-03-27T17:48:04.198Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:04 [DEBUG] Starting execution...
2020-03-27T17:48:04.198Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:04 -------------------------
2020-03-27T17:48:05.331Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05   {   "last_state_loss_time": 0,   "spark_version": "5.3.x-scala2.11",   "azure_attributes": {},   "state": "PENDING",   "enable_elastic_disk": true,   "init_scripts_safe_mode": false,   "num_workers": 1,   "driver_node_type_id": "Standard_D3_v2",   "default_tags": {     "Creator": "lagripp@microsoft.com",     "ClusterName": "my-cluster",     "ClusterId": "0327-174802-howdy690",     "Vendor": "Databricks"   },   "creator_user_name": "lagripp@microsoft.com",   "cluster_id": "0327-174802-howdy690",   "cluster_name": "my-cluster",   "node_type_id": "Standard_D3_v2",   "state_message": "Finding instances for new nodes, acquiring more instances if necessary",   "enable_local_disk_encryption": false,   "autotermination_minutes": 0,   "cluster_source": "API",   "start_time": 1585331282783,   "spark_conf": {     "spark.speculation": "true"   } }
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05 -------------------------
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05 [DEBUG] Command execution completed:
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05 -------------------------
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05 [DEBUG] JSON strings found: 
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: [{   "last_state_loss_time": 0,   "spark_version": "5.3.x-scala2.11",   "azure_attributes": {} {     "Creator": "lagripp@microsoft.com",     "ClusterName": "my-cluster",     "ClusterId": "0327-174802-howdy690",     "Vendor": "Databricks"   } {     "spark.speculation": "true"   }]
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell: 2020/03/27 17:48:05 [DEBUG] Valid map[string]string:
2020-03-27T17:48:05.337Z [DEBUG] plugin.terraform-provider-shell:  map[spark.speculation:true]

State file

"instances": [
        {
          "schema_version": 0,
          "attributes": {
            "dirty": false,
            "environment": {
              "DATABRICKS_HOST": "https://REMOVED.azuredatabricks.net",
              "DATABRICKS_TOKEN": "REMOVED",
              "machine_sku": "Standard_D3_v2",
              "worker_nodes": "8"
            },
            "id": "bpv43aok1sip412ndec0",
            "lifecycle_commands": [
              {
                "create": "pwsh ./scripts/create.ps1",
                "delete": "pwsh ./scripts/delete.ps1",
                "read": "pwsh ./scripts/read.ps1",
                "update": "pwsh ./scripts/update.ps1"
              }
            ],
            "output": {
              "spark.speculation": "true"
            },
            "triggers": null,
            "working_directory": "."
          },
          "private": "bnVsbA=="
        }
      ]
scottwinkler commented 4 years ago

Oh yeah, that's a bug. Thanks for filing this issue. I will have to think about how to fix this. In the meantime, I would recommend using jq or something to create a new JSON object to avoid this problem.
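For anyone hitting this before the fix, the jq workaround can look something like the sketch below. The field selection here is illustrative, not the exact script from this thread: it collapses the nested Databricks cluster JSON into a single flat map of strings, so the provider's JSON detection only ever sees one object.

```shell
# Illustrative sketch of the jq workaround: flatten the nested cluster JSON
# (shortened here) into one flat map of strings before the provider sees it.
echo '{
  "cluster_id": "0327-174802-howdy690",
  "state": "PENDING",
  "spark_conf": { "spark.speculation": "true" }
}' | jq -c '{
  cluster_id: .cluster_id,
  state: .state,
  "spark.speculation": .spark_conf["spark.speculation"]
}'
# → {"cluster_id":"0327-174802-howdy690","state":"PENDING","spark.speculation":"true"}
```

In a read script you would pipe the API response through a filter like this as the last step, so stdout contains exactly one flat JSON object.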

scottwinkler commented 4 years ago

@lawrencegripper I added support for complex nested JSON. I also fixed the weird bug you were getting. Let me know what you think.

lawrencegripper commented 4 years ago

Awesome, thanks - I got things playing nice with the jq workaround, and I'll try to give the PR a spin today.

scottwinkler commented 4 years ago

Released as part of v1.1.0. If there is a bug please file a new issue.

lawrencegripper commented 4 years ago

Thanks, and sorry about not checking yesterday; my days are a bit more confused than normal at the moment.

Had a play this morning and it worked well! Awesome work and thanks for the super quick fix!