GoogleCloudPlatform / terraformer

CLI tool to generate terraform files from existing infrastructure (reverse Terraform). Infrastructure to Code

Only a subset of datadog_logs_custom_pipeline imported #1888

Open · darkn3rd opened 1 week ago

darkn3rd commented 1 week ago

When I use terraformer for the datadog provider, only 2 out of 24 logs_custom_pipeline resources are imported.

STEPS

export DATADOG_HOST="https://app.datadoghq.com/apm/home"
export DATADOG_API_KEY="$(awk -F'"' '/datadog_api_key/{ print $2 }' terraform.tfvars)"
export DATADOG_APP_KEY="$(awk -F'"' '/datadog_app_key/{ print $2 }' terraform.tfvars)"
terraformer import datadog --resources='*' 
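
Note: to narrow the repro, it should also be possible to limit the import to the affected service alone; the service name below is assumed to match the generated/datadog/logs_custom_pipeline directory name:

terraformer import datadog --resources=logs_custom_pipeline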

ACTUAL RESULTS

Only 2 resources imported.

grep ^resource ./generated/datadog/logs_custom_pipeline/logs_custom_pipeline.tf | wc -l
# 2

EXPECTED RESULTS

I expected 24 resources to be imported.

Using the pipelines api, I can get 24 resources using the same key.

curl --silent \
  --request GET "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  --header "Accept: application/json" \
  --header "DD-API-KEY: ${DATADOG_API_KEY}" \
  --header "DD-APPLICATION-KEY: ${DATADOG_APP_KEY}" \
  > output.json

jq -r '.[].name' output.json | wc -l
# 24
darkn3rd commented 1 week ago

Upon further research, pipelines that have is_read_only set to true are not included in the import; only the writable (custom) pipelines are generated. It would be nice to have this behavior documented.
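
Read-only pipelines appear to be Datadog's integration pipelines, which presumably cannot be managed as datadog_logs_custom_pipeline resources. A quick way to confirm the split, using the output.json captured above (this assumes the v1 pipelines response is a JSON array, as in the earlier jq command):

# writable (custom) pipelines -- should match what terraformer imported (2 here)
jq '[.[] | select(.is_read_only | not)] | length' output.json

# read-only pipelines -- the ones skipped by the import (expected 22 of the 24 total here)
jq '[.[] | select(.is_read_only)] | length' output.json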