darkn3rd opened 1 week ago
When I use terraformer for the datadog provider, only 2 out of 24 logs_custom_pipeline resources are imported.
export DATADOG_HOST="https://app.datadoghq.com/apm/home"
export DATADOG_API_KEY="$(awk -F'"' '/datadog_api_key/{ print $2 }' terraform.tfvars)"
export DATADOG_APP_KEY="$(awk -F'"' '/datadog_app_key/{ print $2 }' terraform.tfvars)"
terraformer import datadog --resources='*'
Only 2 resources imported.
grep ^resource ./generated/datadog/logs_custom_pipeline/logs_custom_pipeline.tf | wc -l # 2
I expected 24 resources to be imported.
Using the pipelines API, I can retrieve all 24 pipelines with the same keys.
curl --silent \
  --request GET "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  --header "Accept: application/json" \
  --header "DD-API-KEY: ${DATADOG_API_KEY}" \
  --header "DD-APPLICATION-KEY: ${DATADOG_APP_KEY}" \
  > output.json
jq -r '.[].name' output.json | wc -l # 24
Upon further research, I found that pipelines with is_read_only set to true are not imported. It would be nice to have this behavior documented.
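To confirm the discrepancy, the skipped pipelines can be counted with jq. This is a sketch using a hypothetical sample response; in practice output.json comes from the curl call above, and the pipeline names here are made up for illustration.

```shell
# Hypothetical sample mirroring the shape of the pipelines API response;
# real data comes from the /api/v1/logs/config/pipelines call above.
cat > output.json <<'EOF'
[
  {"name": "pipeline-a", "is_read_only": true},
  {"name": "pipeline-b", "is_read_only": false},
  {"name": "pipeline-c", "is_read_only": false}
]
EOF

# Total pipelines vs. those marked read-only (which terraformer appears to skip)
jq 'length' output.json                                          # 3
jq '[.[] | select(.is_read_only == true)] | length' output.json  # 1 (skipped)
```

If the read-only count plus the imported count equals the API total, that supports is_read_only being the filter.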