GoogleCloudPlatform / pbmm-on-gcp-onboarding

GCP Canadian Public Sector Landing Zone overlay on top of the TEF via CFT modules - a secure cloud foundation
https://cloud.google.com/architecture/security-foundations
Apache License 2.0

Routing logs from log sinks to 3rd party software such as Splunk using PubSub #318

Closed: obriensystems closed this issue 7 months ago

obriensystems commented 1 year ago

The use case is routing logs to 3rd-party software on-prem.

Add the centralized logging project bucket as a logging sink target for workloads: https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/blob/main/solutions/core-landing-zone/lz-folder/audits/logging-project/cloud-logging-buckets.yaml#L41
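As a rough sketch of that routing (the variable names, sink name, and bucket ID below are hypothetical placeholders, not this repo's module code), each workload project would get a project sink pointing at the central log bucket, plus a Logs Bucket Writer grant for the sink's writer identity on the logging project:

# Sketch only - hypothetical variables and names.
resource "google_logging_project_sink" "workload_to_central_bucket" {
  name    = "workload-to-central-logging"
  project = var.workload_project_id

  # Centralized log bucket in the logging/audit project.
  destination = "logging.googleapis.com/projects/${var.logging_project_id}/locations/${var.region}/buckets/central-logging-bucket"

  filter                 = "severity >= INFO"
  unique_writer_identity = true
}

# The sink's writer identity needs Logs Bucket Writer on the destination project.
resource "google_project_iam_member" "central_bucket_writer" {
  project = var.logging_project_id
  role    = "roles/logging.bucketWriter"
  member  = google_logging_project_sink.workload_to_central_bucket.writer_identity
}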

Follow the guidance below:

- https://cloud.google.com/logging/docs/routing/overview#sinks
- https://cloud.google.com/logging/docs/export/aggregated_sinks
- https://cloud.google.com/logging/docs/export/pubsub
- https://cloud.google.com/storage/docs/pubsub-notifications
- https://cloud.google.com/architecture/monitoring

Splunk Log Sink and Filtering:

- https://cloud.google.com/architecture/stream-logs-from-google-cloud-to-splunk
- https://registry.terraform.io/modules/terraform-google-modules/log-export/google/latest/examples/splunk-sink

https://cloud.google.com/architecture/security-foundations/logging-monitoring
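As a rough sketch of the Pub/Sub leg of the Splunk path from the references above (topic/subscription names and the audit project variable are hypothetical placeholders):

# Sketch only - the topic the log sinks will publish to.
resource "google_pubsub_topic" "org_log_export" {
  name    = "org-log-export"
  project = var.audit_project_id
}

# Pull subscription consumed by the Splunk Add-on for Google Cloud, or by a
# Dataflow pipeline that forwards to the Splunk HEC endpoint.
resource "google_pubsub_subscription" "org_log_export_splunk" {
  name    = "org-log-export-splunk"
  project = var.audit_project_id
  topic   = google_pubsub_topic.org_log_export.id

  ack_deadline_seconds       = 60
  message_retention_duration = "86400s" # keep a day of backlog if Splunk is unreachable
}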

20231015: work is starting on the older branch v20230917, adding an alternate org sink until the main branch is stabilized: https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/releases/tag/v20230917 - see #332

Branching off the 332 branch for PRs related to log sinks.

Screenshot 2023-10-15 at 19 18 35

use

git checkout 318-log-sink-alerting

Diff against the 332 branch: https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/pull/334/files

Diff against the main branch: https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/pull/333/files

Architecture

Current state

{"insertId":"1nnt445c5pn","logName":"organizations/131880894992/logs/cloudaudit.googleapis.com%2Factivity","protoPayload":{"@type":"type.googleapis.com/google.cloud.audit.AuditLog","authenticationInfo":{"principalEmail":"tfsa0131@tzpe-tlz-tlz-de.iam.gserviceaccount.com","serviceAccountDelegationInfo":[{"firstPartyPrincipal":{"principalEmail":"407485305437@cloudbuild.gserviceaccount.com"}},{"firstPartyPrincipal":{"principalEmail":"cloud-build-argo-foreman@prod.google.com"}}]},"authorizationInfo":[{"granted":true,"permission":"accesscontextmanager.policies.create","resource":"organizations/131880894992","resourceAttributes":{}}],"metadata":{"idaasCustomerId":"C02e6x9c7","userId":"101414391001446101002"},"methodName":"google.identity.accesscontextmanager.v1.AccessContextManager.CreateAccessPolicy","request":{"@type":"type.googleapis.com/google.identity.accesscontextmanager.v1.AccessPolicy","parent":"organizations/131880894992"},"requestMetadata":{"callerIp":"34.67.85.47","callerSuppliedUserAgent":"Terraform/1.0.10 (+https://www.terraform.io) Terraform-Plugin-SDK/2.5.0 terraform-provider-google/3.90.1,gzip(gfe),gzip(gfe)","destinationAttributes":{},"requestAttributes":{"auth":{},"time":"2023-01-31T17:03:06.538767Z"}},"resourceName":"organizations/131880894992","serviceName":"accesscontextmanager.googleapis.com","status":{"code":6,"message":"Policy already exists with parent organizations/131880894992"}},"receiveTimestamp":"2023-01-31T17:03:06.933885627Z","resource":{"labels":{"method":"google.identity.accesscontextmanager.v1.AccessContextManager.CreateAccessPolicy","service":"accesscontextmanager.googleapis.com"},"type":"audited_resource"},"severity":"ERROR","timestamp":"2023-01-31T17:03:06.458004Z"}

or a later PSC endpoint entry from the GCS audit export: https://storage.cloud.google.com/tzpeaudittlz/cloudaudit.googleapis.com/activity/2023/09/06/14%3A00%3A00_14%3A59%3A59_S1.json

{"insertId":"j59qz8e92zgy","logName":"projects/tzpe-tlz-tlzprod-host4/logs/cloudaudit.googleapis.com%2Factivity","protoPayload":{"@type":"type.googleapis.com/google.cloud.audit.AuditLog","authenticationInfo":{"principalEmail":"tfsa0131@tzpe-tlz-tlz-de.iam.gserviceaccount.com","principalSubject":"serviceAccount:tfsa0131@tzpe-tlz-tlz-de.iam.gserviceaccount.com","serviceAccountDelegationInfo":[{"firstPartyPrincipal":{"principalEmail":"407485305437@cloudbuild.gserviceaccount.com"}},{"firstPartyPrincipal":{"principalEmail":"cloud-build-argo-foreman@prod.google.com"}}]},"authorizationInfo":[{"granted":true,"permission":"compute.globalForwardingRules.create","resourceAttributes":{"name":"projects/tzpe-tlz-tlzprod-host4/global/forwardingRules/psc-incoming","service":"compute","type":"compute.globalForwardingRules"}},{"granted":true,"permission":"compute.globalForwardingRules.pscCreate","resourceAttributes":{"name":"projects/tzpe-tlz-tlzprod-host4/global/forwardingRules/psc-incoming","service":"compute","type":"compute.globalForwardingRules"}},{"granted":true,"permission":"compute.networks.use","resourceAttributes":{"name":"projects/tzpe-tlz-tlzprod-host4/global/networks/tzpecnr-tlzprod-svpc-vpc","service":"compute","type":"compute.networks"}},{"granted":true,"permission":"compute.globalAddresses.use","resourceAttributes":{"name":"projects/tzpe-tlz-tlzprod-host4/global/addresses/global-psconnect-ip","service":"compute","type":"compute.globalAddresses"}}],"methodName":"beta.compute.globalForwardingRules.insert","request":{"@type":"type.googleapis.com/compute.globalForwardingRules.insert","IPAddress":"projects/tzpe-tlz-tlzprod-host4/global/addresses/global-psconnect-ip","name":"psc-incoming","network":"projects/tzpe-tlz-tlzprod-host4/global/networks/tzpecnr-tlzprod-svpc-vpc","target":"all-apis"},"requestMetadata":{"callerIp":"34.27.112.162","callerSuppliedUserAgent":"Terraform/1.0.10 (+https://www.terraform.io) Terraform-Plugin-SDK/2.10.1 terraform-provider-google-beta/4.81.0 blueprints/terraform/terraform-google-network:private-service-connect/v7.3.0,gzip(gfe)","destinationAttributes":{},"requestAttributes":{"auth":{},"time":"2023-09-06T14:08:18.136012Z"}},"resourceLocation":{"currentLocations":["global"]},"resourceName":"projects/tzpe-tlz-tlzprod-host4/global/forwardingRules/psc-incoming","response":{"@type":"type.googleapis.com/error","error":{"code":400,"errors":[{"domain":"global","message":"Invalid value for field 'resource.name': 'psc-incoming'. The forwarding rule name for PSC Google APIs must be an 1-20 characters string with lowercase letters and numbers and must start with a letter.","reason":"invalid"}],"message":"Invalid value for field 'resource.name': 'psc-incoming'. The forwarding rule name for PSC Google APIs must be an 1-20 characters string with lowercase letters and numbers and must start with a letter."}},"serviceName":"compute.googleapis.com","status":{"code":3,"message":"Invalid value for field 'resource.name': 'psc-incoming'. The forwarding rule name for PSC Google APIs must be an 1-20 characters string with lowercase letters and numbers and must start with a letter."}},"receiveTimestamp":"2023-09-06T14:08:18.670124354Z","resource":{"labels":{"forwarding_rule_id":"","project_id":"tzpe-tlz-tlzprod-host4","region":"global"},"type":"gce_forwarding_rule"},"severity":"ERROR","timestamp":"2023-09-06T14:08:18.041559Z"}

Future state

Org sink https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_organization_sink

Project sink https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_project_sink
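A minimal org-level sketch for the future state, assuming the hypothetical Pub/Sub topic above and a var.org_id (neither is defined in this repo yet); the sink's writer identity needs publish rights on the topic:

# Sketch only - org ID, sink name, and filter are placeholders.
resource "google_logging_organization_sink" "org_to_pubsub" {
  name             = "org-splunk-sink"
  org_id           = var.org_id
  include_children = true

  # Pub/Sub topic sketched earlier in this issue.
  destination = "pubsub.googleapis.com/projects/${var.audit_project_id}/topics/org-log-export"

  # Start broad, tighten once Splunk ingestion is validated.
  filter = "severity >= INFO"
}

# Allow the sink's generated service account to publish to the topic.
resource "google_pubsub_topic_iam_member" "org_sink_publisher" {
  project = var.audit_project_id
  topic   = "org-log-export"
  role    = "roles/pubsub.publisher"
  member  = google_logging_organization_sink.org_to_pubsub.writer_identity
}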

obriensystems commented 1 year ago

A good example of 30 minutes of bug chasing (nothing related to actual coding, dev, or devops) - just a typo. Hint: it was not an issue with any of the actual Terraform/YAML code.

Screenshot 2023-10-15 at 21 12 28 Screenshot 2023-10-15 at 21 12 55

fixed with a single file change

obriensystems commented 1 year ago

Applying a log sink on the prod project to start, with an existing bucket as the target - will switch to Pub/Sub later.

Screenshot 2023-10-15 at 21 52 02 Screenshot 2023-10-15 at 21 52 34

preview logs from the router pane

Screenshot 2023-10-15 at 21 57 50

Step #3 - "tf plan": Terraform will perform the following actions:
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.project-level-log-sink.google_logging_project_sink.my-sink will be created
Step #3 - "tf plan":   + resource "google_logging_project_sink" "my-sink" {
Step #3 - "tf plan":       + destination            = "logging.googleapis.com/projects/tzpe-tlz-audittlz-tlz/locations/northamerica-northeast1/buckets/20231015tlz"
Step #3 - "tf plan":       + filter                 = "resource.type = gce_instance AND severity >= INFO"
Step #3 - "tf plan":       + id                     = (known after apply)
Step #3 - "tf plan":       + name                   = "20231015-sink"
Step #3 - "tf plan":       + project                = "tzpe-tlz-tlzprod-host4"
Step #3 - "tf plan":       + unique_writer_identity = true
Step #3 - "tf plan":       + writer_identity        = (known after apply)
Step #3 - "tf plan": 
Step #3 - "tf plan":       + bigquery_options {
Step #3 - "tf plan":           + use_partitioned_tables = (known after apply)
Step #3 - "tf plan":         }
Step #3 - "tf plan":     }
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.service_accounts.data.template_file.keys["sa"] will be read during apply
Step #3 - "tf plan":   # (config refers to values not yet known)
Step #3 - "tf plan":  <= data "template_file" "keys"  {
Step #3 - "tf plan":       ~ id       = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" -> (known after apply)
Step #3 - "tf plan":       + rendered = (known after apply)
Step #3 - "tf plan":         # (2 unchanged attributes hidden)
Step #3 - "tf plan":     }
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.net-host-prj.module.project.google_project.project will be updated in-place
Step #3 - "tf plan":   ~ resource "google_project" "project" {
Step #3 - "tf plan":         id                  = "projects/tzpe-tlz-tlzprod-host4"
Step #3 - "tf plan":       ~ labels              = {
Step #3 - "tf plan":           - "date_modified" = "2023-10-16"
Step #3 - "tf plan":         } -> (known after apply)
Step #3 - "tf plan":         name                = "TzPe-tlz-tlzprod-host4"
Step #3 - "tf plan":         # (5 unchanged attributes hidden)
Step #3 - "tf plan":     }
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.net-host-prj.module.network["tlzprod-svpc"].module.subnets["prsubnet02"].google_compute_subnetwork.subnetwork will be updated in-place
Step #3 - "tf plan":   ~ resource "google_compute_subnetwork" "subnetwork" {
Step #3 - "tf plan":         id                         = "projects/tzpe-tlz-tlzprod-host4/regions/northamerica-northeast1/subnetworks/tzpecnr-prsubnet02-host4-snet"
Step #3 - "tf plan":         name                       = "tzpecnr-prsubnet02-host4-snet"
Step #3 - "tf plan":         # (13 unchanged attributes hidden)
Step #3 - "tf plan": 
Step #3 - "tf plan":       ~ log_config {
Step #3 - "tf plan":           - metadata             = "EXCLUDE_ALL_METADATA" -> null
Step #3 - "tf plan":             # (4 unchanged attributes hidden)
Step #3 - "tf plan":         }
Step #3 - "tf plan":     }
Step #3 - "tf plan": 
Step #3 - "tf plan": Plan: 1 to add, 2 to change, 0 to destroy.

module.service_accounts.data.template_file.keys["sa"]: Read complete after 0s [id=e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855]
module.net-host-prj.module.network["tlzprod-svpc"].module.subnets["prsubnet02"].google_compute_subnetwork.subnetwork: Modifying... [id=projects/tzpe-tlz-tlzprod-host4/regions/northamerica-northeast1/subnetworks/tzpecnr-prsubnet02-host4-snet]
module.net-host-prj.module.network["tlzprod-svpc"].module.subnets["prsubnet02"].google_compute_subnetwork.subnetwork: Modifications complete after 1s [id=projects/tzpe-tlz-tlzprod-host4/regions/northamerica-northeast1/subnetworks/tzpecnr-prsubnet02-host4-snet]
module.project-level-log-sink.google_logging_project_sink.my-sink: Creating...
module.project-level-log-sink.google_logging_project_sink.my-sink: Creation complete after 2s [id=projects/tzpe-tlz-tlzprod-host4/sinks/20231015-sink]

Fixed the hardcoded audit project to pick up the prod shared VPC project, removed the GCE-specific log filter, and moved the bucket from audit to prod. Still pending: code to create the bucket.

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/storage_bucket
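The pending bucket creation could look roughly like this (bucket name, retention, and lifecycle are placeholders to be aligned with the LZ log retention policy, not the final module):

# Sketch only - reuses the existing var.project_id / var.region1 conventions.
resource "google_storage_bucket" "prod_log_sink_gcs" {
  name                        = "20231015-prod-sink-gcs"
  project                     = var.project_id
  location                    = var.region1
  uniform_bucket_level_access = true
  force_destroy               = false

  lifecycle_rule {
    condition {
      age = 30 # days
    }
    action {
      type = "Delete"
    }
  }
}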

Screenshot 2023-10-15 at 22 20 43

We can then target Splunk.

Screenshot 2023-10-15 at 21 58 58
obriensystems commented 1 year ago

Logging bucket vs cloud storage bucket https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_project_bucket_config

Screenshot 2023-10-15 at 22 43 37
resource "google_logging_project_bucket_config" "prod-log-sink-bucket" {
    project          = var.project_id
    location         = var.region1
    retention_days   = 30
    #enable_analytics = true # N/A yet
    bucket_id        = var.bucket_name
}

Step #4 - "tf apply": module.project-level-log-sink.google_logging_project_bucket_config.analytics-enabled-bucket: Creating...
Step #4 - "tf apply": module.project-level-log-sink.google_logging_project_bucket_config.analytics-enabled-bucket: Creation complete after 1s [id=projects/tzpe-tlz-tlzprod-host4/locations/northamerica-northeast1/buckets/20231015-prod-sink]

Redeploying with renamed sinks and buckets. Note: the log storage bucket does not get removed if the variable name is also changed - https://console.cloud.google.com/logs/storage?project=tzpe-tlz-tlzprod-host4&supportedpurview=project

obriensystems commented 1 year ago

Create a GCS bucket for the 2nd GCS log sink destination and remove the filter: https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/storage_bucket

Screenshot 2023-10-15 at 23 35 38 Screenshot 2023-10-15 at 23 36 06 Screenshot 2023-10-15 at 23 36 29

Logs take up to an hour to show - do some VM start/stop cycles to generate logs first.

Screenshot 2023-10-15 at 23 38 12 Screenshot 2023-10-15 at 23 37 32

preview logs

Screenshot 2023-10-15 at 23 39 04
obriensystems commented 1 year ago

PR 1 of 2-3

root_@cloudshell:~/lz-tls/_lz2/_upsource/pbmm-on-gcp-onboarding (lz-tls)$ git status
On branch 318-log-sink-alerting
Your branch is up to date with 'origin/318-log-sink-alerting'.

Changes to be committed:
  (use "git restore --staged <file>..." to unstage)
        modified:   environments/prod/main.tf
        new file:   environments/prod/prod-logging.auto.tfvars
        modified:   environments/prod/variables.tf
        new file:   modules/23-logging/main.tf
        new file:   modules/23-logging/outputs.tf
        new file:   modules/23-logging/variables.tf
        new file:   modules/24-gcs-bucket/main.tf
        new file:   modules/24-gcs-bucket/outputs.tf
        new file:   modules/24-gcs-bucket/variables.tf

root_@cloudshell:~/lz-tls/_lz2/_upsource/pbmm-on-gcp-onboarding (lz-tls)$ git push origin 318-log-sink-alerting
Username for 'https://github.com': obriensystems
Password for 'https://obriensystems@github.com': 
Enumerating objects: 20, done.
Counting objects: 100% (20/20), done.
Delta compression using up to 4 threads
Compressing objects: 100% (14/14), done.
Writing objects: 100% (14/14), 3.11 KiB | 1.55 MiB/s, done.
Total 14 (delta 9), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (9/9), completed with 6 local objects.
To https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding.git
   2caef98..c6edbf6  318-log-sink-alerting -> 318-log-sink-alerting

https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/pull/334

Buckets coming up

Screenshot 2023-11-07 at 08 12 13
obriensystems commented 1 year ago

Review the KCC version of our logging project and its sinks (61 GB): https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/blob/main/solutions/client-landing-zone/logging-project/cloud-logging-bucket.yaml

Screenshot 2023-10-16 at 10 21 40
obriensystems commented 1 year ago

branches in use

Revisit requirements and dev environment

Folder-level log sink experimentation using gcloud instead of Terraform, for throughput (a Terraform sketch for comparison follows).
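For reference, the Terraform equivalent of that folder-level experiment would be roughly the following (folder ID, destination bucket, and filter are hypothetical placeholders):

# Sketch only - hypothetical folder ID and destination log bucket.
resource "google_logging_folder_sink" "folder_audit_sink" {
  name             = "folder-audit-sink"
  folder           = "folders/000000000000"
  include_children = true

  destination = "logging.googleapis.com/projects/${var.project_id}/locations/${var.region1}/buckets/folder-audit-bucket"
  filter      = "severity >= INFO"
}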

References

https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/issues/634 https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/issues/446

requirements

obriensystems commented 1 year ago

terraform.landing.systems buckets created via Terraform are up

Screenshot 2023-11-07 at 08 11 35

Storage bucket and logging bucket created for 2 routers in bigquery-ol at the org scope, 20231106 16:30

Screenshot 2023-11-06 at 16 43 57

1705

Screenshot 2023-11-06 at 17 05 32

GCS entries up later

Screenshot 2023-11-07 at 08 50 12

sink details

Screenshot 2023-11-07 at 09 07 55 Screenshot 2023-11-07 at 09 08 23 Screenshot 2023-11-07 at 09 08 44

comparing

Screenshot 2023-11-07 at 09 10 26 Screenshot 2023-11-07 at 09 10 46 Screenshot 2023-11-07 at 09 11 40

permissions

Screenshot 2023-11-07 at 09 18 12

busted org

Screenshot 2023-11-07 at 09 17 59

working org

Screenshot 2023-11-07 at 10 17 40
obriensystems commented 1 year ago

Reviewing the results of https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/pull/333/files#diff-2f2b3d2889b990647e43f24f860b0d08898940ea273d175ae257fed4339431f7

Noticed that the GCS sink is routing to a Cloud Logging log bucket (via the Log Router) instead of a GCS storage bucket - hence why Log Storage is filling up but the GCS bucket is not.

log router

Screenshot 2023-11-07 at 08 26 52

prod-log-sink

Screenshot 2023-11-07 at 08 27 22

prod-log-gcs-sink

Screenshot 2023-11-07 at 08 27 40

log storage

Screenshot 2023-11-07 at 08 28 05

prod-sink-gcs

Screenshot 2023-11-07 at 08 29 13

sink details - GCS

Screenshot 2023-11-07 at 09 03 44

Sink details - log storage

Screenshot 2023-11-07 at 09 04 10

triage switch

gcs_bucket_name = "20231015-prod-sink-gcs"

resource "google_logging_project_sink" "prod-log-sink-to-gcs-bucket" {
  name = var.gcs_sink_name
  project = var.project_id

  # Can export to pubsub, cloud storage, bigquery, log bucket, or another project
  #destination = "pubsub.googleapis.com/projects/my-project/topics/instance-activity"
  #destination = "logging.googleapis.com/projects/${var.project_id}/locations/${var.region1}/buckets/${var.gcs_bucket_name}"
  destination = "storage.googleapis.com/${var.gcs_bucket_name}" 

resource "google_logging_project_sink" "prod-log-sink-to-log-bucket" {
  name = var.log_sink_name
  project = var.project_id

  # Can export to pubsub, cloud storage, bigquery, log bucket, or another project
  #destination = "pubsub.googleapis.com/projects/my-project/topics/instance-activity"
  destination = "logging.googleapis.com/projects/${var.project_id}/locations/${var.region1}/buckets/${var.log_bucket_name}"

  #destination = "storage.googleapis.com/[GCS_BUCKET]"
  #destination = "bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET]"
  #destination = "pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID]"
  #destination = "logging.googleapis.com/projects/[PROJECT_ID]/locations/global/buckets/[BUCKET_ID]"
  #destination = "logging.googleapis.com/projects/[PROJECT_ID]"

  # Example: Log all WARN or higher severity messages relating to instances
  #filter = "resource.type = gce_instance AND severity >= INFO"
  # filter only by log severity - remember filter is optional
  filter = "severity >= INFO"

  # Use a unique writer (creates a unique service account used for writing)
  unique_writer_identity = true
}

and turn off filter

Before that - found the issue - IAM permissions on the SA
{
errorGroups: [1]
insertId: "1uvlr6ebn7"
labels: {7}
logName: "projects/tzpe-tlz-tlzprod-host4/logs/logging.googleapis.com%2Fsink_error"
receiveTimestamp: "2023-11-02T16:10:31.566259282Z"
resource: {2}
severity: "ERROR"
textPayload: "Cloud Logging sink configuration error in tzpe-tlz-tlzprod-host4, sink 20231015-prod-gcs-sink: bucket_permission_denied ()"
timestamp: "2023-11-02T16:10:30.582419156Z"
Screenshot 2023-11-07 at 09 15 50

Missing SA permissions on the Cloud Storage Bucket

see https://cloud.google.com/logging/docs/export/configure_export_v2#gcloud_3

Missing principal on the bucket:

service-951469276805@gcp-sa-logging.iam.gserviceaccount.com (Cloud Logging Service Account for project 951469276805) - role: Storage Legacy Bucket Owner

Get the Cloud Logging service account by determining the project number for the prod4 (tzpe-tlz-tlzprod-host4) project.

see https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/blob/gh446-hub/solutions/setup.sh#L221

root_@cloudshell:~/lz-tls/_lz2/pbmm-on-gcp-onboarding (lz-tls)$ PROJECT_ID=tzpe-tlz-tlzprod-host4
root_@cloudshell:~/lz-tls/_lz2/pbmm-on-gcp-onboarding (lz-tls)$ KCC_PROJECT_NUMBER=$(gcloud projects list --filter="${PROJECT_ID}" '--format=value(PROJECT_NUMBER)')
root_@cloudshell:~/lz-tls/_lz2/pbmm-on-gcp-onboarding (lz-tls)$ echo $KCC_PROJECT_NUMBER
604049845861

How does the Cloud Logging service account get auto-assigned to the GCS bucket? https://cloud.google.com/logging/docs/buckets#which_service_accounts_are_routing_logs_to_my_bucket - looking for "Logs Bucket Writer" among the IAM role bindings: none there on either org, even though "ol" has the SA.

I think we need "Storage Legacy Bucket Owner"

Raising a separate issue: https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/issues/337

SA is

root_@cloudshell:~/lz-tls/_lz2/pbmm-on-gcp-onboarding (lz-tls)$ gcloud logging settings describe --project=$PROJECT_ID
kmsServiceAccountId: cmek-p604049845861@gcp-sa-logging.iam.gserviceaccount.com
loggingServiceAccountId: service-604049845861@gcp-sa-logging.iam.gserviceaccount.com
name: projects/tzpe-tlz-tlzprod-host4/settings
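A hedged sketch of the binding that appears to be missing - granting the GCS sink's writer identity object-create rights on the destination bucket (roles/storage.objectCreator per the configure_export guidance; swap in Storage Legacy Bucket Owner if that is the LZ standard):

# Sketch only - grants the sink's generated SA write access on the GCS destination.
resource "google_storage_bucket_iam_member" "gcs_sink_writer" {
  bucket = var.gcs_bucket_name # e.g. "20231015-prod-sink-gcs"
  role   = "roles/storage.objectCreator"
  member = google_logging_project_sink.prod-log-sink-to-gcs-bucket.writer_identity
}

If a binding like this resolves the bucket_permission_denied error, unique_writer_identity could stay true rather than being flipped off as tried in the next comment.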

obriensystems commented 1 year ago

Check the existing config flag set to true before we swap in the service account - see if it kicks in the default SA; see https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_project_sink

  # Use a unique writer (creates a unique service account used for writing)
  unique_writer_identity = true

change

root_@cloudshell:~/lz-tls/_lz2/pbmm-on-gcp-onboarding (lz-tls)$ git diff
diff --git a/modules/23-logging/main.tf b/modules/23-logging/main.tf
index 943f4b0..8472e25 100644
--- a/modules/23-logging/main.tf
+++ b/modules/23-logging/main.tf
@@ -69,7 +69,7 @@ resource "google_logging_project_sink" "prod-log-sink-to-gcs-bucket" {
   #filter = "severity >= INFO"

   # Use a unique writer (creates a unique service account used for writing)
-  unique_writer_identity = true
+  #unique_writer_identity = true

results - new SA serviceAccount:service-604049845861@gcp-sa-logging.iam.gserviceaccount.com

Step #3 - "tf plan": Terraform will perform the following actions:
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.project-level-log-sink.google_logging_project_sink.prod-log-sink-to-gcs-bucket must be replaced
Step #3 - "tf plan": -/+ resource "google_logging_project_sink" "prod-log-sink-to-gcs-bucket" {
Step #3 - "tf plan":       - disabled               = false -> null
Step #3 - "tf plan":       ~ id                     = "projects/tzpe-tlz-tlzprod-host4/sinks/20231015-prod-gcs-sink" -> (known after apply)
Step #3 - "tf plan":         name                   = "20231015-prod-gcs-sink"
Step #3 - "tf plan":       ~ unique_writer_identity = true -> false # forces replacement
Step #3 - "tf plan":       ~ writer_identity        = "serviceAccount:service-604049845861@gcp-sa-logging.iam.gserviceaccount.com" -> (known after apply)
Step #3 - "tf plan":         # (2 unchanged attributes hidden)
Step #3 - "tf plan": 
Step #3 - "tf plan":       + bigquery_options {
Step #3 - "tf plan":           + use_partitioned_tables = (known after apply)
Step #3 - "tf plan":         }
Step #3 - "tf plan":     }
Step #3 - "tf plan": 
Step #3 - "tf plan":   # module.service_accounts.data.template_file.keys["sa"] will be read during apply
Step #3 - "tf plan":   # (config refers to values not yet known)
Step #3 - "tf plan":  <= data "template_file" "keys"  {
Step #3 - "tf plan":       ~ id       = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" -> (known after apply)
Step #3 - "tf plan":       + rendered = (known after apply)
Step #3 - "tf plan":         # (2 unchanged attributes hidden)
Step #3 - "tf plan":     }
Step #3 - "tf plan": 

will need to wait a couple hours

Screenshot 2023-11-07 at 11 19 05

1320 - 2h

Screenshot 2023-11-07 at 13 20 14
fmichaelobrien commented 1 year ago

drive/shadow to https://github.com/GoogleCloudPlatform/pubsec-declarative-toolkit/issues/651

fmichaelobrien commented 7 months ago

20240406: Closing this issue during the retrofit/rebase of this TEF V1-based/modified repo to TEF V4 standards. This issue may participate in the LZ refactor after the rebase. Query all issues related to the older V1 version via the tag: https://github.com/GoogleCloudPlatform/pbmm-on-gcp-onboarding/labels/2024-pre-tef-v4