GoogleCloudPlatform / pubsub2inbox

Apache License 2.0

Pubsub2Inbox


Pubsub2Inbox is a versatile, multi-purpose tool to handle Pub/Sub messages and turn them into email, API calls, GCS objects, files or almost anything. It's based on an extendable framework consisting of input and output processors. Input processors can enrich incoming messages with details (for example, fetching the budget from the Cloud Billing Budgets API, or calling different GCP APIs or third-party services). Multiple input and output processors can be chained together in a pipeline.

Pubsub2Inbox is written in Python 3.8+ and can easily be deployed as a Cloud Function (v1/v2) or as a Cloud Run function. To guard credentials and other sensitive information, the tool can fetch its YAML configuration from Google Cloud Secret Manager.

The tool also supports templating of emails, messages and other parameters through Jinja2 templating, with additional filters and functions added.
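As a generic illustration of what Jinja2 templating looks like (this snippet uses plain Jinja2 with made-up variable names, not Pubsub2Inbox's own variables or its additional filters):

```python
# Illustrative only: plain Jinja2 rendering, the templating engine Pubsub2Inbox
# builds on. The variables (project, percent) are invented for this example;
# the variables actually available depend on the configured input processors.
from jinja2 import Environment

env = Environment()
template = env.from_string(
    "Subject: Budget alert: {{ project }} is at {{ percent }}% of its budget")
print(template.render(project="my-project", percent=90))
# Prints: Subject: Budget alert: my-project is at 90% of its budget
```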

Please note: You cannot connect to SMTP port 25 from GCP. Use alternative ports 465 or 587, or connect via Serverless VPC Connector to your own mailservers.
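As a rough sketch of what pinning the SMTP transport to an allowed port might look like (the key names below are an approximation; verify them against the mail output processor's documentation before use):

```yaml
# Sketch only - check output.md for the authoritative key names.
outputs:
  - type: mail
    transports:
      - type: smtp
        host: smtp.example.com
        port: 587        # port 25 is blocked on GCP; use 465 or 587 instead
```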

Also check out Json2Pubsub, a complementary any-webhook-to-Pub/Sub tool!

Out of the box experience

Out of the box, you'll have the following functionality available as examples:

| Title | Example use cases | Samples |
|---|---|---|
| Vertex AI | A Slack bot using Vertex AI Generative AI models. | Vertex AI Slack bot (also see blog post)<br>Multi-modal Gemini Slack bot |
| Budget alerts | Get an email if a project's budget exceeds a certain limit. For more information, see How to set up programmatic notifications from billing budgets. | Budget alerts |
| Cloud Security Command Center | Send emails when a new security finding is made (see how to set up finding notifications from SCC), or create new findings from any Pub/Sub message. | Email notifications of findings<br>Create findings from Cloud IDS<br>Create custom findings |
| Containers | Synchronize container images from Artifactory to Artifact Registry. | Artifactory to Artifact Registry |
| Container Analysis | Create GitHub issues automatically on new vulnerabilities in containers. | Create issue on Github for vulnerabilities<br>Slack bot that posts new vulnerabilities<br>Full Github example |
| Cloud Storage | When a new report arrives in a bucket, send it out as an email attachment. Or copy files to a backup bucket as soon as they arrive. See: How to set up Cloud Storage notifications. | Cloud Storage notifications<br>Cloud Storage backup copier |
| BigQuery | Run BigQuery queries on a schedule, turn the results into CSV or spreadsheets, and send them out as email attachments. | BigQuery queries |
| Recommendations | Generate recommendations and insights for project owners on a scheduled basis. Uses the Recommender API. | Recommendations and Insights reports<br>Example with attached spreadsheet<br>Example with GCS and BigQuery output |
| Compute Engine | Start and stop instances, detach and attach disks, patch load balancer backend services. | Compute Engine instance control |
| Cloud Monitoring | Send alerts from Cloud Monitoring via your own SMTP servers, or use an unsupported messaging platform. Or run Cloud Monitoring MQL queries and send the results. | Cloud Monitoring alerts<br>Service account usage reporting using Cloud Monitoring and Cloud Asset Inventory<br>OpsGenie alert integration |
| Cloud Logging | Query Cloud Run job logs after execution and email them. | Cloud Run job logs |
| Cloud Asset Inventory | Use Cloud Asset Inventory to fetch resources organization-wide. | Fetch all service accounts from CAI |
| Cloud Identity | Fetch groups or memberships, or change group settings. For example, build a report of members in a group for review and send it out via email. | Cloud Identity groups<br>Another example<br>Groups that allow external members<br>Example of Directory API<br>Update group default settings on creation |
| Cloud DNS | Add or remove records based on Pub/Sub messages. | Add DNS entries |
| Resource Manager | List and search for GCP projects. | GCP projects |
| Secret Manager | Fetch secrets from Secret Manager. | Retrieve secret |
| Scripting | Run any binary or shell script and parse the output (supports JSON, YAML, CSV, etc.). | Shell processor |
| Utilities | Download files using HTTP, FTP or SFTP. Clone Git repositories. | Utilities |
| Transcoder | Transcode video and audio using the Transcoder API. | Transcoding a video |
| Messaging | Send messages to Google Chat, or send SMS messages. | Send SMS messages using Twilio<br>Cloud Deploy notifications to Google Chat (also see the blog post)<br>GitHub issues to Google Chat |
| JSON | Generic JSON parser. | Generic JSON processing |

Input processors

For the full list of input processors, including their required permissions and their input and output parameters, see PROCESSORS.md.

Please note that the input processors have some IAM requirements to be able to pull information from GCP; these are documented per processor in PROCESSORS.md.

Output processors

For the full list of output processors, see output.md.

Please note that the output processors also have some IAM requirements for the GCP resources they interact with.

Configuring Pubsub2Inbox

Pipeline-based configuration

Pubsub2Inbox is configured through a YAML file (for examples, see the examples/ directory).

The YAML file is structured around a small set of top-level keys that define the processing pipeline.

For an example of a modern pipeline, see the shell script example or the test configs.
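As a rough sketch of the pipeline format (the processor and output names below are illustrative; verify them against PROCESSORS.md and the examples/ directory):

```yaml
# Sketch of a pipeline-style configuration: a chain of input and output
# processors executed in order. Check the examples/ directory for
# authoritative key names.
pipeline:
  - type: processor.genericjson   # parse the incoming Pub/Sub message as JSON
  - type: output.logger           # write the result to the log
```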

Legacy configuration

For legacy configuration details, see LEGACY.

Deploying as Cloud Function

Deploying via Terraform

A sample Terraform module is provided in main.tf, variables.tf and outputs.tf; see variables.tf for the parameters to pass when using it as a module.

Deploying manually

For the manual deployment option, see LEGACY.

Deploying via Cloud Run

Prebuilt image

A prebuilt container image is available on this page. The container is signed, and the signature can be verified with cosign, for example:

cosign verify --key container-signature.pub ghcr.io/googlecloudplatform/pubsub2inbox:latest

Building the container

A Dockerfile has been provided for building the container. You can build the image locally and push it to, for example, Artifact Registry.

docker build -t europe-west4-docker.pkg.dev/$PROJECT_ID/pubsub2inbox/pubsub2inbox . 
docker push europe-west4-docker.pkg.dev/$PROJECT_ID/pubsub2inbox/pubsub2inbox

Deploying via Terraform

The provided Terraform scripts can deploy the code as a Cloud Function or on Cloud Run. To enable Cloud Run deployment, build and push the image and set the cloud_run and cloud_run_container parameters (see the parameter descriptions above).

This is a simple example of deploying the function straight from the repository:

locals {
  project_id    = "<YOUR-PROJECT-ID>"
  region        = "europe-west1"
  helper_bucket = true
}

module "pubsub-topic" {
  source     = "github.com/GoogleCloudPlatform/cloud-foundation-fabric//modules/pubsub"
  project_id = local.project_id
  name       = "pubsub-example-1"
  iam = {}
}

# This optional helper bucket is used to store resend objects for example
module "helper-bucket" {
  count      = local.helper_bucket ? 1 : 0
  source     = "github.com/GoogleCloudPlatform/cloud-foundation-fabric//modules/gcs"
  project_id = local.project_id
  name       = format("pubsub2inbox-helper-%s", module.pubsub2inbox.name)
}

module "pubsub2inbox" {
  source = "github.com/GoogleCloudPlatform/pubsub2inbox"

  project_id = local.project_id
  region     = local.region

  function_name = "function-example-1"
  pubsub_topic  = module.pubsub-topic.id

  config_file     = "<YOUR-CONFIGURATION-FILE>.yaml"
  # Downloads the release from Github
  use_local_files = false

  bucket_name        = format("pubsub2inbox-source-%s", module.pubsub2inbox.name)
  bucket_location    = local.region
  helper_bucket_name = local.helper_bucket ? module.helper-bucket[0].bucket.name : ""

  cloud_functions_v2 = true

  # Add additional permissions for the service account here
  function_roles = []
}

Generating documentation

Run the command:

# make docs

Running tests

Run the command:

# make test

To test against a real cloud project, set PROJECT_ID environment variable.