GoogleCloudPlatform / dlp-dataflow-deidentification

Multi Cloud Data Tokenization Solution By Using Dataflow and Cloud DLP

Command to run Dataflow in deploy-data-tokeninzation-solution.sh returns error #75

Open · wicaksana opened 3 years ago

wicaksana commented 3 years ago

When I run the script, the gcloud dataflow jobs run command fails with the following error:

+ jobId=demo-dlp-deid-pipeline-20210908-064259
+ gcloud dataflow jobs run demo-dlp-deid-pipeline-20210908-064259 --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery --parameters --region=us-central1,inputFilePattern=gs://<project_id>-demo2-demo-data/CCRecords_1564602825.csv,dlpProjectId=<project_id>,deidentifyTemplateName=projects/<project_id>/deidentifyTemplates/dlp-demo-deid-latest-1631083353071,inspectTemplateName=projects/<project_id>/inspectTemplates/dlp-demo-inspect-latest-1631083353071,datasetName=demo_dataset,batchSize=500
ERROR: (gcloud.dataflow.jobs.run) argument --parameters: expected one argument
Usage: gcloud dataflow jobs run JOB_NAME --gcs-location=GCS_LOCATION [optional flags]
  optional flags may be  --additional-experiments | --dataflow-kms-key |
                         --disable-public-ips | --enable-streaming-engine |
                         --help | --max-workers | --network | --num-workers |
                         --parameters | --region | --service-account-email |
                         --staging-location | --subnetwork |
                         --worker-machine-type | --worker-region |
                         --worker-zone | --zone

The cause seems to be flag ordering: in the script, the value handed to --parameters begins with --region=us-central1, so gcloud parses that token as the --region flag and --parameters is left with no argument. Moving --region out as its own flag and quoting the parameter string fixes it. Corrected command that works for me:

gcloud dataflow jobs run ${jobId} --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery --region=us-central1 --parameters "inputFilePattern=gs://${DATA_STORAGE_BUCKET}/CCRecords_1564602825.csv,dlpProjectId=${PROJECT_ID},deidentifyTemplateName=${DEID_TEMPLATE_NAME},inspectTemplateName=${INSPECT_TEMPLATE_NAME},datasetName=${BQ_DATASET_NAME},batchSize=500"
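
For reference, a minimal end-to-end sketch of the corrected invocation with the variables spelled out. The PROJECT_ID and the derived bucket, template, and dataset names below are placeholders taken from the values in my log above; substitute your own:

#!/usr/bin/env bash
set -euo pipefail

# Placeholder values; replace with your own project's IDs and template names.
PROJECT_ID="my-project"
DATA_STORAGE_BUCKET="${PROJECT_ID}-demo2-demo-data"
DEID_TEMPLATE_NAME="projects/${PROJECT_ID}/deidentifyTemplates/dlp-demo-deid-latest-1631083353071"
INSPECT_TEMPLATE_NAME="projects/${PROJECT_ID}/inspectTemplates/dlp-demo-inspect-latest-1631083353071"
BQ_DATASET_NAME="demo_dataset"
jobId="demo-dlp-deid-pipeline-$(date +%Y%m%d-%H%M%S)"

# --region is passed as its own flag, and the --parameters value is one
# quoted string, so gcloud cannot mistake the parameter list for a flag.
gcloud dataflow jobs run "${jobId}" \
  --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery \
  --region=us-central1 \
  --parameters "inputFilePattern=gs://${DATA_STORAGE_BUCKET}/CCRecords_1564602825.csv,dlpProjectId=${PROJECT_ID},deidentifyTemplateName=${DEID_TEMPLATE_NAME},inspectTemplateName=${INSPECT_TEMPLATE_NAME},datasetName=${BQ_DATASET_NAME},batchSize=500"

Quoting the --parameters value keeps the comma-separated list as a single shell token, which is the one argument gcloud expects.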
mehran702 commented 3 years ago

Can confirm, same issue here.