[REQUIRED] Step 2: Describe your configuration
Configuration values (redact info where appropriate):
BigQuery Dataset location->us
BigQuery Project ID->xxxxxxxxxxxxxxxxx
Collection path->xxxxxxxxxxxxxxxxxxx
Enable Wildcard Column field with Parent Firestore Document IDs (Optional)->false
Dataset ID->firestore_events
Table ID->events
BigQuery SQL table Time Partitioning option type (Optional)->NONE
BigQuery Time Partitioning column name (Optional)->Parameter not set
Firestore Document field name for BigQuery SQL Time Partitioning field option (Optional)->Parameter not set
BigQuery SQL Time Partitioning table schema field(column) type (Optional)->omit
BigQuery SQL table clustering (Optional)->Parameter not set
Maximum number of synced documents per second (Optional)->100
Backup Collection Name (Optional)->Parameter not set
Transform function URL (Optional)->Parameter not set
Use new query syntax for snapshots->yes
Exclude old data payloads (Optional)->yes
Use Collection Group query (Optional)->no
Cloud KMS key name (Optional)->Parameter not set
[REQUIRED] Step 3: Describe the problem
We cannot import existing documents that may be as large as 900 KB.
Steps to reproduce:
1. Start with the extension configured on some collection.
2. Reconfigure the extension, setting EXCLUDE_OLD_DATA to true.
3. While reconfiguring, confirm that DO_BACKFILL is no longer available.
4. Create a document that is 900 KB (see the sketch after this list).
5. Confirm the document does not sync and fails with the error "task size too large".
6. Run the fs-bq-import-collection script to try to import the existing data.
7. Confirm it fails with the error "Request Entity Too Large".
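For step 4, a minimal sketch of how we create a document close to 900 KB, assuming firebase-admin initialized with application default credentials; the collection path, document ID, and field names below are placeholders, not values from our actual setup:

```ts
// Sketch only: creates a single ~900 KB document so the extension's sync task
// exceeds its payload limit. Collection path and field names are placeholders.
import { applicationDefault, initializeApp } from "firebase-admin/app";
import { getFirestore } from "firebase-admin/firestore";

initializeApp({ credential: applicationDefault() });
const db = getFirestore();

async function createLargeDocument(): Promise<void> {
  // One string field of roughly 900 KB (Firestore's per-document limit is ~1 MiB).
  const payload = "x".repeat(900 * 1024);
  await db.collection("my_collection").doc("large-doc").set({
    payload,
    createdAt: new Date(),
  });
}

createLargeDocument().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

For step 6 we then run the import script (npx @firebaseextensions/fs-bq-import-collection, interactive mode) against the same collection, which is where the Request Entity Too Large error appears.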
During installation of the extension we cannot backfill existing data, because DO_BACKFILL is currently disabled by #2005.
Expected result
DO_BACKFILL is available during both installation and reconfiguration of the extension.
The fs-bq-import-collection script does not throw a "Request Entity Too Large" error when old data is ignored by setting the EXCLUDE_OLD_DATA flag to true.
Actual result
DO_BACKFILL is not available when the extension is installed, nor when it is reconfigured.
The fs-bq-import-collection script fails with "Request Entity Too Large".