Closed nj1973 closed 4 months ago
For info, I've tested up to `--random-row-batch-size=50000` on Oracle, BigQuery and PostgreSQL, and these all complete successfully for standard data types.
I found and fixed a bug in the previous `get_max_in_list_size()` code.
When validating with `--use-random-row --random-row-batch-size=1001`, the validation fails with:
For this specific error we can build on the recent enhancement from https://github.com/GoogleCloudPlatform/professional-services-data-validator/issues/1146 and add a 1000-item IN list limit for Oracle connections.
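As a rough illustration of the idea (not the project's actual implementation), Oracle raises ORA-01795 when an IN list has more than 1000 literals, so a large batch of random-row keys could be split into chunks of at most 1000 and OR'd together. The helper name and predicate shape below are hypothetical:

```python
def build_in_predicate(column, values, max_in_list_size=1000):
    """Hypothetical sketch: split `values` into chunks no larger than
    `max_in_list_size` and join one IN clause per chunk with OR, so the
    resulting predicate stays under Oracle's 1000-literal IN list limit."""
    chunks = [
        values[i:i + max_in_list_size]
        for i in range(0, len(values), max_in_list_size)
    ]
    return " OR ".join(
        f"{column} IN ({', '.join(str(v) for v in chunk)})"
        for chunk in chunks
    )


# With 1001 keys the predicate is split into two IN clauses:
# one holding 1000 values and one holding the remaining value.
predicate = build_in_predicate("id", list(range(1001)))
```

A `--random-row-batch-size=1001` run would then produce two IN clauses instead of one over-long list, which is why a per-engine limit like this sidesteps the Oracle error.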
But we should also think about why this option was added and what sensible limits we should build in.