Tested with an Oracle CLOB value larger than 100MB. Offload fails with:
```
Load staged data
Fri Dec 8 14:42:57 2023
BigQuery SQL: INSERT INTO `project-01.owner.tab_clob`
SELECT `ID` AS `ID`
, `DATA` AS `DATA`
, PARSE_DATETIME('%F %H:%M:%E*S', `LOAD_DATE`) AS `LOAD_DATE`
FROM `project-01.owner_load.tab_clob`
Step time: 0:00:15
Unhandled exception in offload_table(): 400 Cannot query rows larger than 100MB limit.
```
We should add a LENGTH check to the "Validate staged data" step for any BLOB/CLOB columns.
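Such a check could run against the load table before the final INSERT. A minimal sketch in BigQuery SQL, reusing the table and column names from the log above; the threshold is an assumption, not a value from the product:

```sql
-- Sketch only: flag staged rows whose LOB-sourced column alone already
-- reaches BigQuery's 100MB row limit. A real check would likely use a
-- safety margin below the limit, since other columns add to row size.
SELECT COUNT(*) AS oversize_rows
FROM `project-01.owner_load.tab_clob`
WHERE BYTE_LENGTH(`DATA`) > 100 * 1024 * 1024;
```

If `oversize_rows` is greater than zero, the step could abort with a clear message instead of surfacing the opaque 400 error above.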
DBA_TABLES.AVG_ROW_LEN does not reflect out-of-row LOB storage, which makes it tricky to warn the user at the start of an Offload.
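An up-front warning would therefore have to measure the LOBs themselves rather than rely on AVG_ROW_LEN. A hedged sketch in Oracle SQL; note that DBMS_LOB.GETLENGTH returns characters for a CLOB, so this only approximates the byte size:

```sql
-- Sketch only: find the largest LOB value in the source table before
-- the Offload starts. For a CLOB the result is in characters; a
-- multi-byte character set can make the byte size larger.
SELECT MAX(DBMS_LOB.GETLENGTH(DATA)) AS max_lob_chars
FROM owner.tab_clob;
```

A full scan like this may be too expensive for an up-front check on large tables, which is another argument for catching the problem at the "Validate staged data" step instead.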