MeltanoLabs / target-snowflake

Singer Target for the Snowflake cloud Data Warehouse
https://hub.meltano.com/loaders/target-snowflake--meltanolabs/

JSON "document is too large" error #275

Open menzenski opened 1 month ago

menzenski commented 1 month ago

sqlalchemy.exc.ProgrammingError: (snowflake.connector.errors.ProgrammingError) 100069 (22P02): 01b7c685-0002-1d43-0004-fb6a0dcbae1a: Error parsing JSON: document is too large, max size 16777216 bytes

Ran into this today and I don't know how to resolve it. I thought at first that this was related to our tap-mongodb source extractor (since a MongoDB BSON document also has a 16 MB maximum size), but some googling turned up this SO post, which suggests the error relates more generally to Snowflake loading data from a stage.
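For context, Snowflake caps any single VARIANT value at 16777216 bytes (16 MiB), which is the limit named in the error. A minimal pre-flight check, assuming you can inspect records before the target stages them (the function name and record shape here are illustrative, not part of target-snowflake):

```python
import json

# Snowflake rejects any single JSON value larger than this when parsing
# staged files (the 16777216-byte limit quoted in the error message).
MAX_VARIANT_BYTES = 16_777_216

def oversized_fields(record: dict) -> list[str]:
    """Return top-level keys whose serialized JSON exceeds Snowflake's limit.

    A rough check only: the target's staged-file encoding may differ
    slightly from json.dumps output.
    """
    return [
        key
        for key, value in record.items()
        if len(json.dumps(value, default=str).encode("utf-8")) > MAX_VARIANT_BYTES
    ]

# Example: flag a record with a huge embedded document.
record = {"_id": "abc123", "payload": {"blob": "x" * 20_000_000}}
print(oversized_fields(record))  # ['payload']
```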

edgarrmondragon commented 1 month ago

Someone shared that they were running into this same problem in a past Office Hours; it was the motivation for this short exploration of excluding large JSON values: https://github.com/MeltanoLabs/meltano-map-transform/pull/300.

One option folks could try to implement is adding ON_ERROR = CONTINUE to the COPY INTO statement that loads the staged files. Of course, ideas and PRs are welcome.
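For illustration, here is a minimal sketch of what a load with that option could look like, using the Snowflake Python connector directly rather than the target's internal SQL. The connection parameters, table, and stage names are all hypothetical, and note that ON_ERROR = CONTINUE silently skips the offending rows rather than failing the load:

```python
import snowflake.connector

# Hypothetical connection details -- substitute your own.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_warehouse",
    database="my_database",
    schema="my_schema",
)

# ON_ERROR = CONTINUE tells Snowflake to skip rows it cannot parse
# (e.g. documents over the 16 MiB VARIANT limit) instead of aborting
# the entire COPY. Skipped-row counts appear in the statement's results.
copy_sql = """
    COPY INTO my_table
    FROM @my_stage
    FILE_FORMAT = (TYPE = 'JSON')
    ON_ERROR = CONTINUE
"""

cur = conn.cursor()
try:
    cur.execute(copy_sql)
    for row in cur.fetchall():
        print(row)  # one row per staged file: status, rows parsed/loaded, errors seen
finally:
    cur.close()
    conn.close()
```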