Open robinmetral opened 8 months ago
Found my own workaround.
Since inserting this data via a Worker worked in the first place (there doesn't seem to be any statement length limit there), I built a Worker for batch-importing the data (8 MB). Here it is, in case anyone runs into the same issue:
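The actual Worker snippet isn't reproduced in this thread, so here is only a sketch of how such a batch import could look. Everything in it is an assumption: the `DB` binding name, the `articles (id, body)` table, the batch size of 50, and the `chunk` helper are all hypothetical, not robinmetral's code.

```javascript
// Hypothetical batch-import Worker (sketch, not the original workaround).
// Assumes a D1 binding named DB and a table `articles (id, body)`.

// Split an array into fixed-size batches so no single call grows too large.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) out.push(rows.slice(i, i + size));
  return out;
}

const worker = {
  async fetch(request, env) {
    // Expect a JSON array of rows in the request body, e.g. [{id, body}, ...].
    const rows = await request.json();

    // D1's batch() runs many bound statements in one round trip; binding
    // values keeps each SQL statement short even when a column holds a lot
    // of text, which sidesteps the "statement too long" problem.
    for (const batch of chunk(rows, 50)) {
      await env.DB.batch(
        batch.map((r) =>
          env.DB
            .prepare("INSERT INTO articles (id, body) VALUES (?1, ?2)")
            .bind(r.id, r.body)
        )
      );
    }
    return new Response(`Imported ${rows.length} rows`);
  },
};

// In a real Worker this would be `export default worker;`
```

The key design point is using prepared statements with bound parameters rather than interpolating the large text values into the SQL string itself.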
Which Cloudflare product(s) does this pertain to?
D1
What version(s) of the tool(s) are you using?
wrangler 3.17.1
What version of Node are you using?
19.9.0
What operating system are you using?
Linux (Ubuntu)
Describe the Bug
I'm trying to migrate a D1 DB using the legacy backend to the new backend, but I get an error from the Cloudflare API: `statement too long`. The DB has a single table with only ~150 rows, but some columns contain a lot of text, so I imagine that one or more rows must hit the API's statement length limit.

Is this expected? Is there another way to import larger rows into D1?
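To illustrate the suspicion above: in a SQL dump, each row becomes one `INSERT` statement, so a single row with a very large text column produces one very long statement. A quick, purely hypothetical way to find the offending statements in a dump is sketched below; the `;\n` split is naive (it would break on semicolons inside string literals), and the `LIMIT` value is an assumption, since the actual D1 limit isn't stated in this thread.

```javascript
// Hypothetical helper: flag statements in a SQL dump that exceed a length
// limit. Naive split on ";\n" -- fine for one-statement-per-line dumps,
// not a real SQL parser.
const LIMIT = 100_000; // assumed value; the real D1 limit is not documented here

function oversizedStatements(dump, limit = LIMIT) {
  return dump
    .split(";\n")
    .map((sql, index) => ({ index, length: sql.length }))
    .filter((s) => s.length > limit);
}
```

Running this over the dump would show whether a handful of large rows, rather than the row count, is what trips the limit.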
Please provide a link to a minimal reproduction
Not applicable
Please provide any relevant error logs