databricks / databricks-sql-go

Golang database/sql driver for Databricks SQL.
Apache License 2.0

How to make transfer data faster while using databricks-sql-go #235

Open calebeaires opened 1 month ago

calebeaires commented 1 month ago

I am using the driver to run a data migration. When dealing with a table of 24 million rows and 9 columns, the performance is excellent when I fetch 10 thousand rows. When I increase the fetch size to 100 thousand rows, the transfer speed is still good. However, when fetching 1 million rows or more, the data transfer becomes very slow. A quick test made me think that the driver tries to fetch all the data in a single batch. Is there a way to improve this process?

This is the connection string:

"token:xxx@host:443/xxx-path?catalog=sample&database=big_table&useCloudFetch=true&maxRows=10000"

I've tried these two settings, but data transfer is still slow for big tables.

In forums, users suggest changing spark.driver.maxResultSize. Would that help?
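Independently of the driver settings, one way to keep each result set small on the server side is to page the query yourself with keyset pagination on a monotonically increasing column. This is a sketch under assumptions not stated in the thread: that the table has a numeric `id` column to page on; the table name and page size are placeholders:

```go
package main

import (
	"fmt"
)

// pageQuery builds one keyset-pagination page: rows whose id is greater
// than lastID, ordered by id, limited to pageSize rows. The table and
// the "id" column are hypothetical placeholders.
func pageQuery(table string, lastID int64, pageSize int) string {
	return fmt.Sprintf(
		"SELECT * FROM %s WHERE id > %d ORDER BY id LIMIT %d",
		table, lastID, pageSize)
}

func main() {
	// Sketch of the migration loop. With a real *sql.DB each iteration
	// produces a small, bounded result set instead of 24M rows at once:
	//
	//   lastID := int64(0)
	//   for {
	//       rows, err := db.Query(pageQuery("big_table", lastID, 100000))
	//       // ... scan rows, write them out, track the max id seen;
	//       // stop when a page comes back empty, else set lastID and loop.
	//   }
	fmt.Println(pageQuery("big_table", 0, 100000))
}
```

Keyset pagination avoids the deep-OFFSET cost of `LIMIT ... OFFSET ...` because each page is an indexed range scan starting after the last row already copied.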

kravets-levko commented 1 month ago

@calebeaires Considering you enabled CloudFetch, most likely your assumption is correct: the driver actually tries to download all the data into memory. That's what we discovered recently, and we are trying to fix it right now: databricks/databricks-sql-go#234. Try disabling CloudFetch; this helped some other users. Then you can play with maxRows to see if it has any effect in your case.
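Concretely, that suggestion amounts to flipping the two DSN parameters already used above (the maxRows value here is only an example to tune from):

"token:xxx@host:443/xxx-path?catalog=sample&database=big_table&useCloudFetch=false&maxRows=100000"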

calebeaires commented 1 month ago

@kravets-levko I've changed the setting to useCloudFetch=false, but I'm still getting errors. Sometimes this error comes out:


execution error: failed to execute query: unexpected operation state ERROR_STATE: Total size of serialized results of 382 tasks (4.0 GiB) is bigger than spark.driver.maxResultSize 4.0 GiB. Code: 108

Topic: Spark driver maxResultSize exceeds 4.0 GiB

Hope we find a solution!

kravets-levko commented 1 month ago

@calebeaires For this particular error, please reach out to an administrator of your workspace, or to Databricks support. It is related to your workspace configuration, and that's not something that can be handled in the library.

calebeaires commented 1 month ago

Thank you so much. Please help me explain the issue to the administrator. Is there some config the admin can adjust concerning the maximum result size in memory?
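For reference when raising this with the administrator: the limit in the error message is the cluster-level Spark property `spark.driver.maxResultSize` (default 4g), which caps the total size of serialized results collected to the driver. An admin can raise it in the cluster's Spark config, for example (the 8g value is illustrative, not a recommendation):

spark.driver.maxResultSize 8g

Setting it to 0 removes the limit entirely, but that risks an out-of-memory error on the driver; keeping individual result sets small (smaller fetches, or server-side pagination) is usually the safer route.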