I am using the spark-snowflake connector to write data into Snowflake over a JDBC connection.
I can write small DataFrames (~300 records) into Snowflake, but the job gets stuck for anything above 3,500 records.
The same job works fine when I run it in local mode via IntelliJ.
Is there any option to improve write performance? I tried SnowflakeConnectorUtils.enablePushdownSession(spark) but am still facing the same issue.
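For reference, the write path looks roughly like this; the connection values, the target table name, and the writeToSnowflake helper are placeholders for illustration, not the exact production code:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import net.snowflake.spark.snowflake.SnowflakeConnectorUtils
import net.snowflake.spark.snowflake.Utils.SNOWFLAKE_SOURCE_NAME

val spark = SparkSession.builder().appName("snowflake-write").getOrCreate()

// Snowflake connection options (placeholder values; real credentials elided)
val sfOptions = Map(
  "sfURL"       -> "<account>.snowflakecomputing.com",
  "sfUser"      -> "<user>",
  "sfPassword"  -> "<password>",
  "sfDatabase"  -> "<database>",
  "sfSchema"    -> "<schema>",
  "sfWarehouse" -> "<warehouse>"
)

// Pushdown enabled as mentioned above
SnowflakeConnectorUtils.enablePushdownSession(spark)

// Hypothetical helper: appends a DataFrame to a Snowflake table
def writeToSnowflake(df: DataFrame): Unit =
  df.write
    .format(SNOWFLAKE_SOURCE_NAME)     // "net.snowflake.spark.snowflake"
    .options(sfOptions)
    .option("dbtable", "TARGET_TABLE") // illustrative target table name
    .mode("append")
    .save()
```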
Spark version: 2.4.4.2-mapr-630
Scala version: 2.11.12
--packages org.scalaj:scalaj-http_2.11:2.3.0,net.snowflake:spark-snowflake_2.11:2.9.3-spark_2.4,net.snowflake:snowflake-jdbc:3.13.14
Is the writer suitable for production jobs that write billions of records?