wenfang6 opened 2 days ago
@wenfang6 We insert a fake row to support native write, but it falls back to the vanilla Spark writer here. It seems that isNativeApplicable is not set correctly. Does your code include this patch? And could you provide the SQL that reproduces this? Thanks.
A simple SQL statement also hits this error, for example:
insert overwrite table wen_test_par1 partition (ds = '2024-10-09')
select * from wen_test;
Gluten plan:
== Fallback Summary ==
No fallback nodes
== Physical Plan ==
Execute InsertIntoHiveTable (4)
+- FakeRowAdaptor (3)
+- ^ NativeScan hive dap_dev.wen_test (1)
We use Spark 3.2.1.
@wenfang6 The Gluten native writer in Spark 3.2.1 overrides the vanilla Spark HiveFileFormat class. Therefore, you must ensure that the Gluten jar is loaded before the vanilla Spark jar. You can refer to this document for the configuration. Thanks.
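One common way to ensure the Gluten jar precedes the vanilla Spark jars is to prepend it to the driver and executor classpaths. A minimal sketch, assuming a Velox bundle jar; the jar path and the plugin class name (io.glutenproject.GlutenPlugin, used by Gluten releases of that era) are assumptions, not taken from this thread:

```shell
# Sketch: prepend the Gluten bundle jar so its HiveFileFormat class
# is found before the vanilla Spark one. The jar path is a placeholder.
spark-sql \
  --conf spark.driver.extraClassPath=/path/to/gluten-velox-bundle-spark3.2.jar \
  --conf spark.executor.extraClassPath=/path/to/gluten-velox-bundle-spark3.2.jar \
  --conf spark.plugins=io.glutenproject.GlutenPlugin \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=4g
```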
I tried it, but it still doesn't use the native writer. The plan looks like this:
== Fallback Summary ==
No fallback nodes
== Physical Plan ==
CommandResult (1)
+- Execute InsertIntoHiveTable (5)
+- VeloxColumnarToRowExec (4)
+- ^ NativeScan hive dap_dev.wen_test (2)
@wenfang6 Is the above issue fixed based on this document? Also, native write doesn't support complex types. Does your SQL contain a complex type?
Yeah, the above issue is fixed, but it still doesn't use native write. The SQL doesn't contain complex types.
@wenfang6 Is the config spark.gluten.sql.native.writer.enabled enabled in your env? The default value is false.
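For reference, a minimal sketch of enabling the flag named in the comment above and rerunning the reproduced statement. The table names are taken from this thread; whether this alone is sufficient depends on the rest of the session configuration:

```shell
# Enable Gluten's native writer (defaults to false per the comment above).
# Note the key is spark.gluten.sql.native.writer.enabled,
# not spark.gluten.sql.native.hive.writer.enabled.
spark-sql \
  --conf spark.gluten.sql.native.writer.enabled=true \
  -e "insert overwrite table wen_test_par1 partition (ds = '2024-10-09') select * from wen_test;"
```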
I set the conf spark.gluten.sql.native.hive.writer.enabled=true
Backend
VL (Velox)
Bug description
Run SQL: insert overwrite table xx partition (ds = 'xx') select * from xx. There is an error message.
The table format is Parquet. I would like to know whether native write is currently supported for InsertIntoHiveTable.
Spark version
Spark-3.2.x
Spark configurations
spark.gluten.sql.native.writer.enabled=true
System information
No response
Relevant logs
No response