CODAIT / spark-netezza

Netezza Connector for Apache Spark
Apache License 2.0

NZ Write Feature #6

Open rama-mullapudi opened 8 years ago

rama-mullapudi commented 8 years ago

Can the package be extended to write from Spark to NZ using an external table? I tried writing to CSV from Spark and then running the NZ insert from Spark using an external source, and it works fine:

```scala
val sql = con.prepareStatement(
  s"INSERT INTO foo SELECT * FROM EXTERNAL '$file' " +
  "USING (REMOTESOURCE 'JDBC' DELIMITER ',' QUOTEDVALUE DOUBLE ENCODING 'internal' FORMAT 'Text')")
```
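
For reference, a minimal end-to-end sketch of that two-step flow (not the connector's API): Spark exports delimited text, then a plain JDBC statement loads each part file through a transient external table. The connection URL, credentials, table name `foo`, and export path are placeholders, and the export is assumed to land on a filesystem visible to the JDBC client.

```scala
import java.io.File
import java.sql.DriverManager
import org.apache.spark.sql.DataFrame

def writeViaExternalTable(df: DataFrame): Unit = {
  // Placeholder connection details; the Netezza JDBC driver must be on the classpath.
  val url      = "jdbc:netezza://nzhost:5480/mydb"
  val user     = "admin"
  val password = "password"

  // Step 1: export the DataFrame as comma-delimited text (Spark 2.x API;
  // on Spark 1.x the spark-csv package provides the same format).
  // Assumes the default filesystem is local so the files are visible below.
  val exportDir = "/tmp/foo_export"
  df.write.csv(exportDir)

  // Step 2: load each part file through a transient external table.
  // REMOTESOURCE 'JDBC' makes the driver stream the client-side file to the server.
  val con = DriverManager.getConnection(url, user, password)
  try {
    val parts = new File(exportDir).listFiles().filter(_.getName.startsWith("part-"))
    for (part <- parts) {
      val stmt = con.prepareStatement(
        s"INSERT INTO foo SELECT * FROM EXTERNAL '${part.getAbsolutePath}' " +
        "USING (REMOTESOURCE 'JDBC' DELIMITER ',' QUOTEDVALUE DOUBLE ENCODING 'internal' FORMAT 'Text')")
      stmt.executeUpdate()
      stmt.close()
    }
  } finally {
    con.close()
  }
}
```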

If we could use named pipes instead of writing to a physical file, this would be even more efficient and could run in parallel on the executors instead of only on the master node.
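
One way that pipe-based, parallel variant could look (a sketch only, not the connector's implementation): each partition creates a FIFO on its executor, feeds its rows into it from a background thread, and runs the same external-table INSERT over its own JDBC connection. Whether the Netezza JDBC driver will stream from a FIFO needs verifying, and the URL, credentials, and table name `foo` are placeholders; no quoting or escaping of row values is handled here.

```scala
import java.io.{BufferedWriter, File, FileWriter}
import java.sql.DriverManager
import org.apache.spark.sql.DataFrame

def writeViaNamedPipes(df: DataFrame): Unit = {
  // Placeholder connection details; the Netezza JDBC driver must be on the executor classpath.
  val url      = "jdbc:netezza://nzhost:5480/mydb"
  val user     = "admin"
  val password = "password"

  df.rdd.foreachPartition { rows =>
    // One FIFO per partition on the executor's local disk (Linux only).
    val pipePath = s"/tmp/nz_pipe_${java.util.UUID.randomUUID()}"
    new ProcessBuilder("mkfifo", pipePath).start().waitFor()

    val con = DriverManager.getConnection(url, user, password)
    try {
      // The INSERT blocks until data arrives, so feed the pipe from a separate thread.
      val feeder = new Thread(new Runnable {
        override def run(): Unit = {
          val out = new BufferedWriter(new FileWriter(pipePath))
          try rows.foreach { r => out.write(r.mkString(",")); out.newLine() }
          finally out.close()
        }
      })
      feeder.start()

      val stmt = con.prepareStatement(
        s"INSERT INTO foo SELECT * FROM EXTERNAL '$pipePath' " +
        "USING (REMOTESOURCE 'JDBC' DELIMITER ',' ENCODING 'internal' FORMAT 'Text')")
      stmt.executeUpdate()
      stmt.close()
      feeder.join()
    } finally {
      con.close()
      new File(pipePath).delete()
    }
  }
}
```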

sureshthalamati commented 8 years ago

Thank you for the input Rama. I am working on this currently.

romanyusfin commented 8 years ago

That would be a very useful enhancement. How's the progress on it? Thank you, Roman

bquinart commented 8 years ago

This would indeed be a nice feature. Are you still working on this? Thanks, Bruno

fajarnugraha commented 6 years ago

A write feature would be awesome. A current workaround is to use Sqoop with --direct, but that doesn't work for Parquet/ORC.

Last commit is over a year ago though. Is this project abandoned?