rama-mullapudi opened this issue 8 years ago
Thank you for the input Rama. I am working on this currently.
That would be a very useful enhancement. How's the progress on it? Thank you, Roman
This would indeed be a nice feature. Are you still working on this? Thanks, Bruno
A write feature would be awesome. Currently a workaround is to use Sqoop with --direct, but that doesn't work for Parquet/ORC.
Last commit is over a year ago though. Is this project abandoned?
Can the package be extended for writing from Spark to NZ using external tables? I tried writing to CSV from Spark and then running an NZ insert from Spark using an external source, and it works fine:

```scala
val sql = con.prepareStatement(
  s"INSERT INTO foo SELECT * FROM EXTERNAL '$file' USING " +
  "(REMOTESOURCE 'JDBC' DELIMITER ',' QUOTEDVALUE DOUBLE " +
  "ENCODING 'internal' FORMAT 'Text')")
```
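A minimal sketch of how that workaround could be wrapped as a helper. The object and method names are hypothetical, and the option values (delimiter, encoding, format) just mirror the statement in the comment above; the resulting SQL would be executed over a normal JDBC connection to Netezza.

```scala
// Sketch of a write path via a Netezza external table (hypothetical helper).
object ExternalTableWrite {
  // Builds the INSERT ... SELECT FROM EXTERNAL statement; the options
  // (REMOTESOURCE, DELIMITER, ENCODING, FORMAT) mirror the example above.
  def buildExternalInsert(table: String,
                          file: String,
                          delimiter: String = ","): String =
    s"INSERT INTO $table SELECT * FROM EXTERNAL '$file' USING " +
      s"(REMOTESOURCE 'JDBC' DELIMITER '$delimiter' QUOTEDVALUE DOUBLE " +
      "ENCODING 'internal' FORMAT 'Text')"
}
```

It would be used as `con.prepareStatement(ExternalTableWrite.buildExternalInsert("foo", csvPath)).execute()` after the DataFrame has been written out as CSV.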
If we could use named pipes instead of writing to a physical file, this could be even more efficient, and each partition could stream its data in parallel instead of funneling everything through the master node.
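To illustrate the named-pipe idea, here is a small self-contained sketch: a writer thread stands in for a Spark partition streaming CSV rows into a FIFO, while a reader stands in for the JDBC REMOTESOURCE consuming it, so no full file ever lands on disk. The paths and helper names are assumptions for illustration, and `mkfifo` is POSIX-only.

```scala
import java.io.{BufferedWriter, FileWriter}
import java.nio.file.{Files, Paths}
import scala.io.Source

object NamedPipeWrite {
  // Creates a FIFO at `path`, runs `body`, then removes the FIFO.
  // mkfifo is POSIX-only; per-task FIFOs would be created like this.
  def withFifo[A](path: String)(body: => A): A = {
    new ProcessBuilder("mkfifo", path).start().waitFor()
    try body finally Files.deleteIfExists(Paths.get(path))
  }

  // Writer thread plays the Spark partition; the reader plays the
  // external-table load reading from the pipe. Both block until the
  // other end opens the FIFO, so they must run concurrently.
  def demo(path: String, rows: Seq[String]): List[String] = withFifo(path) {
    val writer = new Thread(() => {
      val w = new BufferedWriter(new FileWriter(path))
      try rows.foreach { r => w.write(r); w.newLine() } finally w.close()
    })
    writer.start()
    val src = Source.fromFile(path)
    try src.getLines().toList finally { src.close(); writer.join() }
  }
}
```

In a real integration the reader side would be Netezza's external-table load pointed at the FIFO path rather than an in-process consumer.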