turb opened 2 months ago
For CSV, we've actually relied on the kantan library. It seems possible to implement the IO simply with a `RowDecoder`.
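For reference, a minimal sketch of what kantan's decoding side provides (assuming the `kantan.csv.generic` module for case-class derivation; the `Thing` case class and sample row are illustrative):

```scala
import kantan.csv._
import kantan.csv.generic._ // automatic RowDecoder derivation for case classes

case class Thing(id: String, value: Int)

// Summon the derived decoder and decode one already-parsed row.
val result = implicitly[RowDecoder[Thing]].decode(Seq("a", "1"))
// result == Right(Thing("a", 1))
```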
We'd be very happy to see a new contribution from your side!
> For CSV, we've actually relied on the kantan library. It seems possible to implement the IO simply with a `RowDecoder`.
The parser is implemented on the Beam side, using opencsv; only the downstream mapper can be specified. So it would need a PR on Beam to allow specifying another parser.
I meant to leverage the decoding part of kantan, with something like:
```scala
import kantan.csv.RowDecoder
import org.apache.beam.sdk.io.snowflake.SnowflakeIO

val thingMapper = new SnowflakeIO.CsvMapper[Thing] {
  override def mapRow(parts: Array[String]): Thing =
    implicitly[RowDecoder[Thing]].unsafeDecode(parts.toSeq)
}
```
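Note that kantan's `unsafeDecode` throws on a malformed row; the safe `decode` variant returns an `Either` instead, which may be preferable if decoding errors should be surfaced rather than fail the bundle.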
@RustedBones opened #5502
Hello here,
Apache Beam has Snowflake support, so it's possible to use it with something like the following.
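A rough sketch, wrapping Beam's `SnowflakeIO.read()` in Scio's `customInput`; the connection settings, table name, staging bucket, and the `Thing` mapper are illustrative, not a verified configuration:

```scala
import com.spotify.scio.ContextAndArgs
import org.apache.beam.sdk.coders.SerializableCoder
import org.apache.beam.sdk.io.snowflake.SnowflakeIO

case class Thing(id: String, value: Int)

def main(cmdlineArgs: Array[String]): Unit = {
  val (sc, _) = ContextAndArgs(cmdlineArgs)

  // Hand-written mapper from a parsed CSV row to Thing.
  val csvMapper = new SnowflakeIO.CsvMapper[Thing] {
    override def mapRow(parts: Array[String]): Thing =
      Thing(parts(0), parts(1).toInt)
  }

  // Illustrative connection settings.
  val config = SnowflakeIO.DataSourceConfiguration
    .create()
    .withUsernamePasswordAuth("user", "password")
    .withServerName("account.snowflakecomputing.com")
    .withDatabase("DB")
    .withSchema("PUBLIC")

  // Wrap the raw Beam transform so it yields an SCollection[Thing].
  val things = sc.customInput(
    "ReadFromSnowflake",
    SnowflakeIO
      .read[Thing]()
      .withDataSourceConfiguration(config)
      .fromTable("THINGS")
      .withStagingBucketName("gs://my-staging-bucket")
      .withStorageIntegrationName("my_integration")
      .withCsvMapper(csvMapper)
      .withCoder(SerializableCoder.of(classOf[Thing]))
  )

  sc.run()
  ()
}
```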
However a proper scio integration would be great. I suppose deriving `Thing` to a `SnowflakeIO.CsvMapper` would first need a PR in magnolify? I can work on it.
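For illustration, one possible shape for that derivation, bridging kantan's `RowDecoder` to Beam's mapper rather than going through magnolify (the `csvMapperOf` helper is hypothetical, not an existing scio, Beam, or magnolify API):

```scala
import kantan.csv.RowDecoder
import org.apache.beam.sdk.io.snowflake.SnowflakeIO

// Hypothetical bridge: any T with a kantan RowDecoder gets a CsvMapper.
// Note: the captured decoder must be serializable for Beam to ship it.
def csvMapperOf[T](implicit rd: RowDecoder[T]): SnowflakeIO.CsvMapper[T] =
  new SnowflakeIO.CsvMapper[T] {
    override def mapRow(parts: Array[String]): T = rd.unsafeDecode(parts.toSeq)
  }

// Usage, assuming a RowDecoder[Thing] is in scope (e.g. via kantan.csv.generic):
// val thingMapper = csvMapperOf[Thing]
```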