Open Ahdw opened 2 years ago
Found the same issue using Beam versions 2.48 and 2.53 as well.
Same issue when writing a BYTES BigQuery column into Bigtable (Apache Beam 2.57). For context: I'm trying to write numbers into Bigtable in a form compatible with Bigtable's increment operation (big-endian). If the BQ field is INT, this template writes it as "10" (the base-10 string representation); if the BQ field is BYTES, the template writes "java.nio.HeapBytesBuffer[pos=0 lim=8 cap=8]" as the Bigtable value.
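For the increment use case mentioned above: Bigtable's increment expects the cell to already hold a 64-bit big-endian signed integer. A minimal Python sketch of the encoding the template would need to produce (as opposed to the base-10 string it writes today):

```python
import struct

# Bigtable increments operate on an 8-byte big-endian signed integer.
# struct.pack(">q", n) produces exactly that encoding.
value = struct.pack(">q", 10)
print(value)  # b'\x00\x00\x00\x00\x00\x00\x00\n'

# What the template writes instead for an INT column: the string "10".
print(str(10).encode())  # b'10'
```

Incrementing a cell that contains b'10' (or a ByteBuffer's toString) fails or corrupts the counter, which is why the big-endian byte encoding matters here.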
Related Template(s)
bigquery-to-bigtable
What happened?
In BigQuery there is a column of type BIGNUMERIC with the value 99999999999999999999999999999999999.
The BigQuery-to-Bigtable template writes the column to Bigtable without throwing any exception; however, when read back from Bigtable, the value is
java.nio.HeapBytesBuffer[pos=0 lim=32 cap=32]
I used cbt read and Bigtable's Python client library to read that value, and they both return the same string.

Beam Version
Newer than 2.35.0
Relevant log output
No response