GoogleCloudPlatform / DataflowTemplates

Cloud Dataflow Google-provided templates for solving in-Cloud data tasks
https://cloud.google.com/dataflow/docs/guides/templates/provided-templates
Apache License 2.0

[Bug]: Writing BigNumeric from BigQuery to BigTable results in Java exception message #453

Open Ahdw opened 2 years ago

Ahdw commented 2 years ago

Related Template(s)

bigquery-to-bigtable

What happened?

In BigQuery there's a column of BIGNUMERIC datatype, with the value 99999999999999999999999999999999999.

The BigQuery-to-BigTable template writes the column to BigTable without throwing any exception; however, when the value is read back from BigTable, it comes out as the string java.nio.HeapByteBuffer[pos=0 lim=32 cap=32].

I used cbt read and BigTable's Python library to read that value, and both return the same string.
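That string is exactly what java.nio.ByteBuffer.toString() produces, which suggests the template (or the Avro layer it reads from) is stringifying a ByteBuffer instead of extracting its bytes. A minimal, hypothetical Java sketch (not the template's actual code) that reproduces the symptom and shows a correct extraction:

```java
import java.nio.ByteBuffer;

public class ByteBufferSymptom {
    public static void main(String[] args) {
        // A BIGNUMERIC value arrives from the BigQuery Avro export as a
        // 32-byte scaled integer wrapped in a ByteBuffer.
        ByteBuffer buf = ByteBuffer.wrap(new byte[32]);

        // Buggy path: implicit toString() produces the class name and
        // position/limit/capacity, not the contents -- this is the string
        // that ends up stored in BigTable.
        String wrong = String.valueOf(buf);
        System.out.println(wrong); // java.nio.HeapByteBuffer[pos=0 lim=32 cap=32]

        // Correct path: copy the raw bytes out of the buffer.
        // duplicate() is used so the original buffer's position is untouched.
        byte[] raw = new byte[buf.remaining()];
        buf.duplicate().get(raw);
        System.out.println(raw.length); // 32
    }
}
```

The same pattern explains why the 32-byte BIGNUMERIC value round-trips as a 32-character-irrelevant `pos=0 lim=32 cap=32` string: the byte payload is never read at all.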

Beam Version

Newer than 2.35.0

Relevant log output

No response

ee07dazn commented 6 months ago

Found the same issue with Beam versions 2.48 and 2.53 as well.

ego157 commented 1 month ago

Same issue when writing a BYTES BigQuery column into BigTable (Apache Beam 2.57). For context: I'm trying to write numbers into BigTable in a form compatible with BigTable's increments (big-endian). If the BQ field is INT, this template writes it as "10" (its base-10 string representation); if the BQ field is BYTES, the template writes "java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]" as the BigTable value.
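For readers hitting the same wall: BigTable's ReadModifyWriteRow increment operates on a cell holding a 64-bit big-endian signed integer, so the cell must contain exactly 8 raw bytes. A hedged sketch (names are illustrative, not from the template) of producing such a value in Java:

```java
import java.nio.ByteBuffer;

public class BigEndianIncrementValue {
    // Encode a long as the 8-byte big-endian value BigTable increments expect.
    static byte[] toBigEndian(long value) {
        // ByteBuffer's default byte order is already BIG_ENDIAN.
        return ByteBuffer.allocate(8).putLong(value).array();
    }

    public static void main(String[] args) {
        byte[] cell = toBigEndian(10L);
        System.out.println(cell.length);   // 8
        System.out.println(cell[7]);       // 10 (low-order byte last in big-endian)
    }
}
```

This is what the template would need to write for the BYTES case; writing the buffer's toString() (as reported above) produces a cell that increments reject as malformed.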