Closed solamanhuq closed 8 months ago
Is there any possible workaround for this issue while it isn't fixed?
@brantian I mentioned a workaround in the description:
> EDIT: It appears that if I pass in the number of days since 1970-01-01, it will successfully serialize and save in the storage write API.

So, if you take the date string and convert it to an int (days since epoch), it will work as intended.
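For reference, a minimal sketch of that conversion in plain JavaScript (assuming the DATE string is an ISO `YYYY-MM-DD` value interpreted as UTC):

```javascript
// Convert a 'YYYY-MM-DD' string to the number of days since 1970-01-01,
// which is the int encoding the storage write API expects for DATE fields.
function dateStringToEpochDays(dateString) {
  const millis = Date.parse(`${dateString}T00:00:00Z`) // parse as UTC midnight
  return Math.floor(millis / 86400000)                 // 86400000 ms per day
}

console.log(dateStringToEpochDays('1970-01-01')) // 0
console.log(dateStringToEpochDays('2023-01-01')) // 19358
```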
At the moment, if I use `adapt` to serialize JSON with a DATE field as a string, e.g. `2023-01-01`, the serialization will convert it to an INT value `0`. Converting it to an epoch timestamp will allow it to serialize, but then the write API fails it with `Could not convert value to date.`

I'm not sure what the expectation is, but as of now there is an incompatibility between the client and the storage write API.
EDIT: It appears that if I pass in the number of days since 1970-01-01, it will successfully serialize and save in the storage write API.
It's a bit unfortunate that the type is restricted; since strings work for streaming inserts, it would be nice if they worked for the storage write API too. But all the same, this is something we can live with.
Environment details

`@google-cloud/bigquery-storage` version: `adapt` branch

Steps to reproduce

Use `adapt` and `protojs` to write JSON events to a table:

```js
const events = [{
  coolthing: 'test close goal',
  date: '2023-01-01',
  ts: '2023-01-01T06:24:56',
}]

const serializedRows = events.map(event =>
  protoType.encode(event).finish()
)

// error: date field is now 0
const deserializedRows = serializedRows.map(event =>
  protoType.decode(event)
)

const stream = bigQueryWriteClient.appendRows(
  // open connection to stream
)

// throws 'error' event
stream.write({
  // something something serialized rows
})
```
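Applying the workaround from the EDIT above, the DATE field can be pre-converted to an int before encoding. A sketch (the `toEpochDays` helper is mine, not part of the client library):

```javascript
// Hypothetical helper: days since 1970-01-01 for a 'YYYY-MM-DD' string (UTC).
const toEpochDays = (s) => Math.floor(Date.parse(`${s}T00:00:00Z`) / 86400000)

// Pre-convert DATE fields to ints so the proto serialization round-trips
// and the storage write API accepts the value.
const events = [{
  coolthing: 'test close goal',
  date: toEpochDays('2023-01-01'), // 19358 instead of the string
  ts: '2023-01-01T06:24:56',
}]
```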