mukund26 opened 6 years ago
I am having the same issue. Did you manage to make any headway?
The BigQuery table gets created, however no data is inserted into the tables.
@bob9 Sadly, I couldn't find any way to make it work except hard-coding the table format for the table I was working with.
In my case, using transforms gives me the same issue:

```
transforms=InsertField
transforms.InsertField.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.InsertField.timestamp.field=fieldtime
```
In the case of the transform, I suspect there might be issues because we're fetching schemas from the schema registry. If you're transforming the message such that it no longer has the same schema, which it looks like your transform would do, then I think you'll get a schema mismatch when inserting into BQ.
I don't think there's an easy way to fix this right now, other than to manually modify the table schema in BQ once so that it matches the schema that reflects how you're transforming the data.
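For example, if the InsertField transform adds a `fieldtime` timestamp column, the BQ table schema could be patched once with the `bq` CLI. This is only a sketch: the dataset, table, and column names here are placeholders for your actual schema.

```shell
# Hypothetical names (mydataset.mytable). The inline schema passed to
# `bq update` must list all existing columns plus the new one, since
# BigQuery only allows adding columns, not removing or retyping them.
bq update mydataset.mytable name:STRING,fieldtime:TIMESTAMP
```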
Hi @criccomini, thanks very much for your explanation. If possible, I would like to write a timestamp column with the value of the ROWTIME of the Kafka message.
What is the best way to do it using your connector? Thanks very much.
If it's just for a small number of tables, I suggest just manually setting up the schema in BQ yourself. The way KCBQ works is that it lazily inserts messages into BQ, and only tries to do things with the schema if the inserts fail. If you manually set up the BQ tables to have the proper schema according to the rows you're trying to insert, there should be no problem.
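For a single table, pre-creating it with the expected schema might look like this with the `bq` CLI. The names and columns are placeholders; assume a schema matching the rows after your transform runs.

```shell
# Hypothetical dataset/table/columns. Create the table up front so that
# KCBQ's lazy inserts find a schema that already matches the transformed
# rows, and the insert-then-fix-schema fallback is never triggered.
bq mk --table mydataset.mytable name:STRING,fieldtime:TIMESTAMP
```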
Hi, I want to use KCBQ, but I have one doubt: does KCBQ only perform insert operations into BigQuery, or can it also perform update operations? @criccomini
I am having this error when I send the updated Avro schema to the BigQuery connector: