-
This issue tracks the documentation update needed for the merged PR #17759.
Source PR URL: https://github.com/risingwavelabs/risingwave/pull/17759
Source PR Merged At: 2024-07-22T09:35:52Z
-
Hello,
I'm trying to create a Kafka Integration (Confluent) with Avro as the data format, but I keep getting the same error.
My Terraform Rockset integration is working fine; I already created a Kafk…
-
Is it possible to add decimal support for the Avro format to kafkacat?
Right now decimals are displayed as raw byte arrays.
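For context on why the raw bytes are recoverable: Avro's decimal logical type stores the unscaled value as a big-endian two's-complement integer, and the scale comes from the schema rather than the payload. A minimal Python sketch of the decoding (the function name is illustrative, not from kafkacat):

```python
from decimal import Decimal

def avro_decimal(raw: bytes, scale: int) -> Decimal:
    # Avro encodes the decimal logical type as a big-endian
    # two's-complement signed integer in a bytes/fixed field.
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    # The scale is fixed by the writer schema, so shifting the
    # decimal point is all that remains.
    return Decimal(unscaled).scaleb(-scale)

# Example: bytes 0x04 0xD2 encode 1234; with scale 2 that is 12.34.
print(avro_decimal(b"\x04\xd2", 2))  # 12.34
```

A tool only needs the schema's `scale` attribute at display time to render these fields as decimals instead of byte arrays.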
-
I am unable to add CDC records to a snapshot.
Environment:
```
EMR 7.2.0
AmazonCloudWatchAgent 1.300032.2
Hive 3.1.3
Spark 3.5.1
Zeppelin 0.10.1
```
Spark command:
```
spark-shell …
-
We're registering QuestDB with Kafka Connect via curl using the following:
```bash
curl -s -X PUT -H "Content-Type:application/json" http://localhost:8083/connectors/questdb-dietlytics/config …
-
While trying to configure AvroConverter as the value converter as mentioned in the details, we see a class-not-found error. It appears the Confluent Avro converter is not bundled with the connector package. Is …
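For context, the usual remedy is to install the Avro converter separately (for example via Confluent Hub) and make sure the Connect worker's `plugin.path` covers it. A sketch of the relevant worker properties, with hypothetical paths and registry URL:

```properties
# connect-distributed.properties (illustrative values)
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
# The Avro converter ships separately from most connector packages;
# plugin.path must include the directory it was installed into.
plugin.path=/usr/share/java,/usr/share/confluent-hub-components
```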
-
When creating a new source connection through the web UI and selecting Avro as the data format with Confluent Schema Registry as the schema type, users can omit specifying the schema, as it is autom…
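For illustration, the equivalent SQL form of such a source might look like the following RisingWave DDL (topic, broker, and registry URL are placeholders); note that no inline schema is specified, since it is fetched from the registry:

```sql
CREATE SOURCE my_avro_source
WITH (
    connector = 'kafka',
    topic = 'my_topic',
    properties.bootstrap.server = 'broker:9092'
)
FORMAT PLAIN ENCODE AVRO (
    -- Schema is resolved automatically from the registry.
    schema.registry = 'http://schema-registry:8081'
);
```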
-
I write the MOR table through Spark, and the generated files are Parquet files by default.
I would like to know if there is any configuration to make Spark write the MOR table with Avro files?
-
It's needed for Onehouse Debezium transformations:
https://github.com/debezium/debezium-examples/tree/main/tutorial#using-mysql-and-the-avro-message-format
-
**Background:**
Parquet doesn’t have a single canonical in-memory representation like Avro does; it’s a file format whose read/write layer allows the user to select the specific data format they’d li…