-
We’re using Logstash to write event data to a Kafka queue. The consumer of this queue expects an Avro binary blob (or fragment; the terminology seems to vary). It seems, however, that using…
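A minimal Logstash pipeline sketch for this setup (the topic name and schema path are placeholders, and it assumes the logstash-codec-avro plugin is installed; note that the Avro codec may base64-encode its output depending on version and settings):

```
output {
  kafka {
    topic_id => "events"                        # placeholder topic
    codec => avro {
      schema_uri => "/etc/logstash/event.avsc"  # placeholder schema path
    }
  }
}
```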
-
## Summary
Here are some thoughts on how to decouple the Vulcan API from the Java Avro SDK (JAvro), opening the way to adding an alternative
backend that implements Avro directly. Previously we di…
-
Hi everyone,
I'm new to the **linkedin/goavro** library, and I'm trying to use it to encode a complex structure. I have the following code to encode a nested **User** structure:
```
pack…
```
-
I am consuming Kafka messages. The messages are in Avro format, and the following errors are reported at runtime.
```
avro-logstash_1 | [2019-12-13T15:38:13,239][INFO ][org.apache.kafka.clients.consumer.interna…
```
-
We want to use this library so that Logstash can work with schemas from our Schema Registry. But when deserializing the data we get this error:
```
Avro::SchemaParseError: Error validating default for workday_…
```
-
We keep our Avro schemas in Confluent's [Schema Registry](http://docs.confluent.io/2.0.0/schema-registry/docs/index.html). It would be great if we could point the `schema_uri` to the registry's API, b…
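Fetching the latest schema for a subject from the registry's REST API is a small amount of code; a sketch with the Python standard library (the registry URL and subject name are placeholder assumptions):

```python
import json
from urllib.request import urlopen

REGISTRY = "http://localhost:8081"  # assumption: Schema Registry's default port

def latest_schema_url(subject: str) -> str:
    # Confluent Schema Registry REST endpoint for the newest schema version
    return f"{REGISTRY}/subjects/{subject}/versions/latest"

def fetch_latest_schema(subject: str) -> str:
    # The response body is JSON of the form:
    # {"subject": ..., "version": ..., "id": ..., "schema": "<Avro schema JSON>"}
    with urlopen(latest_schema_url(subject)) as resp:
        return json.loads(resp.read())["schema"]
```

Supporting a registry URL in `schema_uri` would essentially mean doing this lookup (plus caching by schema id) instead of reading a local file.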
-
### Apache Iceberg version
1.4.3 (latest release)
### Query engine
Trino
### Please describe the bug 🐞
In Trino 436 (Iceberg 1.4.3), the `write.parquet.compression-codec` property is also b…
-
Hi.
Great project. I am using MATLAB 2020a. I am new to this, so please excuse me if this is an obvious question.
Is it possible to read a snappy-compressed Avro file, and if so, how?
When …
-
Hello,
I am forced to upgrade to the 5.0.0 alpha4 Kafka input plugin for Kafka 0.10 support. Now I am running into a dependency problem with the Avro codec. Apparently it has issues with logstash-core. See …
-
The spec states "A logical type is always serialized using its underlying Avro type so that values are encoded in exactly the same way as the equivalent Avro type that does not have a logicalType attr…
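Concretely, that sentence means a value with, say, `{"type": "long", "logicalType": "timestamp-millis"}` is written with the plain `long` binary encoding (zigzag, then variable-length base-128) and contributes no extra bytes for the logical type. A minimal sketch of that underlying encoding (hand-rolled for illustration, not taken from any Avro library):

```python
def encode_long(n: int) -> bytes:
    """Avro binary encoding of a long: zigzag, then little-endian base-128 varint."""
    z = (n << 1) ^ (n >> 63)  # zigzag: small magnitudes (positive or negative) -> small codes
    out = bytearray()
    while True:
        byte = z & 0x7F
        z >>= 7
        if z:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            break
    return bytes(out)

# Known Avro long encodings:
#   encode_long(1)  -> b'\x02'
#   encode_long(-1) -> b'\x01'
#   encode_long(64) -> b'\x80\x01'
# A timestamp-millis value goes through exactly this path: the logicalType
# attribute affects interpretation, never the bytes on the wire.
```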