-
Using Spark 3.2.0 with Spark Structured Streaming, I am trying to write to Kafka with the Confluent Schema Registry via `"za.co.absa" % "abris_2.12" % "6.0.0"`.
However, I am getting the exception below abou…
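For reference, a minimal write-path sketch with ABRiS 6.x looks roughly like the following; the topic name, registry URL, and checkpoint path are placeholders, and the builder method names should be checked against the ABRiS 6 documentation:

```scala
import org.apache.spark.sql.functions.{col, struct}
import za.co.absa.abris.avro.functions.to_avro
import za.co.absa.abris.config.AbrisConfig

// Placeholder values -- adjust to your environment
val topic = "my-topic"
val schemaRegistryUrl = "http://localhost:8081"

// Use the latest registered value schema under the topic-name strategy
val abrisConfig = AbrisConfig
  .toConfluentAvro
  .downloadSchemaByLatestVersion          // or provideAndRegisterSchema(...)
  .andTopicNameStrategy(topic)
  .usingSchemaRegistry(schemaRegistryUrl)

// df is an existing streaming DataFrame; serialize all columns into a
// single Confluent-framed Avro "value" column
val avroDf = df.select(to_avro(struct(df.columns.map(col): _*), abrisConfig).as("value"))

avroDf.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", topic)
  .option("checkpointLocation", "/tmp/checkpoint")   // placeholder path
  .start()
```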
-
**Describe the bug**
When the source code contains one or more lines longer than 1911 characters (bytes?), `ut_coverage_html_reporter` crashes.
**Provide version info**
```
21.0…
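```

A quick way to reproduce is to generate a source file containing a single line over the reported 1911-character threshold and feed it to the reporter; the filename and comment syntax below are placeholders:

```scala
import java.nio.file.{Files, Paths}

// Hypothetical repro file: one comment line of roughly 2000 characters,
// which exceeds the reported 1911-character limit
val longLine = "// " + ("x" * 2000)
Files.write(Paths.get("long_line_repro.txt"), longLine.getBytes("UTF-8"))
```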
-
Hello everyone,
I'm working on a project with Confluent, using the Kafka Connect Oracle CDC connector, which serializes records with io.confluent.connect.avro.AvroConverter. The problem is that in…
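For context, the usual way to decode such Confluent-framed Avro in Spark with ABRiS 6.x is roughly the following; the topic and registry URL are placeholders, and the builder names should be verified against the ABRiS docs:

```scala
import org.apache.spark.sql.functions.col
import za.co.absa.abris.avro.functions.from_avro
import za.co.absa.abris.config.AbrisConfig

val abrisConfig = AbrisConfig
  .fromConfluentAvro
  .downloadReaderSchemaByLatestVersion
  .andTopicNameStrategy("oracle-cdc-topic")      // placeholder topic
  .usingSchemaRegistry("http://localhost:8081")  // placeholder URL

// kafkaDf is a DataFrame read from Kafka with a binary "value" column
val decoded = kafkaDf.select(from_avro(col("value"), abrisConfig).as("data"))
```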
-
The commercial (Databricks) edition allows specifying a key when writing Avro, as shown here: https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/avro-dataframe
Sadly, this was …
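In ABRiS this can typically be approximated by serializing the key column with its own config; the topic-name strategy accepts an `isKey` flag that switches the registry subject suffix. A sketch, with placeholder topic, URL, and column names, and builder names hedged against the ABRiS 6 docs:

```scala
import org.apache.spark.sql.functions.col
import za.co.absa.abris.avro.functions.to_avro
import za.co.absa.abris.config.AbrisConfig

val topic = "my-topic"                        // placeholder
val registryUrl = "http://localhost:8081"     // placeholder

// Separate configs for key and value; isKey selects the "-key" subject
val keyConfig = AbrisConfig.toConfluentAvro
  .downloadSchemaByLatestVersion
  .andTopicNameStrategy(topic, isKey = true)
  .usingSchemaRegistry(registryUrl)

val valueConfig = AbrisConfig.toConfluentAvro
  .downloadSchemaByLatestVersion
  .andTopicNameStrategy(topic)
  .usingSchemaRegistry(registryUrl)

// "id" and "payload" are hypothetical column names
df.select(
    to_avro(col("id"), keyConfig).as("key"),
    to_avro(col("payload"), valueConfig).as("value"))
  .write
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", topic)
  .save()
```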
-
On Spark I cannot load this great library, on either Spark 2.x or 3.x:
```
/usr/local/Cellar/apache-spark/3.0.0/libexec/bin/spark-shell --master 'local[4]'\
--packages org.apache.spark:spark-a…
-
Summary of request: Add a new organization to ROR
Name of organization: Albanian Academy of Sciences
Website: http://akad.gov.al/ash/
Link to publications: https://punctumbooks.com/titles/broken-…
-
### I'm trying to register a schema in Confluent Kafka, but am facing the following issue.
This code is being used on Azure Databricks, Runtime 9.1 LTS, scala 2.12, Spark 3.1.2
**Code:**
```
val reg…
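```

For comparison, registering a schema directly with the Confluent client can be sketched as follows; the subject name, URL, and schema are placeholders, and `AvroSchema` is the Confluent 5.5+ API:

```scala
import io.confluent.kafka.schemaregistry.avro.AvroSchema
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient

// Placeholder schema for illustration
val schemaString =
  """{"type":"record","name":"Example","fields":[{"name":"id","type":"long"}]}"""

// 100 = max number of schemas cached per subject
val client = new CachedSchemaRegistryClient("http://localhost:8081", 100)
val schemaId = client.register("my-topic-value", new AvroSchema(schemaString))
```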
-
The config file used is configs/yolov3/yolov3_darknet53_270e_coco.yml, with two modifications: first, worker_num was increased from 2 to 8; second, the anchors were changed. Training starts without any problem, but after several epochs the following error is suddenly raised. Where could the problem be?
![image](https://user-images.githubusercontent.com/61613537/14415…
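The two edits described above correspond to something like the following in the config file; the anchor values shown are the standard YOLOv3 COCO defaults as placeholders, since the modified values were not shared:

```yaml
worker_num: 8   # increased from 2

# anchors were re-clustered for the dataset; values below are placeholders
anchors: [[10, 13], [16, 30], [33, 23],
          [30, 61], [62, 45], [59, 119],
          [116, 90], [156, 198], [373, 326]]
```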
-
Hello!
I am currently facing the following issue:
1. We get Avro records from a topic that we read with Spark Streaming (2.4.x)
2. One of the Avro records contains a malformed byte array (the…
-
We have a deeply nested structure: `{ payload: { thisthing: { ....`
Abris ends up generating:
```
type: record
name: payload
namespace: abc
fields [
  type: record
  name: thisthing
  namespace:…
```