Closed dynamike2010 closed 7 months ago
This is not a connector issue. The errors are all happening before they reach the actual connector code. You should see the same thing happen with any other connector. This is an issue with your value.converter settings:
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://redpanda:8081",
"value.converter.schemas.enable": "false",
"value.converter.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy",
"value.serializer.auto.register.schemas": "false",
"value.subject.name.strategy": "topic"
You're better off reaching out to Confluent or Redpanda folks for help with this.
That said, I can give you some pointers based on my experience.
I suspect these are not the correct names for the configs: `value.serializer.auto.register.schemas` and `value.subject.name.strategy`. These should usually be prefixed with `value.converter.`. In addition, you will want to check that `value.converter.subject.name.strategy` is actually `io.confluent.kafka.serializers.subject.TopicNameStrategy`, to make sure the registry client resolves schemas correctly.
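For example, the serializer-level options passed through the converter prefix might look like this. This is a sketch, not a verified config: I believe `auto.register.schemas` and the subject-name-strategy setting are forwarded to the underlying serializer when prefixed with `value.converter.`, but you should double-check the exact property names against the converter docs for your Connect version:

```json
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://redpanda:8081",
"value.converter.auto.register.schemas": "false",
"value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy"
```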
> On the registry machine I'm getting this (I don't even know what is being sent, or why):
>
> ```
> "POST /subjects/cars-value?normalize=false&deleted=true HTTP/1.1" 404 49 "-" "Java/11.0.22" 122 (io.confluent.rest-utils.requests)
> ```
This is a bit weird. I'm not familiar with that route: `POST /subjects/cars-value?normalize=false&deleted=true`. I'm familiar with the `POST /subjects/cars-value?normalize=false` route. I didn't know you could append `deleted=true` to it, TBH. It doesn't make much sense to me 🤷 Might be worth digging into that more.
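For reference, `POST /subjects/<subject>` is the registry's schema-lookup endpoint: the client sends a schema and asks whether that exact schema is already registered under the subject. As far as I can tell, `deleted=true` just widens the lookup to include soft-deleted versions. You can reproduce the call by hand with curl (the schema body below is a hypothetical example, and the registry URL is the one from the converter config above):

```shell
# Ask the registry whether this exact schema is registered under cars-value.
# A 404 here (error code 40403, "schema not found") is what the AvroConverter
# hits when the schema it derived from the record doesn't match any registered
# version exactly.
curl -s -X POST \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\":\"record\",\"name\":\"car\",\"fields\":[{\"name\":\"model\",\"type\":\"string\"}]}"}' \
  "http://redpanda:8081/subjects/cars-value?normalize=false&deleted=true"
```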
That's the point. At first, when I was using Redpanda as the SR, it could have been a compatibility issue, but once I switched to the Confluent SR, both Connect and SR are from the same "provider", so in theory it SHOULD work / be well tested. I encountered this `&deleted=true` here https://github.com/redpanda-data/redpanda/issues/11912 as someone had a similar problem, but the issue wasn't solved there.
OK. I was searching using this `&deleted=true` and found a dozen entries, one of them pointing to a bug that changes YOUR SCHEMA ;-) So it seems that it converts every field of `type="string"`, so my schema needs to be something like this:
```json
{
  "type": "record",
  "name": "car",
  "fields": [
    {
      "name": "model",
      "type": "string",
      "avro.java.string": "String"
    },
    {
      "name": "make",
      "type": "string",
      "avro.java.string": "String"
    },
    {
      "name": "year",
      "type": "float"
    },
    {
      "name": "type",
      "type": [
        "null",
        {
          "type": "string",
          "avro.java.string": "String"
        }
      ],
      "default": null
    }
  ]
}
```
So it has nothing to do with the Iceberg sink, because the error happens between Kafka Connect and the schema registry.
Just make sure, if you have this problem, to enrich your string type from:

```json
{
  "type": "string"
}
```

to

```json
{
  "type": "string",
  "avro.java.string": "String"
}
```
I will try to find an "official" response/articles on this and will paste them here.
My setup:
Pushing sample messages (an example with the Redpanda car schema, using the rpk tool) from the topic to tables using schema v1: everything works fine, and I can see parquet files in S3 no matter how many times I send a message. The key is a String and the value is Avro (detailed config below). For example, I send like this:
echo "{\"model\":\"rs6\",\"make\":\"audi\",\"year\":2021}" | rpk topic produce cars --schema-id=1 -k 1
and later I send like this:
echo "{\"model\":\"rs3\",\"make\":\"audi\",\"year\":2022,\"type\":{\"string\":\"sport\"}}" | rpk topic produce cars --schema-id=2 -k 2
In the new version there is a new, optional field added to the schema (schema ID == 2, version == 2); however, when pushing messages with either `--schema-id=2` or `--schema-id=topic`, I always get the following error/stack on the Connect machine:
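To rule out a registration problem on the registry side, it may also help to confirm what the registry actually holds for the subject. A sketch, assuming the registry URL from the sink config (these are standard Schema Registry read endpoints):

```shell
# List the versions registered under the cars-value subject (expecting [1,2]):
curl -s http://redpanda:8081/subjects/cars-value/versions

# Fetch version 2 to confirm it contains the new optional "type" field:
curl -s http://redpanda:8081/subjects/cars-value/versions/2
```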
On the registry machine I'm getting this (I don't even know what is being sent, or why):

```
"POST /subjects/cars-value?normalize=false&deleted=true HTTP/1.1" 404 49 "-" "Java/11.0.22" 122 (io.confluent.rest-utils.requests)
```
Schema v1 - cars-value subject:
Schema v2 - cars-value subject:
Here is config for sink:
Current docker compose (I tried many variations):
What am I missing?
Thank you!