Closed · MattBabbage closed this issue 10 months ago
➤ Automation for Jira commented:
The link to the corresponding Jira issue is https://ably.atlassian.net/browse/SDK-3780
Hi @MattBabbage,
Apologies for the long delay; I haven't been working on this for a few months and only just noticed your message in my GitHub notifications. I wanted to drop you a quick note in case you're still stuck on this, as I saw exactly the same error while testing.
If I remember correctly, the reason this was happening for me had nothing to do with the Ably Kafka Connector configuration, but actually the configuration of the client application sending the messages. That error suggests to me that the client application isn't configured to use a Kafka Connector value serialiser and is perhaps just packing raw JSON into the value payload? This is exactly what I was doing initially in the loadtest app in this repository. I fixed it by letting the Kafka client do JSON serialisation for me, which causes it to wrap the JSON payload in an envelope with some magic bytes at the beginning. That stack trace suggests the JSON Deserialiser you've (correctly) configured on the Ably Connector side isn't able to find the magic bytes it expects.
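For context, the "magic bytes" here refer to the Confluent wire format: serialisers that integrate with Schema Registry prefix the payload with a zero magic byte and a 4-byte schema ID. The sketch below (plain Python, illustrative only, not the connector's or loadtest app's actual code) shows why a raw JSON payload fails the deserialiser's check:

```python
import json
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte big-endian schema ID


def wrap_confluent(schema_id: int, obj: dict) -> bytes:
    """Wrap a JSON payload in the Confluent wire-format envelope."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + json.dumps(obj).encode("utf-8")


def unwrap_confluent(data: bytes) -> dict:
    """Unwrap an enveloped payload, rejecting anything without the magic byte."""
    if not data or data[0] != MAGIC_BYTE:
        # Roughly the failure mode the connector's deserialiser reports
        raise ValueError("Unknown magic byte!")
    schema_id = struct.unpack(">I", data[1:5])[0]
    return {"schema_id": schema_id, "payload": json.loads(data[5:])}


# Raw JSON, as a naive producer sends it, starts with '{' and fails the check:
raw = json.dumps({"userId": "42"}).encode("utf-8")
try:
    unwrap_confluent(raw)
except ValueError as e:
    print(e)  # Unknown magic byte!

# The enveloped form round-trips:
wrapped = wrap_confluent(100001, {"userId": "42"})
print(unwrap_confluent(wrapped))
```

In practice the envelope is added for you by letting a Schema Registry-aware serialiser (rather than a plain `json.dumps`) produce the value.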
The changes I made in the Python loadtest app to fix this were:
Have you got any other connectors in your deployment reading the values successfully? Note that I expect only components in the Kafka Connect ecosystem will require this, because they need keys and values serialized in some known way to construct datatypes in the common Kafka Connect format. If you're writing your own consumer code, it generally won't be a problem, as you know how to unpack your own keys/values.
Hope that helps!
Hi @MattBabbage,
Is this issue still relevant? I can confirm what @jaley said: it looks like a problem in the client application that is sending the message.
Hi @MattBabbage,
Unfortunately, we are unable to proceed with resolving this, as no specific details were provided. If you encounter this issue again or have additional details to share, please feel free to reopen this issue with the necessary information. We are here to help and will be happy to assist you further.
Hi,
For reference, I have little experience with Kafka Connect, so this may be a simple issue. I am using the connector via the .zip upload method on Confluent Cloud. All works well until I try to use 'Dynamic Channel Configuration' and reference subvariables within the channel, e.g. "channel": "chat-#{value.userId}".
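To make the expected behaviour concrete, dynamic channel configuration substitutes `#{...}` placeholders with fields from the deserialised record. The sketch below is illustrative only, not the Ably Kafka Connector's actual implementation; it also shows why the value must first be deserialised into a structured object, since a raw byte payload has no `userId` field to look up:

```python
import json
import re


def resolve_channel(template: str, key, value) -> str:
    """Resolve #{...} placeholders (e.g. #{value.userId}) against a record.

    Hypothetical helper for illustration -- not the connector's code.
    """
    record = {"key": key, "value": value}

    def lookup(match: re.Match) -> str:
        path = match.group(1).split(".")  # e.g. ["value", "userId"]
        node = record
        for part in path:
            node = node[part]  # raises KeyError if the field is missing
        return str(node)

    return re.sub(r"#\{([^}]+)\}", lookup, template)


value = json.loads('{"userId": "37", "message": "hi"}')
print(resolve_channel("chat-#{value.userId}", key=None, value=value))  # chat-37
```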
The README references the following:
After following the guide and trying the most common methods, it doesn't seem to be functional. Here is the example configuration:
I am using the Confluent Schema Registry, and as such I am not sending the JSON Schema in with the message (value.converter.schemas.enable is false). This seemingly follows the recommendation in the JSON Connect documentation.
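For anyone following along, the flag above controls whether Kafka Connect's plain `JsonConverter` expects a `{"schema": ..., "payload": ...}` envelope on each message or bare JSON. A minimal sketch of the two shapes (the inline schema shown is a placeholder, not a real inferred schema):

```python
import json


def to_connect_json(obj: dict, schemas_enable: bool) -> bytes:
    """Shape of a value for Kafka Connect's JsonConverter.

    With value.converter.schemas.enable=true the converter expects an
    envelope with "schema" and "payload" keys; with false, bare JSON.
    """
    if schemas_enable:
        envelope = {
            "schema": {"type": "struct", "fields": [], "optional": False},
            "payload": obj,
        }
        return json.dumps(envelope).encode("utf-8")
    return json.dumps(obj).encode("utf-8")


print(to_connect_json({"userId": "37"}, schemas_enable=False))
print(to_connect_json({"userId": "37"}, schemas_enable=True))
```

Note that when schemas live in Schema Registry instead, the Registry-aware converters expect the magic-byte envelope described above rather than an inline schema, so the two mechanisms are distinct.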
This is the error message that comes back:
This occurs when sending this message:
with this JSON Schema attached to the topic via Schema Registry on Confluent Cloud:
Any advice is much appreciated! Cheers, Matt