@bajaj-varun your configuration for the internal key and value converters is incorrect. In particular, internal.value.converter uses the AvroConverter, but there is no corresponding internal.value.converter.schema.registry.url. Check your Connect worker logs for errors and warnings, because I'm surprised you got this far.
However, I strongly recommend you only use the JSON converter with schemas disabled for internal converters, configured as:
...
"internal.key.converter":"org.apache.kafka.connect.json.JsonConverter",
"internal.key.converter.schemas.enable":"false",
"internal.value.converter":"org.apache.kafka.connect.json.JsonConverter",
"internal.value.converter.schemas.enable":"false",
...
Better yet, if you're using Apache Kafka 2.0 or later (or Confluent Platform 5.0 or later), these settings have sensible defaults and you can remove them from your worker configuration altogether.
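For illustration, a minimal distributed worker configuration on Kafka 2.0+ could then omit the internal converter settings entirely. A sketch, where the bootstrap address, group id, and topic names are placeholder values:
...
"bootstrap.servers":"localhost:9092",
"group.id":"connect-cluster",
"config.storage.topic":"connect-configs",
"offset.storage.topic":"connect-offsets",
"status.storage.topic":"connect-status",
...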
The internal converters are used by the Connect distributed worker to store its internal configs, statuses, and offsets in Kafka topics. Your own data from the connectors is always read and written via the regular (non-internal) key.converter and value.converter settings.
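The schema registry URL therefore belongs on these regular converters, not on the internal ones. For example, if both your keys and values were Avro, a sketch of the regular converter settings could look like this (the registry address is a placeholder):
...
"key.converter":"io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url":"http://localhost:8081",
"value.converter":"io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url":"http://localhost:8081",
...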
Note that you cannot change the internal converters once you've started your Connect worker cluster. Just be aware that you may very well run into trouble in the future; hopefully this is a development cluster that you can recreate.
@rhauch Many thanks for the answer you contributed here.
@bajaj-varun let me add the following: irrespective of what @rhauch suggests, I also see that your key seems to be a plain string. This would not work, since the connector always expects the key data to be parseable as valid JSON (see the example below). Very similar questions have come up in the past; read about potential solutions in #36 or #64.
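To make this concrete, here is a hypothetical illustration (the field name id and its value are made up, not taken from your data): a key produced as the plain string id-123 is not a valid JSON document and will fail to parse, whereas the same identifier wrapped in a JSON document would be, e.g.:
{"id":"id-123"}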
Please let me know if this helps to fix your problem.
Hi @rhauch, @hpgrahsl,
Good morning, hope you are doing great, and thanks for your response. Out of curiosity, is there any way to skip reading the key? This is part of a simple POC where we just need to consume the messages and dump them downstream without any modification to the nested data.
Hi @hpgrahsl, your referenced solutions helped. We also made the key null to skip the keys, and that worked, since for the POC we only need the values. Thanks again for your help; I'm marking the issue as closed.
@bajaj-varun glad to hear you got it working for your use case! If you do more than the POC, let me know; I'm always happy to learn about yet another real-world story :) Ah, and also thanks for closing this.
Data from source system:
abc@abc.com# kafka-avro-console-consumer --bootstrap-server localhost:9092 --topic TTDF.TCDCPOC_DATA_TYPES --from-beginning --property print.key=true --property print.value=true --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
^CProcessed a total of 6 messages
Schema registry:
Errors:
Config file: