Closed: lilgreenwein closed this issue 7 years ago
@lilgreenwein Do you have any idea of what content is being posted? It would be great to set up a repro.
Content is syslog messages in JSON format. An example (obtained via kafka-console-consumer):
{
  "facility": "syslog",
  "host": "XXXXXX",
  "message": "Thu Jan 19 18:22:43 2017: main Q: origin=core.queue size=0 enqueued=39 full=0 discarded.full=0 discarded.nf=0 maxqsize=39 ",
  "severity": "info",
  "syslog-tag": "sawmill.stats",
  "timestamp": "2017-01-19T18:22:43.691475+00:00"
}
I'm not using Schema Registry since the source data is in JSON and I want it forwarded to Splunk in JSON format; I have no need to convert to Avro. From my connect-distributed.properties file:
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
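For context, with schemas.enable=false the JsonConverter passes plain JSON through and deserializes each value as a schemaless map, whereas with schemas.enable=true it expects every message to be wrapped in a schema/payload envelope. A minimal sketch of the difference, using field names from the example message above (the field list shown is illustrative, not the full record):

```python
import json

# Plain JSON, as the syslog producer writes it (schemas.enable=false):
schemaless = {"facility": "syslog", "severity": "info"}

# The envelope JsonConverter expects when schemas.enable=true:
enveloped = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "facility", "type": "string"},
            {"field": "severity", "type": "string"},
        ],
    },
    "payload": {"facility": "syslog", "severity": "info"},
}

# The record data itself is identical; only the wrapping differs.
assert enveloped["payload"] == schemaless
print(json.dumps(schemaless))
```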
And the config for this connector:
{
  "connector.class": "io.confluent.kafka.connect.splunk.SplunkHttpSinkConnector",
  "name": "sawmill-stats-splunk-sink",
  "splunk.auth.token": "XXXXXX",
  "splunk.remote.host": "localhost",
  "splunk.remote.port": "9999",
  "splunk.ssl.enabled": "true",
  "splunk.ssl.validate.certs": "false",
  "tasks.max": "5",
  "topics": "sawmill_stats"
}
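As an aside for anyone reproducing this: the snippet above is the flat per-connector config that Connect stores and returns. When submitting it to the Connect REST API via POST /connectors, the name goes at the top level and everything else under a "config" key. A sketch of building that request body (token and host values are placeholders copied from the issue):

```python
import json

config = {
    "connector.class": "io.confluent.kafka.connect.splunk.SplunkHttpSinkConnector",
    "splunk.auth.token": "XXXXXX",
    "splunk.remote.host": "localhost",
    "splunk.remote.port": "9999",
    "splunk.ssl.enabled": "true",
    "splunk.ssl.validate.certs": "false",
    "tasks.max": "5",
    "topics": "sawmill_stats",
}

# POST /connectors expects {"name": ..., "config": {...}}.
body = json.dumps({"name": "sawmill-stats-splunk-sink", "config": config})
print(body)
```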
I'm not sure, but I believe the issue may be related to the key/value converters. When I create a file sink with the same config, it works fine, but the output is not JSON:
{severity=info, host=XXXXXX, message=Thu Jan 12 21:53:06 2017: main Q: origin=core.queue size=0 enqueued=28 full=0 discarded.full=0 discarded.nf=0 maxqsize=39 , facility=syslog, syslog-tag=sawmill.stats, timestamp=2017-01-12T21:53:06.323303+00:00}
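That output is consistent with the schemaless path: with schemas.enable=false the value arrives in the sink as a java.util.Map, and the file sink simply calls toString() on it, which prints {key=value, ...} rather than JSON. A rough Python illustration of the two renderings (format_like_java_map is a hypothetical helper that just mimics Map.toString()):

```python
import json

record = {"severity": "info", "host": "XXXXXX", "facility": "syslog"}

def format_like_java_map(d):
    # Hypothetical helper mimicking java.util.Map.toString() output.
    return "{" + ", ".join(f"{k}={v}" for k, v in d.items()) + "}"

print(format_like_java_map(record))  # {severity=info, host=XXXXXX, facility=syslog}
print(json.dumps(record))            # proper JSON with quoted keys and values
```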
Thanks for the repro! I'll take a look.
@lilgreenwein Got a repro that works. I'll put in a fix tonight. Basically I'm expecting to always get a Struct. When something else shows up it gets angry.
Pull-4 has a repro.
@lilgreenwein Can you take a look at Pull 4? I removed the dependence on a value schema and added test cases. Maps, Structs, Numbers, Strings, and Booleans can all be passed in the connect record. This should cover your case pretty well.
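The shape of that fix can be sketched as a dispatch on the record value's type instead of assuming a Struct. This is a hypothetical Python analogue for illustration, not the connector's actual Java code:

```python
import json

def value_to_json(value):
    """Serialize a sink-record value, accepting maps, lists, strings,
    numbers, and booleans rather than requiring a Struct."""
    if value is None or isinstance(value, (dict, list, str, int, float, bool)):
        return json.dumps(value)
    # Anything else (an unexpected object type) is still a hard error.
    raise TypeError(f"unsupported value type: {type(value).__name__}")

print(value_to_json({"facility": "syslog"}))  # {"facility": "syslog"}
print(value_to_json(True))                    # true
```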
Checked out the issue-3 branch and deploying now...
That worked, messages are flowing now. Thanks Justin!
@lilgreenwein Wonderful! I'll merge that pull into master.
I'm trying to do a simple setup of this connector: basically just sending a topic to a local Splunk heavy forwarder running HEC on port 9999. When I fire up Connect (or restart this connector), I get the following exception: