jcustenborder / kafka-connect-solr

Kafka Connect connector for writing to Solr.

ERROR Failed to create job for config/httpsolr.properties #3

Closed. DivyaYash closed this issue 7 years ago.

DivyaYash commented 7 years ago

@jcustenborder: Please help me. Here is my httpsolr.properties:

name=httpsolr
topics=syslogkafka
tasks.max=2
connector.class=com.github.jcustenborder.kafka.connect.solr.HttpSolrSinkConnector
solr0.url=http://m:8983/solr/
solr0.topic=syslogkafka
solr0.core.name=divtest01

PS: divtest01 is a collection created on one of the Solr servers within the CDH cluster.

When I try to start the connector, I get the error below.

/usr/bin/connect-standalone /etc/schema-registry/connect-avro-standalone.properties config/httpsolr.properties

[2017-04-07 10:03:24,321] ERROR Failed to create job for config/httpsolr.properties (org.apache.kafka.connect.cli.ConnectStandalone:88)
[2017-04-07 10:03:24,334] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:99)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid (use the endpoint /{connectorType}/config/validate to get a full list of errors)
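
(For reference, the full list of validation errors that this message points to can be pulled from the Connect worker's REST API. A minimal sketch, assuming the worker's REST endpoint is at the default localhost:8083 and that the JSON body simply mirrors the httpsolr.properties above:)

# Ask Kafka Connect to validate the connector config and return every error
# (assumes the default REST port 8083; adjust host/port for your worker)
curl -s -X PUT -H "Content-Type: application/json" \
  http://localhost:8083/connector-plugins/HttpSolrSinkConnector/config/validate \
  -d '{
        "connector.class": "com.github.jcustenborder.kafka.connect.solr.HttpSolrSinkConnector",
        "topics": "syslogkafka",
        "tasks.max": "2",
        "solr0.url": "http://m:8983/solr/",
        "solr0.topic": "syslogkafka",
        "solr0.core.name": "divtest01"
      }'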

jcustenborder commented 7 years ago

Can you try the latest version? This should be resolved there.

cguzel commented 6 years ago

Hi, I have a similar problem.

bin/connect-standalone.sh config/connect-standalone.properties config/connect-file-source.properties config/httpsolr.properties

httpsolr.properties:

name=httpsolr
topics=connect-test
tasks.max=2
connector.class=com.github.jcustenborder.kafka.connect.solr.HttpSolrSinkConnector
solr.url=http://127.0.0.1:8983/solr
solr.topic=testcore
solr.core.name=testcore

connect-file-source.properties:

name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test

Error logs:

[2017-10-20 00:19:12,233] INFO Kafka version : 0.11.0.1 (org.apache.kafka.common.utils.AppInfoParser:83)
[2017-10-20 00:19:12,233] INFO Kafka commitId : c2a0d5f9b1f45bf5 (org.apache.kafka.common.utils.AppInfoParser:84)
[2017-10-20 00:19:12,236] INFO Source task WorkerSourceTask{id=local-file-source-0} finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:143)
[2017-10-20 00:19:12,236] INFO Created connector local-file-source (org.apache.kafka.connect.cli.ConnectStandalone:91)
[2017-10-20 00:19:12,237] ERROR Failed to create job for config/httpsolr.properties (org.apache.kafka.connect.cli.ConnectStandalone:89)
[2017-10-20 00:19:12,237] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:100)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.solr.HttpSolrSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:97)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.solr.HttpSolrSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='0.11.0.1', encodedVersion=0.11.0.1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:161)
    at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:341)
    at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:240)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:157)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:94)
[2017-10-20 00:19:12,239] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2017-10-20 00:19:12,239] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:154)
[2017-10-20 00:19:12,242] INFO Stopped ServerConnector@5126c400{HTTP/1.1}{0.0.0.0:8083} (org.eclipse.jetty.server.ServerConnector:306)
[2017-10-20 00:19:12,247] INFO Stopped o.e.j.s.ServletContextHandler@283fe454{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:865)
[2017-10-20 00:19:12,248] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:165)
[2017-10-20 00:19:12,248] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:76)
[2017-10-20 00:19:12,248] INFO Stopping task local-file-source-0 (org.apache.kafka.connect.runtime.Worker:447)
[2017-10-20 00:19:12,268] ERROR Error while trying to seek to previous offset in file:  (org.apache.kafka.connect.file.FileStreamSourceTask:93)
java.io.IOException: Stream Closed
    at java.io.FileInputStream.skip(Native Method)
    at org.apache.kafka.connect.file.FileStreamSourceTask.poll(FileStreamSourceTask.java:90)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:163)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2017-10-20 00:19:12,268] ERROR Task local-file-source-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:148)
org.apache.kafka.connect.errors.ConnectException: java.io.IOException: Stream Closed
    at org.apache.kafka.connect.file.FileStreamSourceTask.poll(FileStreamSourceTask.java:94)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:163)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream Closed
    at java.io.FileInputStream.skip(Native Method)
    at org.apache.kafka.connect.file.FileStreamSourceTask.poll(FileStreamSourceTask.java:90)
    ... 8 more
[2017-10-20 00:19:12,269] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:149)
[2017-10-20 00:19:12,269] INFO Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1017)
[2017-10-20 00:19:12,272] INFO Stopping connector local-file-source (org.apache.kafka.connect.runtime.Worker:301)
[2017-10-20 00:19:12,272] INFO Stopped connector local-file-source (org.apache.kafka.connect.runtime.Worker:317)
[2017-10-20 00:19:12,272] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:156)
[2017-10-20 00:19:12,272] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:67)
[2017-10-20 00:19:12,272] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:176)
[2017-10-20 00:19:12,272] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:86)
[2017-10-20 00:19:12,272] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)
cguzel commented 6 years ago

My problem is fixed. I added plugin.path= to connect-standalone.properties.
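
(For anyone hitting the same "Failed to find any class that implements Connector" error, a minimal sketch of the relevant worker settings, assuming the kafka-connect-solr jars were unpacked under /opt/connectors, which is an illustrative path:)

# connect-standalone.properties (excerpt) -- sketch only
# plugin.path must list the directory that contains the kafka-connect-solr jar(s)
bootstrap.servers=localhost:9092
plugin.path=/opt/connectors

(The worker has to be restarted after changing plugin.path so that it rescans the plugin directories.)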

But now I have a new problem, as follows:

[2017-10-20 11:33:08,515] INFO (Re-)joining group connect-httpsolr (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:442)
[2017-10-20 11:33:08,523] INFO (Re-)joining group connect-httpsolr (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:442)
[2017-10-20 11:33:08,528] INFO Successfully joined group connect-httpsolr with generation 5 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:409)
[2017-10-20 11:33:08,528] INFO Successfully joined group connect-httpsolr with generation 5 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:409)
[2017-10-20 11:33:08,529] INFO Setting newly assigned partitions [] for group connect-httpsolr (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:262)
[2017-10-20 11:33:08,530] INFO Setting newly assigned partitions [connecttest-0] for group connect-httpsolr (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:262)
[2017-10-20 11:33:08,554] ERROR Task httpsolr-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerSinkTask:482)
java.lang.UnsupportedOperationException: connecttest:0:0 has an unsupported type for a value. Only Struct or Map are supported.
    at com.github.jcustenborder.kafka.connect.solr.SolrInputDocumentBuilder.build(SolrInputDocumentBuilder.java:78)
    at com.github.jcustenborder.kafka.connect.solr.HttpSolrInputDocumentBuilder.build(HttpSolrInputDocumentBuilder.java:36)
    at com.github.jcustenborder.kafka.connect.solr.SolrSinkTask.put(SolrSinkTask.java:60)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:464)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:265)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:182)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:150)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2017-10-20 11:33:08,555] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerSinkTask:483)
[2017-10-20 11:33:08,556] ERROR Task httpsolr-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:148)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:484)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:265)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:182)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:150)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2017-10-20 11:33:08,556] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:149)
[2017-10-20 11:33:11,533] INFO Revoking previously assigned partitions [] for group connect-httpsolr (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:419)
[2017-10-20 11:33:11,533] INFO (Re-)joining group connect-httpsolr (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:442)
[2017-10-20 11:33:11,536] INFO Successfully joined group connect-httpsolr with generation 6 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:409)
[2017-10-20 11:33:11,537] INFO Setting newly assigned partitions [connecttest-0] for group connect-httpsolr (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:262)
[2017-10-20 11:33:11,542] ERROR Task httpsolr-1 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerSinkTask:482)
java.lang.UnsupportedOperationException: connecttest:0:0 has an unsupported type for a value. Only Struct or Map are supported.
    at com.github.jcustenborder.kafka.connect.solr.SolrInputDocumentBuilder.build(SolrInputDocumentBuilder.java:78)
    at com.github.jcustenborder.kafka.connect.solr.HttpSolrInputDocumentBuilder.build(HttpSolrInputDocumentBuilder.java:36)
    at com.github.jcustenborder.kafka.connect.solr.SolrSinkTask.put(SolrSinkTask.java:60)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:464)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:265)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:182)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:150)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2017-10-20 11:33:11,543] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerSinkTask:483)
[2017-10-20 11:33:11,543] ERROR Task httpsolr-1 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:148)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:484)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:265)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:182)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:150)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[2017-10-20 11:33:11,543] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:149)
[2017-10-20 11:33:18,216] INFO Finished WorkerSourceTask{id=local-file-source-0} commitOffsets successfully in 9 ms (org.apache.kafka.connect.runtime.WorkerSourceTask:373)
jcustenborder commented 6 years ago

@cguzel What is the data type of your value? Basically, this connector requires the value to be a Map or a Struct, meaning you need to be using something like the AvroConverter or the JsonConverter. Which one are you using?
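
(For illustration, a minimal sketch of the standard Connect converter settings that yield Map or Struct values, assuming the topic carries JSON; these are generic worker properties, not settings specific to this connector:)

# connect-standalone.properties (excerpt) -- sketch, assuming JSON messages on the topic
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# with schemas.enable=false, plain JSON objects arrive as schemaless Maps;
# with schemas.enable=true, each message must be wrapped as {"schema": ..., "payload": ...}
key.converter.schemas.enable=false
value.converter.schemas.enable=false

(Worth noting that the FileStreamSource used above emits each file line as a plain string value, so records from that example would still fail the Struct/Map check regardless of converter unless they are reshaped into structured data first, for example with the HoistField single message transform.)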

shruti090497 commented 1 year ago

Hi, I too am facing the same issue, but I am trying the Camel connector. Was yours resolved?