sghaskell / kafka-splunk-consumer

PyKafka consumer to push events to Splunk HTTP Event Collector
MIT License

Question: Digest all topics? #5

Closed jryberg closed 6 years ago

jryberg commented 6 years ago

Hi,

Is there any option to digest all topics? I have tried to use both and ".", but that makes the system crash:

Process worker-0:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python2.7/site-packages/kafka_splunk_consumer-0.6b0-py2.7.egg/EGG-INFO/scripts/kafka_splunk_consumer", line 79, in worker
  File "build/bdist.linux-x86_64/egg/kafka/client.py", line 117, in __init__
    self.client = KafkaClient(hosts=','.join(self.brokers), ssl_config=self.kafka_ssl_config)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/client.py", line 142, in __init__
    broker_version=broker_version)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 215, in __init__
    self.update()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 501, in update
    metadata = self._get_metadata()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 305, in _get_metadata
    'Unable to connect to a broker to fetch metadata. See logs.')
NoBrokersAvailableError: Unable to connect to a broker to fetch metadata. See logs.
Process worker-1:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python2.7/site-packages/kafka_splunk_consumer-0.6b0-py2.7.egg/EGG-INFO/scripts/kafka_splunk_consumer", line 79, in worker
  File "build/bdist.linux-x86_64/egg/kafka/client.py", line 117, in __init__
    self.client = KafkaClient(hosts=','.join(self.brokers), ssl_config=self.kafka_ssl_config)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/client.py", line 142, in __init__
    broker_version=broker_version)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 215, in __init__
    self.update()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 501, in update
    metadata = self._get_metadata()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 305, in _get_metadata
    'Unable to connect to a broker to fetch metadata. See logs.')
NoBrokersAvailableError: Unable to connect to a broker to fetch metadata. See logs.
Process worker-2:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python2.7/site-packages/kafka_splunk_consumer-0.6b0-py2.7.egg/EGG-INFO/scripts/kafka_splunk_consumer", line 79, in worker
  File "build/bdist.linux-x86_64/egg/kafka/client.py", line 117, in __init__
    self.client = KafkaClient(hosts=','.join(self.brokers), ssl_config=self.kafka_ssl_config)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/client.py", line 142, in __init__
    broker_version=broker_version)
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 215, in __init__
    self.update()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 501, in update
    metadata = self._get_metadata()
  File "/usr/local/lib/python2.7/site-packages/pykafka-2.8.0.dev1-py2.7-linux-x86_64.egg/pykafka/cluster.py", line 305, in _get_metadata
    'Unable to connect to a broker to fetch metadata. See logs.')
NoBrokersAvailableError: Unable to connect to a broker to fetch metadata. See logs.

It "works" if I just put in any topic name such as "test". It does not find any data, since we don't have that topic, but at least the container keeps running.

Best regards Johan

sghaskell commented 6 years ago

Hi @jryberg

Currently, it's not possible to consume all topics via the config; only a single topic per instance is supported. You need to explicitly configure a topic that exists and is actively being published to if you want to see data in Splunk.
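As an aside, a pattern like "." only matches single-character topic names under regex semantics, which may be part of the surprise here. If wildcard support were ever added, a minimal sketch of the topic-matching step might look like the following. Everything here is hypothetical (the `match_topics` helper and the sample topic list are illustrative, not part of this consumer); only the selection logic is shown, not the broker connection:

```python
import re

def match_topics(pattern, topic_names):
    """Return the topic names that fully match the given regex pattern.

    ".*" selects every topic; "." alone matches only one-character names.
    """
    return sorted(t for t in topic_names if re.fullmatch(pattern, t))

# Hypothetical topic list, standing in for the names the cluster reports:
topics = ["app-logs", "web-logs", "metrics"]

print(match_topics(r".*", topics))       # every topic
print(match_topics(r".*-logs", topics))  # only the log topics
print(match_topics(r".", topics))        # nothing: no one-character names
```

One could then run one consumer per matched name; with pykafka that would presumably mean iterating the keys of `client.topics` and creating a consumer for each, though how that interacts with this project's worker model is untested.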

jryberg commented 6 years ago

Thanks for your reply.