gwaramadze opened this issue 3 years ago
hi @gwaramadze ! Thanks for the compliment! I'm the author of the parallel consumer project :) Let me know how I can help, or if you have any ideas for the PC client :)
https://github.com/confluentinc/parallel-consumer
@gwaramadze if we think there's enough demand for this functionality, we may look at porting the algorithms to librdkafka, to be inherited by the wrappers...
ideally we would provide a higher level class along the lines of what @astubbs made out of the box, but for now we don't.
i don't see a problem with your approach, though i might be missing something... be aware that any callbacks (statistics, rebalance etc) would happen on an executor thread. ideally you would only use one consumer as it is more efficient, but the logic would also be a lot more difficult (and in practice likely won't matter for most scenarios).
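For illustration, the single-consumer, multiple-topic pattern described above might look like the sketch below. The config values, topic names, and handlers are assumptions for the example, and the `dispatch` helper is hypothetical, not part of the confluent-kafka API:

```python
import asyncio

try:
    from confluent_kafka import Consumer  # real client, if installed
except ImportError:  # keep the sketch importable without librdkafka
    Consumer = None

def dispatch(msg, handlers):
    """Return the handler coroutine for msg's topic, or None if unknown."""
    handler = handlers.get(msg.topic())
    return handler(msg) if handler else None

async def run(consumer, handlers):
    loop = asyncio.get_running_loop()
    while True:
        # poll() blocks, so run it on the default thread-pool executor;
        # note that any callbacks (stats, rebalance, ...) fire on that thread
        msg = await loop.run_in_executor(None, consumer.poll, 0.1)
        if msg is None or msg.error():
            continue
        coro = dispatch(msg, handlers)
        if coro is not None:
            await coro

if __name__ == "__main__":
    async def handle_topic1(msg):
        print("topic1:", msg.value())

    async def handle_topic2(msg):
        print("topic2:", msg.value())

    consumer = Consumer({"bootstrap.servers": "localhost:9092",
                         "group.id": "demo-group"})
    consumer.subscribe(["topic1", "topic2"])
    asyncio.run(run(consumer, {"topic1": handle_topic1,
                               "topic2": handle_topic2}))
```

As noted above, one consumer is more efficient, but per-topic dispatch logic like this is where the extra complexity comes in.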
@astubbs @mhowlett Thank you for your answers.
That worked for me using confluent-kafka == 1.5.0. Thanks @gwaramadze! I just had to change the last couple of lines to:
```python
try:
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
except (KeyboardInterrupt, SystemExit):
    log.info("Stream shutdown complete")
```
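As a side note, `asyncio.get_event_loop()` is deprecated outside a running loop since Python 3.10; the same shutdown handling can be written with `asyncio.run()`. A sketch, where `main()` is a trivial placeholder for the actual consumer loop:

```python
import asyncio
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

async def main() -> str:
    # placeholder for the consumer loop discussed above
    await asyncio.sleep(0)
    return "done"

def run() -> None:
    try:
        # asyncio.run() creates and closes the event loop itself,
        # replacing the get_event_loop()/run_until_complete() pair
        asyncio.run(main())
    except (KeyboardInterrupt, SystemExit):
        log.info("Stream shutdown complete")

if __name__ == "__main__":
    run()
```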
This code was very helpful in creating an async consumer. One thing that made it easier for me to unit test the consumer was removing `functools.partial`. It is only needed to pass keyword arguments, but `0.1` is a positional argument:
```python
loop = asyncio.get_running_loop()
try:
    log.info(f"Starting consumer: {topic}")
    while True:
        message = await loop.run_in_executor(None, consumer.poll, 0.1)
```
Hello. Is there any progress on, or plans for, this feature? The code the OP posted works, but it is inefficient and blocks/quantizes program execution.
Hi, did anyone try using this code in a real production environment, and maybe share some insights? Thanks.
Hi @ericnesschrw ,
```python
async def consume(self) -> Any:
    assert self.consumer_topics is not None
    received = await self._event_loop.run_in_executor(None, self.consumer.poll, 0.1)
    return received
```
Does not return anything, but
```python
async def consume(self) -> Any:
    assert self.consumer_topics is not None
    received = self.consumer.poll(0.1)
    return received
```
returns what is expected. I am profoundly confused as to why this might be the case. Debugging async apps has always been notoriously hard for me, but this one just leaves me in the dirt... If you have any suggestions, I'd appreciate them. I'm running Python 3.12.
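For what it's worth, `run_in_executor` does propagate the callable's return value, and `poll(0.1)` legitimately returns `None` whenever no message arrives within the timeout, so an occasional `None` is expected in both variants. One other possible cause of the symptom (a guess, not a diagnosis) is an `_event_loop` captured at construction time that is not the loop actually running `consume()`. A minimal sketch, with a plain blocking function standing in for `Consumer.poll`:

```python
import asyncio
import time

def blocking_poll(timeout: float) -> str:
    # stand-in for Consumer.poll(): blocks briefly, then returns a value
    time.sleep(timeout)
    return "message"

async def consume() -> str:
    # capture the loop that is actually running this coroutine, rather
    # than storing one at __init__ time; a stored, non-running loop can
    # make the awaited future appear to never produce a result
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_poll, 0.01)

print(asyncio.run(consume()))  # prints "message"
```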
Description
A bit of a context question: what is the optimal consumption pattern if I have more than one topic, and possibly multiple partitions per topic, to be handled by a single application instance? I feel like doing this is an anti-pattern:
Let's assume that the topic1 logic is a lightweight filter/repartition (discard most of the stream, rehash a new key, and publish to topic2) and the topic2 logic is IO-bound. That seems counterproductive; it's like maintaining a single queue for both a grocery and a pharmacy. Now, how do we optimize this?
I have read this mind-blowing blog post https://www.confluent.io/blog/introducing-confluent-parallel-message-processing-client/
I figure that key-level parallelism is the holy grail, not available in the Python world as of now. But first things first: a good enough step would be to shuffle each topic to a separate `Consumer` instance, hopefully with `asyncio` rather than `multiprocessing`. I have read through the related issues https://github.com/confluentinc/confluent-kafka-python/issues/185 and https://github.com/confluentinc/confluent-kafka-python/issues/100, the famous blog post https://www.confluent.io/blog/kafka-python-asyncio-integration/, and several other resources I cannot comprehend now in my gazillion open browser tabs 😆
I have come up with a snippet of code that I am kindly requesting a review of. It generally works in a local environment; I wonder what you think. Does this approach make sense, or is it a disaster awaiting the moment I put some serious load on it? Thanks in advance.
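(The snippet itself is not reproduced in this capture of the thread. As a stand-in, here is a sketch of the per-topic-`Consumer` asyncio pattern the description outlines; the handler names, per-topic `group.id` scheme, and the `running` stop hook are all illustrative assumptions, not the original code.)

```python
import asyncio

try:
    from confluent_kafka import Consumer  # real client, if installed
except ImportError:  # keep the sketch importable without librdkafka
    Consumer = None

async def consume_topic(consumer, handle, timeout=0.1, running=lambda: True):
    """Poll one Consumer and hand each message to its topic's handler.

    `running` is a hypothetical stop hook so the loop can be shut down
    (and unit-tested); by default it runs forever.
    """
    loop = asyncio.get_running_loop()
    while running():
        msg = await loop.run_in_executor(None, consumer.poll, timeout)
        if msg is None or msg.error():
            continue
        await handle(msg)
    consumer.close()

async def main(conf, handlers):
    # one Consumer per topic: the lightweight filter/repartition work and
    # the slow IO work each get their own "queue" instead of sharing one
    tasks = []
    for topic, handle in handlers.items():
        c = Consumer({**conf, "group.id": f"{topic}-workers"})
        c.subscribe([topic])
        tasks.append(consume_topic(c, handle))
    await asyncio.gather(*tasks)
```

One caveat: each `Consumer` gets its own poll cadence this way, so a slow handler on one topic no longer stalls the other, which is the point of the grocery/pharmacy split above.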
Checklist
Please provide the following information:
- confluent-kafka-python and librdkafka version (`confluent_kafka.version()` and `confluent_kafka.libversion()`): `('1.5.0', 17104896)` and `('1.5.0', 17105151)`, respectively
- Client logs (with `'debug': '..'` as necessary)