optimizely / python-sdk

Python SDK for Optimizely Feature Experimentation and Optimizely Full Stack (legacy)
https://docs.developers.optimizely.com/experimentation/v4.0.0-full-stack/docs/python-sdk
Apache License 2.0

Overflowing Event Queue in BatchEventProcessor Only Logs a Debug Message, Maybe #352

Closed. jkolenofferup closed this issue 3 years ago.

jkolenofferup commented 3 years ago

If you overflow the event_queue in BatchEventProcessor, the queue.Full exception is caught and produces a debug-level message in the logger.

        try:
            self.event_queue.put_nowait(user_event)
        except queue.Full:
            self.logger.debug(
                'Payload not accepted by the queue. Current size: {}'.format(str(self.event_queue.qsize()))
            )

This solution is problematic for at least two reasons:

  1. Data loss is not a debug-level issue. It should be logged at warning level, or possibly error.
  2. If the logger is not set, this code silently fails to enqueue the event. Unless the event source owns the queue, there is no way for it to know that the operation failed.
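For illustration, here is a minimal sketch of one way the put could surface the failure; the warning level, the boolean return value, and the _enqueue name are suggestions, not the SDK's current behavior:

    def _enqueue(self, user_event):
        # Sketch: keep the log, raise its level, and tell the caller whether the put succeeded.
        try:
            self.event_queue.put_nowait(user_event)
            return True
        except queue.Full:
            self.logger.warning(
                'Payload not accepted by the queue. Current size: {}'.format(
                    str(self.event_queue.qsize())
                )
            )
            return False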
Mat001 commented 3 years ago

Thank you for registering the issue @jkolenofferup . We are looking into it.

Mat001 commented 3 years ago

@jkolenofferup I implemented the warning log level. The unit test was tricky to get passing on Py 3.4 and PyPy: those interpreters required a large number of overflowing events to trigger the queue.Full exception, while the other Python versions pass with a much smaller number of overflowing events.
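For context, with nothing draining the queue the overflow itself is easy to trigger deterministically. A stand-in sketch (not the SDK's actual test, just a plain queue and a mock logger mirroring the put_nowait/except pattern quoted above):

    import queue
    from unittest import mock

    def test_overflow_logs_warning():
        logger = mock.Mock()
        q = queue.Queue(maxsize=10)

        # Mirror the SDK's put_nowait / except queue.Full pattern, with the proposed warning level.
        for i in range(11):
            try:
                q.put_nowait(i)
            except queue.Full:
                logger.warning(
                    'Payload not accepted by the queue. Current size: {}'.format(q.qsize())
                )

        assert logger.warning.call_count == 1

    test_overflow_logs_warning()

The trickiness presumably only starts once the processor's own consumer thread is draining the queue while the test is still feeding it events.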

jkolenofferup commented 3 years ago

You could pass a small queue with maxsize set to 10:

import queue

event_queue = queue.Queue(maxsize=10)
bep = event_processor.BatchEventProcessor(
            event_dispatcher,
            event_queue=event_queue)

You'd only need to send 11+ events to trigger the error. My guess is that the default queue size is different for Py 3.4 and PyPy.
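For what it's worth, the stdlib side of this is easy to pin down in isolation: a queue.Queue() built with no maxsize is unbounded and put_nowait never raises queue.Full on it, whereas a bounded queue raises on the first put past its limit when nothing is consuming. A quick stdlib-only check:

    import queue

    unbounded = queue.Queue()            # maxsize defaults to 0, i.e. infinite
    for i in range(1000):
        unbounded.put_nowait(i)          # never raises queue.Full

    bounded = queue.Queue(maxsize=10)
    for i in range(10):
        bounded.put_nowait(i)

    try:
        bounded.put_nowait('one too many')   # the 11th put
    except queue.Full:
        print('queue.Full raised with nothing draining the queue')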

Mat001 commented 3 years ago

Yes, I tried that. All Python versions except 3.4 and PyPy/PyPy3 would pass fine. The problematic three would fail consistently; they weren't even flaky. As I increased the number of events passed in, from one over the queue limit to 20% more, to 100% more, to 1000% more, the tests for Py 3.4 and PyPy became more stable. Very strange. Feeding in 1000 events with a max queue size of 10 seems to make the test very stable. I really don't know why that is; perhaps it's something specific about the Py 3.4 and PyPy queueing/threading mechanisms.
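One possible explanation (an assumption, not verified against the SDK internals): the BatchEventProcessor's consumer thread is draining the queue while the test is still putting events, so with only one or two events over the limit the queue may never actually fill if the consumer keeps up; flooding it with far more events than its capacity makes the overflow all but certain regardless of how the threads are scheduled, and scheduling is exactly the sort of thing that differs between CPython 3.4, PyPy, and the other interpreters. A standalone sketch of that race using only the stdlib, not the SDK:

    import queue
    import threading

    def overflow_count(n_events, maxsize=10):
        """Put n_events into a bounded queue while a daemon consumer drains it; count queue.Full hits."""
        q = queue.Queue(maxsize=maxsize)
        full_hits = 0

        def consume():
            while True:
                q.get()

        threading.Thread(target=consume, daemon=True).start()

        for i in range(n_events):
            try:
                q.put_nowait(i)
            except queue.Full:
                full_hits += 1
        return full_hits

    # With one event over the limit the outcome depends on thread scheduling
    # (a fast consumer can keep the queue from ever filling); with far more
    # events than the capacity an overflow is effectively guaranteed.
    print(overflow_count(11), overflow_count(1000))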