Closed: jmeixensperger closed this issue 3 years ago
We should look at the fact that the event object is held entirely in memory. If someone generates a large perdayvolume on an interval, we will have issues. We should probably change perdayvolume to use an in-memory buffered I/O stream.
Which 'event object' are you referring to? The eventsDict that we compile? The send_objects that we see after replacing tokens? I'm not super experienced with this kind of optimization, so I'm trying to understand fully.
Specifically the dictionary that holds all the events, in def send_events(self, send_objects, startTime): the send_objects is currently built in memory as a plain dictionary. Long term we should think about moving this to an in-memory file player. That would let us cache and move forward/backward through large item sets without creating a "mem leak". Or we can just tell our end users you need more RAM ;P
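A rough sketch of what that could look like, assuming we swap the dict for a spooled temporary file that spills to disk past a size threshold. The function names (generate_event, stream_events, send_events) and the event shape are illustrative placeholders, not Eventgen's actual API:

```python
import json
import tempfile


def generate_event(i):
    # Stand-in for a token-replaced sample event.
    return {"index": i, "message": f"event {i}"}


def stream_events(count, spool_limit=1 << 20):
    # Events spill to a temp file once the buffer exceeds spool_limit
    # bytes, so peak memory stays bounded regardless of perdayvolume.
    buf = tempfile.SpooledTemporaryFile(max_size=spool_limit, mode="w+t")
    for i in range(count):
        buf.write(json.dumps(generate_event(i)) + "\n")
    buf.seek(0)  # rewind so the sender can replay from the start
    return buf


def send_events(buf, batch_size=100):
    # Read and dispatch events in fixed-size batches instead of
    # materializing one giant send_objects dict.
    sent = 0
    batch = []
    for line in buf:
        batch.append(json.loads(line))
        if len(batch) == batch_size:
            sent += len(batch)  # real code would flush the batch here
            batch.clear()
    sent += len(batch)
    return sent
```

Because the spooled file supports seek(), this also gives us the forward/backward replay behavior of a file player for free, at the cost of serializing each event once.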