```python
import dateparser
from datetime import datetime

# ~1.5GB of leaked memory after the function finishes
def hard_leak():
    for i in range(3000):
        # every call == ~0.55MB of leaked memory
        dateparser.parse('dasdasd', settings={'RELATIVE_BASE': datetime.utcnow()})

# ~27MB of leaked memory after the function finishes
def light_leak():
    for i in range(3000):
        # every call == ~0.01MB of leaked memory
        dateparser.parse('12.01.2021', settings={'RELATIVE_BASE': datetime.utcnow()})
```
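For reference, here is a minimal sketch of how the per-call growth can be reproduced with the standard-library `tracemalloc` module. The `measure` helper is hypothetical (not part of this report), and `tracemalloc` only tracks Python-level allocations, so the absolute figures may differ from RSS-based measurements even though the growth pattern is the same.

```python
# Hedged sketch: confirm that memory held by Python allocations keeps
# growing across repeated dateparser.parse calls.
import tracemalloc
from datetime import datetime

import dateparser

def measure(text, calls=3000):
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    for _ in range(calls):
        dateparser.parse(text, settings={'RELATIVE_BASE': datetime.utcnow()})
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # Report how much allocated memory is still held after the calls return.
    print(f'{text!r}: {(after - before) / 1024 / 1024:.2f} MB still allocated')

measure('dasdasd')      # unparsable input, the heavier leak
measure('12.01.2021')   # parsable input, the lighter leak
```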
PROBLEM: memory leak.

After each call to `dateparser.parse`, a new item is added to the cache dictionaries. After 3000 calls we find 3000 items in each of the dictionaries, and that memory is never released. We are forced to stop using this module.
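For illustration only, the general pattern behind this kind of growth is a module-level dictionary used for memoization with no eviction. The names below (`_cache`, `build_settings`) are made up and do not refer to dateparser's actual internals; presumably, because `RELATIVE_BASE` is set to `datetime.utcnow()` in the examples above, the settings differ on every call, so every call produces a new cache key.

```python
# Hypothetical illustration of an unbounded module-level cache;
# these names are NOT dateparser internals.
_cache = {}

def get_settings_object(key):
    # Every previously unseen key adds an entry that is never evicted,
    # so memory use grows with the number of distinct keys.
    if key not in _cache:
        _cache[key] = build_settings(key)  # stand-in for the real cached work
    return _cache[key]

def build_settings(key):
    # Placeholder for whatever costly object construction is being cached.
    return object()
```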
SOLUTION: add a limit (`CACHE_SIZE_LIMIT`) for the maximum number of items in the caches.
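A hedged sketch of what such a limit could look like; `CACHE_SIZE_LIMIT` and `BoundedCache` are illustrative names under the assumption of simple FIFO eviction, not an actual dateparser API.

```python
# Hypothetical size-limited cache; not dateparser's real implementation.
from collections import OrderedDict

CACHE_SIZE_LIMIT = 1000  # illustrative value

class BoundedCache(OrderedDict):
    """Dict that evicts its oldest entry once CACHE_SIZE_LIMIT is exceeded."""

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        if len(self) > CACHE_SIZE_LIMIT:
            # Drop the oldest inserted item (FIFO eviction) to cap memory use.
            self.popitem(last=False)

# With the module-level caches bounded like this, 3000 distinct calls would
# hold at most CACHE_SIZE_LIMIT entries instead of 3000.
cache = BoundedCache()
for i in range(3000):
    cache[i] = i
assert len(cache) == CACHE_SIZE_LIMIT
```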