ponyorm / pony

Pony Object Relational Mapper
Apache License 2.0

[Q] How to disable caching? #193

Closed ghost closed 8 years ago

ghost commented 8 years ago

Hi guys,

I would like to disable caching, since it seems that PonyORM does not release memory. This is an issue for me, because I want to execute this function around 10-15 times per second (which is feasible).

Any tips or tricks for this?

Filename: /etc/loadtest/src/FileGenerator.py

Line #    Mem usage    Increment   Line Contents
================================================
    58  29.9609 MiB   0.0000 MiB       @Db.db_session
    59                                 @profile(precision=4)
    60                                 def generate_content(self):
    61  29.9609 MiB   0.0000 MiB           exporttime = datetime.utcnow().strftime(self.dt_format)
    62
    63  29.9609 MiB   0.0000 MiB           scheduledays = Db.select(sd for sd in Db.ScheduleDay).filter(lambda sd: sd.version != sd.exported_version).order_by(Db.ScheduleDay.day)
    64  29.9727 MiB   0.0117 MiB           print(len(scheduledays[:]))
    65  29.9727 MiB   0.0000 MiB           for scheduleday in scheduledays:
    66  29.9727 MiB   0.0000 MiB               scheduleday.exported_version = scheduleday.version
    67  29.9727 MiB   0.0000 MiB           del(scheduledays)
    68  29.9727 MiB   0.0000 MiB           pass
amalashkevich commented 8 years ago

Hi Mikki!

Try using the strict mode. Does it help?
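
Something like this (just a rough sketch; the entity definition and the in-memory SQLite binding are placeholders for illustration, adapt them to your Db module). With strict=True, Pony clears the db_session cache when the session ends, so the loaded objects can be garbage-collected:

    from datetime import datetime
    from pony.orm import Database, Required, db_session, select

    db = Database()

    class ScheduleDay(db.Entity):  # stand-in for the Db.ScheduleDay entity in your snippet
        day = Required(datetime)
        version = Required(int)
        exported_version = Required(int)

    db.bind(provider='sqlite', filename=':memory:')
    db.generate_mapping(create_tables=True)

    @db_session(strict=True)  # strict mode: clear the session cache on exit
    def generate_content():
        for sd in select(s for s in ScheduleDay if s.version != s.exported_version):
            sd.exported_version = sd.version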

ghost commented 8 years ago

That helps a lot! I sometimes still see that memory is kept 'reserved', but for now it already saves quite a bit. +1 Any other hints? :)

ghost commented 8 years ago

An example of where I still see magic memory :) The logic is that it first creates an event (create_new_event) and then exports it as XML (generate_content). In the generate_content function, you can see that the same amount of memory is still allocated to the process, even though the memory allocated earlier (line 101) was only in scope of that function (and therefore, because of the strict db_session on line 94, only in scope of that transaction).

Filename: loadtest_py2.py

Line #    Mem usage    Increment   Line Contents
================================================
    94  30.3711 MiB   0.0000 MiB       @Db.db_session(strict=True)
    95                                 def create_new_event(self, channelid):
    96                                     """ A part of the magic. Create an event.
    97                                     """
    98
    99  30.3711 MiB   0.0000 MiB           channel = Db.Channel[channelid]
   100
   101  30.3750 MiB   0.0039 MiB           latest_event = Db.max(se for se in Db.max(sd for sd in channel.scheduledays).events)
   102
<...>
   112  30.3750 MiB   0.0000 MiB           scheduleday = Db.ScheduleDay.get(channel=channel, day=scheduleday_datetime)  # @UndefinedVariable
<...>
   125
   126  30.3750 MiB   0.0000 MiB           del(scheduleday)
   127  30.3750 MiB   0.0000 MiB           del(channel)

Filename: /etc/loadtest/src/FileGenerator.py

Line #    Mem usage    Increment   Line Contents
================================================
    58  30.3750 MiB   0.0000 MiB       @Db.db_session(strict=True)
    59                                 @profile(precision=4)
    60                                 def generate_content(self):
    61  30.3750 MiB   0.0000 MiB           exporttime = datetime.utcnow().strftime(self.dt_format)
amalashkevich commented 8 years ago

Mikki,

Pony caches the AST of a query and the result of its translation to SQL, so a small increase in memory consumption is expected. The question is whether the memory usage grows on every iteration of the loop, or only on the first query.
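
One way to check is to run the same query repeatedly and watch whether traced memory keeps growing or jumps once and then stays flat. Just a sketch using the standard library's tracemalloc; ScheduleDay here stands for the entity from the sketch above (Db.ScheduleDay in your code):

    import tracemalloc
    from pony.orm import db_session, select

    @db_session(strict=True)
    def run_query():
        # same query every time, so Pony should translate it to SQL only once
        return select(sd for sd in ScheduleDay if sd.version != sd.exported_version)[:]

    tracemalloc.start()
    for i in range(100):
        run_query()
        current, peak = tracemalloc.get_traced_memory()
        # steady growth -> something retained per iteration; a single jump -> the query/SQL cache
        print(i, current)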

ghost commented 8 years ago

I now see that there is a massive increase in memory usage! (joke ;) ) It used to be 1 GB/hour, and it is now 3.3 MB/hour, measured over 1.5 hours.

So I still see a slight increase but it's not as aggressive as it was! I will close the ticket for now - thanks for your help!