I was able to quickly verify what you report. It seems very unfortunate to me that there is no way to free the memory from a stream context resource. Using a new context seemed cleaner code-wise, but it's obviously not acceptable. I'll move to reusing a single context.
Original comment by donovan....@gmail.com on 19 Oct 2009 at 4:29
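For context, here is a minimal sketch of the leaky pattern being described (hypothetical code, not the actual client source): a PHP stream context is a resource with no function to explicitly free it, so building a fresh context for every request accumulates memory for the life of the process.

```php
<?php
// Hypothetical illustration of the leak described above: a new stream
// context is created for every request. Per the report, there is no way
// to free a stream context resource, so memory grows with each iteration
// even though $context is overwritten.
for ($i = 0; $i < 100000; $i++) {
    $context = stream_context_create(array(
        'http' => array(
            'method'  => 'POST',
            'header'  => 'Content-Type: text/xml; charset=UTF-8',
            'content' => '<add><doc>...</doc></add>',
        ),
    ));
    // file_get_contents($solrUpdateUrl, false, $context);
    // memory_get_usage() keeps climbing across iterations.
}
```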
Moved to reusing a GET and a POST context instead of creating a new one for each request, in r21.
Original comment by donovan....@gmail.com on 9 Nov 2009 at 10:09
Further fix in r22 (the wrong stream context function was being used to set options).
Original comment by donovan....@gmail.com on 9 Nov 2009 at 10:52
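For clarity, a sketch of what the r21/r22 approach presumably looks like (the function layout and names here are assumptions, not the client's actual code): create one GET context and one POST context up front, then update per-request options on the reused context with stream_context_set_option() instead of calling stream_context_create() each time.

```php
<?php
// Hypothetical sketch of the reuse pattern: one GET and one POST context
// created once (r21), with per-request options set via
// stream_context_set_option() rather than a new context each time (r22).
$getContext  = stream_context_create();
$postContext = stream_context_create();

function doPost($url, $rawPost, $context, $contentType = 'text/xml; charset=UTF-8')
{
    // Overwrite the options on the shared context for this request.
    stream_context_set_option($context, 'http', 'method', 'POST');
    stream_context_set_option($context, 'http', 'header', 'Content-Type: ' . $contentType);
    stream_context_set_option($context, 'http', 'content', $rawPost);
    return file_get_contents($url, false, $context);
}

function doGet($url, $context)
{
    stream_context_set_option($context, 'http', 'method', 'GET');
    return file_get_contents($url, false, $context);
}
```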
I am having a rather large performance issue which I think is related to this. I am using the newest code; however, I think it is still leaking. I am trying to index between 6 and 10 million documents, and even with a PHP memory limit of 4 GB I get to maybe 1 million before it eats up the memory. I have tried doing this in chunks of 100,000, 10,000, and 1,000 documents, and it all just dies, and it seems to be around this function.
Thoughts? Better approaches?
Original comment by ave...@gmail.com on 30 Aug 2010 at 3:51
Are you using the SVN version of the code? It now reuses a context. If you are, and are still seeing memory climb, then I'd check whether you're holding onto documents somewhere. If that still doesn't work, you could try breaking the work into several processes.
Original comment by donovan....@gmail.com on 30 Aug 2010 at 4:05
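Along the lines of the advice above, a hypothetical sketch of chunked indexing that drops all references to the documents after each batch so PHP can reclaim the memory; $solr, addDocuments(), commit(), fetchSourceRows(), and buildSolrDocument() are placeholders for whatever client and data-access code is actually in use. If memory still climbs even with the reused-context code, splitting the run across several short-lived worker processes, as suggested above, caps memory per process.

```php
<?php
// Hypothetical batching sketch: index in fixed-size chunks and release the
// document objects between batches. All identifiers below are placeholders.
$batchSize = 1000;
$batch = array();

foreach (fetchSourceRows() as $row) {       // placeholder data source
    $batch[] = buildSolrDocument($row);     // placeholder mapping function
    if (count($batch) >= $batchSize) {
        $solr->addDocuments($batch);        // placeholder client call
        $solr->commit();                    // commit frequency is a tuning choice
        $batch = array();                   // drop references so memory can be reclaimed
    }
}

if (!empty($batch)) {
    $solr->addDocuments($batch);
    $solr->commit();
}
```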
Original issue reported on code.google.com by raspberr...@gmail.com on 14 Oct 2009 at 12:34