ironm73 / pyv8

Automatically exported from code.google.com/p/pyv8

Garbage Collector not working #229

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
*What steps will reproduce the problem?*
1. Run the attached file minimal.py on a system with PyV8.
2. See the output - in my case there's ~300 MB of additional memory still in use after every test run.

{{{
import gc
import unittest

from PyV8 import JSContext, JSEngine

# log_memory_usage() and TestPython are defined in the attached minimal.py.

class TestMemoryWithJSContext(unittest.TestCase):
    def test_python_memory_management(self):
        def inner():
            with JSContext() as ctx:
                log_memory_usage("before empty evals")
                for index1 in range(1000):
                    for index2 in range(10000):
                        ctx.eval("")
                log_memory_usage("after empty evals")
                JSEngine.collect()
        log_memory_usage("before JSContext memory tests")
        inner()
        JSEngine.collect()
        gc.collect()
        JSEngine.collect()
        gc.collect()
        log_memory_usage("after JSContext memory tests and gc")
        print "Py gc.garbage:", gc.garbage

class CardEngineTestSuite(unittest.TestSuite):
    def __init__(self):
        super(CardEngineTestSuite, self).__init__()
        self.addTests(unittest.TestLoader().loadTestsFromTestCase(TestPython))
        self.addTests(unittest.TestLoader().loadTestsFromTestCase(TestMemoryWithJSContext))
}}}
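The `log_memory_usage` helper lives in the attached minimal.py and isn't reproduced above; a minimal stand-in (an assumption on my part, reading resident pages from Linux's `/proc/self/statm`) might look like:

```python
import logging
import os

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

_PAGE_SIZE = os.sysconf("SC_PAGE_SIZE")  # bytes per memory page

def log_memory_usage(label):
    # /proc/self/statm: the second field is the resident set size
    # in pages (Linux-specific).
    with open("/proc/self/statm") as f:
        resident_pages = int(f.read().split()[1])
    resident_mb = resident_pages * _PAGE_SIZE / (1024.0 * 1024.0)
    logging.info("%s process %d now uses %.1f MB resident",
                 label, os.getpid(), resident_mb)
```

This matches the shape of the log lines below ("... process 110 now uses 14.1 MB resident") but is only a sketch; the attached helper may differ.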

*What is the expected output?*
I'd like to see a way to actually collect garbage in PyV8. Nothing has worked for me so far. As the output shows, of the ~320 MB of garbage generated (noteworthy: using only empty evals, `ctx.eval("")`), only about 20 MB gets collected, even after two consecutive calls to JSEngine.collect() and gc.collect(), both while ctx is still reachable and after it is no longer reachable. Note that the attached file also includes a test of the Python garbage collector, which works correctly when not interacting with PyV8.

*What do you see instead?*
>> python minimal.py 
...
.2014-03-28 21:41:34,198 before JSContext memory tests process 110 now uses 14.1 MB resident
2014-03-28 21:41:34,199 before empty evals process 110 now uses 14.4 MB resident
2014-03-28 21:41:55,513 after empty evals process 110 now uses 348.8 MB resident
2014-03-28 21:41:56,926 after JSContext memory tests and gc process 110 now uses 322.3 MB resident
Py gc.garbage: []
.
----------------------------------------------------------------------
Ran 2 tests in 26.838s

OK
...
.2014-03-28 21:42:01,103 before JSContext memory tests process 110 now uses 322.5 MB resident
2014-03-28 21:42:01,104 before empty evals process 110 now uses 322.5 MB resident
2014-03-28 21:42:25,714 after empty evals process 110 now uses 636.5 MB resident
2014-03-28 21:42:28,459 after JSContext memory tests and gc process 110 now uses 629.3 MB resident
Py gc.garbage: []
.
----------------------------------------------------------------------
Ran 2 tests in 31.532s

OK

*What version of the product are you using? On what operating system?*
PyV8 revision 557
built using setup.py and v8 revision 19632
Ubuntu 12.04, running inside a Docker container.

Original issue reported on code.google.com by mmueller...@gmail.com on 28 Mar 2014 at 9:54

Attachments:

GoogleCodeExporter commented 9 years ago
Update: I remember seeing warnings about dtrace missing when I was building v8. Could this make the difference? I've found this commit by a PyV8 binary maintainer that seems to link the two issues: https://github.com/taguchimail/pyv8-linux-x64/commit/aaae4b4c2cac88bc78a87e711c818bed7a6bccd6

Original comment by mmueller...@gmail.com on 29 Mar 2014 at 7:27

GoogleCodeExporter commented 9 years ago
I've tested it with this binary distribution now - same effect. My workaround for now will be to monitor the PyV8 process and restart it at the OS level when its memory consumption grows too high.

Original comment by mmueller...@gmail.com on 29 Mar 2014 at 9:32
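The monitor-and-restart workaround can be sketched roughly as follows (the 512 MB threshold is a made-up example, and using peak RSS from getrusage is my assumption; on Linux ru_maxrss is reported in kilobytes):

```python
import resource
import sys

MAX_RSS_MB = 512  # hypothetical threshold; tune for your workload

def peak_rss_mb():
    # ru_maxrss is the peak resident set size of this process;
    # Linux reports it in kilobytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

def exit_if_bloated():
    # A nonzero exit lets a supervisor (Docker restart policy,
    # supervisord, upstart on Ubuntu 12.04) relaunch the process
    # with a fresh address space.
    if peak_rss_mb() > MAX_RSS_MB:
        sys.exit(1)
```

Calling `exit_if_bloated()` between jobs keeps the check away from request-handling hot paths; since the leak only grows, peak RSS is a reasonable proxy for current usage here.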

GoogleCodeExporter commented 9 years ago
Since taguchimail's binary repo comes with a 'stable' branch (PyV8 r429 / V8 r10452), I had to try that as well. It is an improvement in that the empty evals no longer show a leak - the whole application still leaks, however, so I'm isolating new test cases now. In other words, the behavior reported above must have been introduced somewhere between r429 and r557.

Original comment by mmueller...@gmail.com on 2 Apr 2014 at 6:07

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
Update: Wrapping the JSContext inside the wrapper posted below, together with PyV8 r428 / V8 r10452, solves the memory leak issues for me. I have to say, though, that garbage collection is relatively slow this way, so I have to run it in a background process.

Original comment by mmueller...@gmail.com on 10 Jun 2014 at 2:27

Attachments:
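The attached wrapper isn't reproduced in this export. One general way to get the same effect (a hypothetical sketch, not the attached code) is to run each evaluation in a throwaway worker process, so that whatever V8 leaks is reclaimed by the OS when the worker exits; a stub stands in for the PyV8 call so the sketch is self-contained:

```python
import multiprocessing

def _eval_in_child(source, queue):
    # In the real application this would be:
    #     from PyV8 import JSContext
    #     with JSContext() as ctx:
    #         result = ctx.eval(source)
    # A deterministic stub stands in here.
    queue.put("stub result for: " + source)

def eval_isolated(source):
    """Run one evaluation in a short-lived child process; any memory
    leaked during the eval dies with the child."""
    queue = multiprocessing.Queue()
    proc = multiprocessing.Process(target=_eval_in_child, args=(source, queue))
    proc.start()
    result = queue.get()
    proc.join()
    return result
```

Spawning a process per eval is slow, which matches the observation above that collection this way is best pushed into the background; batching many evals per worker would amortize the startup cost.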