Just happened upon this explanation, and something like this just might be the culprit where we use the $compile service, e.g. https://github.com/getlantern/lantern-ui/blob/e8e700d8/app/js/vis.js#L93
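For context, a minimal sketch of the kind of $compile usage that can leak when cleanup is skipped (the function and parameter names here are hypothetical; this is not the actual vis.js code):

```js
// Hypothetical sketch of the leak-prone pattern, not the actual vis.js code:
// compiling tooltip content against a fresh child scope each time it's shown,
// and never destroying that scope, keeps the scope and all its watchers alive.
function showCompiledTooltip($compile, parentScope, container, template) {
  var tipScope = parentScope.$new();          // one child scope per tooltip
  var tipEl = $compile(template)(tipScope);   // linked element with its own watchers
  container.append(tipEl);
  return function hideCompiledTooltip() {
    tipEl.remove();
    tipScope.$destroy();  // without this, the scope and its watchers are retained
  };
}
```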
angular/angular.js@bd524fc4e also looks interesting but probably not relevant to our leak since ours occurs in LanternBrowser (the OS X WebView) and according to @leahxschmidt Safari too
@oxtoacart First hit for https://github.com/angular-ui/bootstrap/search?type=issues&q=leak turned up angular-ui/bootstrap#485, but it looks like we already have a fix for that issue (better/more recent than https://github.com/jgrund/bootstrap/commit/09e464ea#L1R270), based on https://github.com/getlantern/lantern-ui/blob/de9be08f/app/lib/ui-bootstrap-tpls-0.5.0-SNAPSHOT.js#L1936
I've done some basic testing and found the following while running Lantern UI against the mock back end:
(observations based on use of Timeline tool in Chrome Developer Tools)
I also created a very light test page with a basic angular app, a single SVG element and a tooltip on the element. Based on this, I observed:
Bottom line, we think the problem is probably related to tooltips. Given that we're seeing memory growth in a vanilla tooltip example (without use of the fancy $compile function), whatever memory leak we have is likely in the bootstrap tooltip code itself.
This doesn't seem super likely to me, honestly. The memory leak I've seen repeatedly happens just from running overnight with zero interaction from the client. Again, that's just with WebKit, but presumably the same thing happens in Chrome. If you just run overnight you should see the same thing.
You're right, it looks like it's not just tooltips.
I let the UIs run overnight against both a real Lantern back-end and the mock backend. The real back-end run exhibited zero memory growth. The mock backend run grew from 45.6 MB to 56.6 MB.
I have saved the before and after snapshots. A diff of the two shows that the majority of the retained memory is in arrays and closures. Random inspection of the individual items reveals a variety of things being retained, but one category that comes up a lot is __transition__ on SVGPathElement and event in d3_dispatch.
The main functional difference I see between the real and mock tests is that the real test seems to have no arcs indicating peering activity, while the mock example does. So, the next place I'll look is the arcs and any animation related to them.
I was able to speed up reproduction of this with the mock back-end by speeding up the setInterval calls in scenarios.js. I set the peers interval to 100ms and the countries interval to 5000ms.
One interesting thing that I saw is that when I set the intervals too small, the UI locked up and became unresponsive. I guess since we're using server-push, the UI is vulnerable to the server pushing too much too fast. We probably want to build in some protection/throttling on the client side to deal with this.
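A minimal sketch of what that client-side throttling could look like (applyUpdate and the 250 ms figure are placeholders, not anything from the codebase):

```js
// Throttle server-pushed updates so the UI applies at most one per interval.
var MIN_APPLY_INTERVAL = 250; // ms between applying pushed updates (placeholder value)
var pending = null;
var lastApplied = 0;
var timer = null;

function onServerPush(update) {
  pending = update; // later pushes supersede earlier ones
  var wait = MIN_APPLY_INTERVAL - (Date.now() - lastApplied);
  if (wait <= 0) {
    flush();
  } else if (!timer) {
    timer = setTimeout(flush, wait);
  }
}

function flush() {
  timer = null;
  if (pending === null) return;
  lastApplied = Date.now();
  applyUpdate(pending); // hypothetical: merge the update into the Angular model inside $apply
  pending = null;
}
```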
So, I took the animate-connection directive out of the
@skivvies I wonder if we would stand to benefit from bypassing Angular and using d3 more directly for the svg stuff, relying on Angular only to bind tooltip content. I suspect that whatever is going on here is due to some interaction between Angular scopes/watches and d3's own state tracking.
Nice! We actually used to have some throttling from the backend to make sure the backend didn't push updates too quickly, but that was taken out a while back in
Thanks for the great work @oxtoacart. Very clever to reduce the update intervals in the mock backend to reproduce this faster!
I wonder if the problem might be this:
https://github.com/getlantern/lantern-ui/blob/a8a9a24fb0/app/partials/vis.html#L8
The animateConnection directive is applied to each path.connection inside the ng-repeat="peer in model.peers".
And in: https://github.com/getlantern/lantern-ui/blob/a8a9a24fb0/app/js/vis.js#L145
The directive sets up a $watch function on each peer in model.peers, but when a peer gets removed from model.peers, its corresponding unwatch function is never called. So maybe these scopes are never getting $destroyed when the peer is removed from the ng-repeat, and their associated memory never gets garbage collected.
In the watchConnections directive we're already watching model.peers. If we moved the watching from the animateConnection directive into watchConnections, I wonder if it would help.
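For reference, a minimal sketch of the cleanup idiom in question, deregistering the per-peer watch when ng-repeat destroys the peer's scope (illustrative only; the module name and watch expression are placeholders, not the actual animateConnection code):

```js
// Illustrative only -- 'vis' and 'peer.connected' are placeholders.
angular.module('vis').directive('animateConnection', function () {
  return function link(scope, element) {
    var unwatch = scope.$watch('peer.connected', function (connected) {
      // ...start or stop the connection animation on `element`...
    });
    // ng-repeat broadcasts $destroy on a peer's scope when the peer is removed;
    // deregistering the watch here lets its closure (and any DOM it captures)
    // be garbage collected.
    scope.$on('$destroy', unwatch);
  };
});
```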
As for D3 vs. Angular, I've been experimenting with how best to structure things over the last while, and recently rewrote the vis code to be more idiomatic of D3 + Angular apps (e.g. using directives), to use D3 for what it's good at (heavy SVG DOM manipulation, like adding the country paths) and Angular for what it's good at (binding DOM to model change listeners), and to be more maintainable and at the same time more featureful. There are some remaining places where Angular is the one manipulating the DOM, but only where there is no negative impact on responsiveness and where structuring the code otherwise would be more error-prone. I'd love to keep improving this though, so if anything specific jumps out at you, let's jam on a hangout some time.
I like the idea of the frontend doing its own throttling regardless of whether the backend does. But since the frontend can keep up with the pace of updates it's currently getting without issue, I wonder if we should wait on this.
@skivvies - thanks for giving me a great introduction to the UI yesterday, it's a really neat interface!
I agree that we don't need to worry about the throttling right now - I've opened an issue to track that separately (#60).
On the question of how much d3 to use versus how much Angular, let's definitely chat about that. I've found that d3 is actually really good at handling model updates, and that expressing the SVG-related stuff in pure d3 tends to be very readable and possibly less hazard prone than intermingling d3 and Angular in a nested ng-repeat. It's probably largely because of my newness to Angular, but I'm not entirely clear on what it's buying us in this particular case.
One way to approach this would be to implement a render() method that does all of the d3 stuff, expressing insertions and deletions using the idiomatic d3 enter(), update and exit() selections. We can then call this render() method either from a watch on the model or from an interval timer. Using the timer would give us natural throttling of the updates, which would be nice. I think I'll throw something quick together to show how this would work and see what impact it has on performance.
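A rough sketch of that render() idea, assuming d3 v3-era selections (the selectors, key function, arc generator, and 1000 ms interval are placeholders, not the actual lantern-ui code):

```js
// Sketch only: names and values here are placeholders.
var svg = d3.select('#vis svg');

// Stand-in arc generator; the real one would compute the peering arc's path.
function connectionArc(d) {
  return 'M' + d.source.join(',') + ' L' + d.target.join(',');
}

function render(peers) {
  var connections = svg.selectAll('path.connection')
      .data(peers, function (d) { return d.peerid; }); // key by a stable id

  connections.enter().append('path')    // peers that just appeared
      .attr('class', 'connection');

  connections                           // update selection: existing + entered
      .attr('d', connectionArc);        // recompute each arc's path

  connections.exit().remove();          // peers that went away
}

// Driving render() from a timer instead of a $watch gives natural throttling:
// at most one DOM pass per tick, however fast the backend pushes.
// (model.peers stands for the peers array on the Angular scope.)
setInterval(function () { render(model.peers); }, 1000);
```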
My pleasure @oxtoacart, great to work on it together!
Thanks for opening #60, glad to get that in the tracker.
Looking forward to seeing your sketch of moving more of the Angular code into D3. Let me know if any questions come up as you have at it. :)
@skivvies - my sketch of a more d3-heavy version of this is in the "memory-leak" branch. The big changes are:
I'm not actually convinced that this will help the memory leak problem, though CPU usage looks like it might be a hair lower (not sure about that either though).
I'm going to let it run overnight and see what happens.
Incidentally, the accelerated back-end test was a little sketchy - if the back-end pushes updates faster than our animation duration on the front-end, we end up stacking up animations, which would certainly cause memory growth, but is unlikely to be the leak that folks have encountered in the field.
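A hypothetical guard against that kind of stacking, skipping a new animation while one is already in flight for the same peer (illustrative only; not the actual lantern-ui code):

```js
// Placeholders throughout; sketched against d3 v3's transition API.
var ANIMATION_MS = 500;
var animating = {};

function animateConnection(peerId, pathNode) {
  if (animating[peerId]) return;          // at most one in-flight animation per peer
  animating[peerId] = true;
  d3.select(pathNode)
    .transition().duration(ANIMATION_MS)
      .style('stroke-opacity', 1)
      .each('end', function () { delete animating[peerId]; });
}
```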
So, the change seems to have dramatically slowed the memory leak, but not fixed it entirely. Last night, memory grew by 0.4 MB instead of 11 MB. That's a heck of a lot better though, so I'm going to issue a pull request.
Interesting. The one thing that jumps out here is how tiny the leak you're seeing is. The leak I was seeing was about 1 GB every night! I wonder if we're talking about the same leak? I also wonder what's different on my system. More friend connections maybe?
@myleshorton Great question. Would you mind grabbing the memory-leak branch of lantern-ui and seeing if and how much that helps in your environment? Also, if you wouldn't mind taking a before and after heap snapshot and saving those off, that would be very helpful.
https://developers.google.com/chrome-developer-tools/docs/heap-profiling#basics_snapshot
Will definitely do this when I'm back online. I was also testing only through webkit and never tried it in Chrome.
Ahh, I never tested in WebKit. I will try that and see if I can see any differences.
@myleshorton By "webkit" you mean https://github.com/getlantern/lantern/tree/master/LanternBrowser right? IIUC, as little more than a Cocoa WebView, LanternBrowser will use the system WebKit found in /System/Library/Frameworks/WebKit.framework/Versions/Current/ which varies from one version of OS X to another (and which I think is different even from the version of WebKit that a Safari installation on the same system uses, since Safari uses its own WebKit build rather than using the system one). @myleshorton, if that sounds right, can you report the version of WebKit in your /System/Library/Frameworks/WebKit.framework/Versions/Current/? If that's different from @oxtoacart's version, memory usage between the two might differ. And, of course, they also might not. :)
Right -- LanternBrowser is just wrapping WebKit. I'm not at that machine, but it seems unlikely the versions would differ much or that the memory usage bugs would be different.
Word. In the meantime, can you confirm the OS X version you were using, on the off chance we still observe a much smaller leak?
Whatever the latest 10.8.x is.
Sweet, thanks! That's what @oxtoacart and I are on, so one less variable.
Merged #61 to master yesterday and updated lantern to point to it today. So far I'm observing stable memory usage for Lantern.app (Lantern Browser) of around 350 MB real and 600 MB virtual. Will keep observing, and if anyone else can note their observations with the latest too, that'd be great.
Sweet! I'll run overnight.
Just to see what I'd find, I profiled the Lantern Browser binary. (We're using Automatic Reference Counting (ARC) for our code, but that wouldn't prevent leaks from reference cycles (though I don't see any in our code) or in library code we're calling.)
I fired up the mock backend with the update intervals cranked down (59da50c). Then I opened Lantern.xcodeproj, edited the scheme for the "Profile" run configuration to pass in the URL of the running mock backend, hit Profile, and selected the "Leaks" template from the Memory category. Recorded it for a bit and here's what I got:
allocations: https://f.cloud.github.com/assets/475147/1043207/9c16358e-0ff9-11e3-8b88-ad22a905501c.png
leaks: https://f.cloud.github.com/assets/475147/1043208/a5cca374-0ff9-11e3-9384-dc364b7a6161.png
If I'm interpreting that correctly, a minuscule amount of memory (256 bytes) is initially leaked from the WebCore library, but no further memory is leaked.
This is new territory for me though so if anyone has more experience, please chime in.
New territory for me too. When I did this yesterday, I saw the same amount leaked. Interestingly, it was all recorded on the first sample. Subsequent samples never found any leaks.
[image: Inline image 1]
Cool thanks. I unfortunately completely forgot to run this before leaving last night. It shows up fairly quickly though, so we should be able to reproduce it if it's still an issue at all.
quoting @oxtoacart:
[image: Inline image 1]
Was there an image there that didn't make it through the email-to-comment relay?
Memory usage is looking stable for me so I'm going to close this optimistically. Please reopen if you hit any further memory problems!
migrated from getlantern/lantern#850