evadne / LiveFrost

Real time blurring for iOS
http://blog.radi.ws/post/61836396624

LiveFrost continues using CPU when there's no need to update the view #3

Closed ghost closed 10 years ago

ghost commented 11 years ago

The CADisplayLink responsible for updating the view continues refreshing the view on the main thread even when there's no need to update the view. It starts from the moment LFGlassView is moved to a superview, and doesn't stop until the LFGlassView object is deallocated.

That uses significant CPU, drains battery, and dispatches superfluous events on the main thread. This should be fixed.

The obvious way would be to add -startAnimation and -stopAnimation callbacks, much like how apps handled OpenGL ES content back in iPhone OS 3, but that's not an elegant strategy. The same goes for a notification-based solution. I'd rather automate this process as much as possible.

I'd like to see if there is a callback we can use that updates the view when the contents behind this particular view visually change.

If such a thing does not exist, we could add a documented protocol for developers to control animation in a future iteration.

synoptase commented 10 years ago

Any advice on how to handle this?

evadne commented 10 years ago

I do not know if any exists at the moment. @nglevin ?

ghost commented 10 years ago

@synoptase has a good point, I should say something about best practices or possible workarounds.

For 1.0.3, I made sure that the -refresh method doesn't get called when the LFGlassView isn't attached to a superview, or its bounds happen to have a width or a height of 0. That automatically takes care of the worst offenders of redundant CPU use.
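A minimal sketch of that guard, assuming -refresh is the per-frame entry point (the actual 1.0.3 code may differ):

```objectivec
// Hypothetical sketch of the 1.0.3 guard: bail out of -refresh
// when the view is detached or has a degenerate size.
- (void)refresh {
	if (!self.superview || CGRectIsEmpty(self.bounds)) {
		return;
	}
	// ... capture the content behind the view and blur it ...
}
```

CGRectIsEmpty returns YES whenever the width or height is zero, so both failure cases fall out of a single check.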

Now what happens when you know that whatever is behind the LFGlassView isn't going to change, but you still want that blur, and you still need the view to be onscreen? Well, it's still going to call -refresh sixty times a second. That's still a bit redundant.

I've been reluctant to add -stopLiveBlurring and -startLiveBlurring methods, or a simple public boolean ivar, to regulate live blurring. My gut feeling is that the instant those methods show up, most people will assume they're required. I think a smarter API should keep the developer from ever thinking about that.

If somebody else were to add that functionality via a PR at this time, I would probably merge that into the develop branch without much fuss.

Let me describe what that functionality might look like:

For -startLiveBlurring, it would check that the LFGlassView's superview is present and that its -bounds is not an empty CGRect. That's our "is it safe to blur" verification test. If both conditions hold, the method would subscribe the view for -refreshing. Then, regardless of whether it passed or failed that test, -startLiveBlurring would also set a private boolean ivar so that live blurring is attempted again whenever the view is added to or removed from the view hierarchy, keeping its behavior consistent.

For -stopLiveBlurring, there would be no test; it would simply unsubscribe the view from -refreshing, then set a private boolean ivar to avoid blurring when the view is added to or removed from the view hierarchy.

The last thing to do, then, would be to check that private boolean ivar in the methods -didMoveToSuperview, -setFrame: and -setBounds: to avoid accidentally kicking off the animation process again.

It would be even nicer if this were all wrapped in a single property as the interface. Maybe even something as simple as @property (nonatomic, assign) BOOL liveBlurring, whose overridden setter invokes -startLiveBlurring or -stopLiveBlurring as necessary, turning -startLiveBlurring and -stopLiveBlurring into private methods.

The last detail would be to make sure that this public ivar and its private equivalent are, by default, set to YES in every initializer without going through the setter and getter methods. No self.liveBlurring = YES, just _liveBlurring = YES, and the same for its private equivalent, to avoid unwanted side effects.
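Pulling the pieces above together into a sketch — the private flag name (_shouldLiveBlur) and the LFDisplayBridge subscription calls are my assumptions, not the shipped implementation:

```objectivec
// Sketch only: _shouldLiveBlur is an assumed private ivar, and the
// LFDisplayBridge subscription methods are assumptions about its API.

- (id)initWithFrame:(CGRect)frame {
	if ((self = [super initWithFrame:frame])) {
		// Assign the ivars directly so the setter's side effects don't fire.
		_liveBlurring = YES;
		_shouldLiveBlur = YES;
	}
	return self;
}

- (void)setLiveBlurring:(BOOL)liveBlurring {
	if (_liveBlurring == liveBlurring) {
		return;
	}
	_liveBlurring = liveBlurring;
	liveBlurring ? [self startLiveBlurring] : [self stopLiveBlurring];
}

- (void)startLiveBlurring {
	// Remember the intent so we can resubscribe after hierarchy changes.
	_shouldLiveBlur = YES;
	// The "is it safe to blur" test: attached, with non-empty bounds.
	if (self.superview && !CGRectIsEmpty(self.bounds)) {
		[[LFDisplayBridge sharedInstance] addSubscribedViewsObject:self];
	}
}

- (void)stopLiveBlurring {
	_shouldLiveBlur = NO;
	[[LFDisplayBridge sharedInstance] removeSubscribedViewsObject:self];
}

- (void)didMoveToSuperview {
	[super didMoveToSuperview];
	if (_shouldLiveBlur && self.superview) {
		[self startLiveBlurring]; // re-runs the safety test itself
	} else {
		// Detached, or blurring disabled: make sure we're unsubscribed,
		// without clearing _shouldLiveBlur, so intent survives re-attachment.
		[[LFDisplayBridge sharedInstance] removeSubscribedViewsObject:self];
	}
}
```

The same check would go in -setFrame: and -setBounds: so a resize to or from a zero-sized rect subscribes or unsubscribes the view accordingly.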

I think that sounds about ideal. Do you have any other thoughts on this?

Getting back to the "smarter API" part of the problem: unfortunately, neither UIKit nor Core Animation seems to provide a solid, public API to programmatically determine whether a specific object in the view or layer tree is in the process of being changed. The best thing I can think of is -presentationLayer on a given view's -layer, but that certainly doesn't cover every possible case. I think the above solution might be as close to ideal as we can get today.

The other, alternative solution would be to use -renderInContext: on the LFGlassView's layer to get a static bitmap of the blurred view from the glass view itself, then do the Core Graphics/UIKit dance to get that bitmap into a separate, brand new UIView or UIImageView that temporarily replaces the LFGlassView, so you have a static alternative to display in your app.

That workaround could be done in a custom category, but it's still not ideal. Even if we built a new API to do that for you, why generate a new UIView when all the data you need is already in that LFGlassView?
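A rough sketch of that workaround as a category — the category and method names are invented for illustration, and error handling is omitted:

```objectivec
// Hypothetical category: snapshot the blurred view into a UIImageView
// so it can stand in for the LFGlassView while nothing behind it changes.
@implementation LFGlassView (LFSnapshot)

- (UIImageView *)lf_snapshotImageView {
	// Render the layer into a bitmap context at the screen's scale.
	UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
	[self.layer renderInContext:UIGraphicsGetCurrentContext()];
	UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
	UIGraphicsEndImageContext();

	// Position the stand-in exactly where the glass view sits.
	UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
	imageView.frame = self.frame;
	return imageView;
}

@end
```

The caller would swap the image view in, remove (or hide) the LFGlassView, and reverse the swap when live blurring is needed again.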

In the meantime, you'll still find that our view's CPU/battery usage is much less than that of most OpenGL games. I'm just picky about performance and API, and this is one moment where these two ideals clash.

If you need this functionality, just let me know. It could be some time before I add this myself since I'm well in the middle of getting a big project off the ground. (Of course, since I basically described the whole solution in this comment, I don't suppose that would take much time at all... :) )

ArtFeel commented 10 years ago

I implemented similar logic in my application, but it's very kludgy and too ugly to turn into a pull request.

Still, this solution is very useful: in most cases you already know whether your content changes or not, so you can handle it yourself. And in some cases, for optimization purposes, I manually disable liveBlurring after all animations finish, even if the content behind the LFGlassView could still change. It's much easier than creating a snapshot and replacing the LFGlassView with a UIImageView, and vice versa when I need to show the reverse animation.

We could also add a property for tweaking the frameInterval of the CADisplayLink. I played with it and noticed that if you skip every other frame, users won't see the difference, but CPU usage drops by almost half.
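On CADisplayLink itself that tweak is a one-liner; a minimal sketch, assuming the bridge drives -refresh from a display link on the main run loop:

```objectivec
// Drive -refresh from a display link, but only on every other vsync:
// frameInterval = 2 gives ~30 fps on a 60 Hz display, roughly halving
// the per-frame CPU cost of the blur.
CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                         selector:@selector(refresh)];
displayLink.frameInterval = 2;
[displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
```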

ghost commented 10 years ago

@ArtFeel , @synoptase :

If you have a moment, please give the solution on the HEAD of the develop branch (1283e9cecfe3ee7944aeafbbb64576d27b851ff0) a try to see if it works for your needs. It's more or less what I described, with the ivar called liveBlurring.

I gave it a spin with an NSTimer object that was flipping between both states and it seems to work well. Let me know if you run into any issues.

If it looks good for the next week, we can tag and ship a 1.0.4 release.

I haven't added anything to change the frameInterval of the CADisplayLink yet. That will probably be coming next. I'm trying to decide if an "adjust framerate for each subscribed view" solution makes more or less sense than "adjust framerate for all subscribed views," and what that would look like.

synoptase commented 10 years ago

Thanks for this very detailed insight into how to implement this. I'll definitely be looking into it when I get some free time.

Keep up the good work, you guys really nailed it :)

ghost commented 10 years ago

@ArtFeel :

I added a frameInterval ivar to the LFGlassView that allows you to skip frames, just like its CADisplayLink counterpart in e40c5745811445adb5c7a68f68670fb69f263425. It's on the develop branch right now.

I've been successful in getting two small views to render at 30 fps on an iPhone 4 by setting frameInterval to 2. I've also tried odd combinations like 5 and 7 to make sure it works as intended... not that it looks even remotely good with those numbers.

This could also have been exposed on the CADisplayLink itself in LFDisplayBridge, but I expect that somebody out there is going to try to push this object pretty hard, to see what they can do with multiple views in view hierarchies of varying complexity.

If we do expose the frameInterval for the CADisplayLink later on, then the frameInterval for LFGlassView will compound with the frameInterval set on LFDisplayBridge. That makes sense to me, but I could be wrong. If you think there's still room to make this more human-understandable, let me know.
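Under that assumption, the effective refresh rate would multiply out like this (illustrative arithmetic, not shipped behavior):

```objectivec
// Hypothetical compounding: bridge-level interval times view-level interval.
NSInteger bridgeFrameInterval = 2; // LFDisplayBridge ticks on every 2nd vsync
NSInteger viewFrameInterval = 3;   // LFGlassView renders on every 3rd bridge tick
NSInteger effectiveInterval = bridgeFrameInterval * viewFrameInterval; // 6
// On a 60 Hz display, that view would refresh at 60 / 6 = 10 fps.
```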

As it is, I'm considering if I want to make LFDisplayBridge its own project someday, with its own Cocoapod spec to boot. It's a nice object, and I've used a more complex variant of it in the past to handle rendering OpenGL content on a background queue. Not as flashy and marketable as a UIView that blurs stuff, but practical.

ghost commented 10 years ago

As all of the suggestions made have been implemented as features for 1.0.4 and 1.0.5, I'm closing this issue for now.

If there are any further suggestions or complaints regarding performance, feel free to reopen this issue and let me know.