mapbox / mapbox-gl-native

Interactive, thoroughly customizable maps in native Android, iOS, macOS, Node.js, and Qt applications, powered by vector tiles and OpenGL
https://mapbox.com/mobile

improve synced view tracking #1125

Closed incanus closed 8 years ago

incanus commented 9 years ago

General ticket for improvement of "sticking" native, Cocoa-side views to the panning, zooming GL render view.

This relates to the user dot now, later to callouts and annotation views. Let's keep a ticket around until we are 100% satisfied with the behavior here, but we can move it along milestones as we make staged improvements.

incanus commented 9 years ago

I'm feeling pretty good about the current state of tracking, especially after the above commit which avoids invalid CLLocationCoordinate2D-based updates during TransformState "will change" events (and just catches the "did change").

The one issue I'm seeing which I'd like to clean up is the dot disappearing during e.g. a double-tap animated zoom. It doesn't show up again until the next CLLocationManager update. It gets hidden because the CGPoint received out of -[MGLMapView convertCoordinate:toPointToView:] is well out of range of the given screen bounds. I think this is just some sort of math error in TransformState during animations.
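For illustration, the hide/show logic amounts to something like the sketch below. The `userDotView` property and the 150 pt tolerance are hypothetical stand-ins; -convertCoordinate:toPointToView: is the real method in question.

```objc
// Hedged sketch, not the actual implementation: hide the dot when the
// converted point lands implausibly far outside the view's bounds, which
// is what happens mid-animation when the conversion goes wrong.
CGPoint point = [mapView convertCoordinate:mapView.userLocation.coordinate
                             toPointToView:mapView];
BOOL plausible = CGRectContainsPoint(CGRectInset(mapView.bounds, -150, -150), point);
mapView.userDotView.hidden = !plausible; // userDotView is a hypothetical name
```

During an animated zoom the converted point comes back far out of range, the check fails, and the dot stays hidden until the next location update produces a valid conversion.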

incanus commented 9 years ago

Another lead that confirms the above: pan the map until the dot is >150px off screen, then hit the user locate button. The map pans the location back into view (meaning that centering the map on the valid user location is working), but the dot doesn't show up until the next CL update post-animation. During the animation it's still residing at the offscreen value, so it's hidden.

incanus commented 9 years ago

Tracking is pretty good right now. The only outstanding issue I'd like to hit for the beta is the dot disappearing briefly during animated zooms (but at least coming back now). I'm nearing completion on this but don't want to hold up a build tonight, so I'll work on it a bit over the weekend.

Going to squash & merge a few commits now with improvements thus far.

incanus commented 9 years ago

First batch of improvements in https://github.com/mapbox/mapbox-gl-native/pull/1151.

incanus commented 9 years ago

Main blocker here now captured in the more general https://github.com/mapbox/mapbox-gl-native/issues/1160 bug with projection conversion.

kkaefer commented 9 years ago

FWIW, I think we should get rid of overlaying iOS views onto the Map and instead render everything ourselves. Apple recommends against doing this as well: https://developer.apple.com/library/ios/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/Performance/Performance.html#//apple_ref/doc/uid/TP40008793-CH105-SW8

For the absolute best performance, your app should rely solely on OpenGL ES to render your content.

incanus commented 9 years ago

Right, but I direct your attention back to https://github.com/incanus/AniAnnoDemo and what needs to happen to make this work :-) The thing I'm most uncertain about is syncing frames of animation from native views into GL. Maybe it's not as terrible as I'm imagining though.

mb12 commented 9 years ago

@incanus Perhaps we should try posting the synchronization problem to the Apple developer forums and/or use the 3 incidents that Apple includes in developer support. Apple is already doing this for MapKit. It's a problem that they have already solved. It would make writing all kinds of map overlays super easy.

https://developer.apple.com/devforums/

incanus commented 9 years ago

http://stackoverflow.com/a/17096916/977220

In fact, you can't synchronize them using current APIs. MobileMaps.app and Apple Map Kit use private property on CAEAGLLayer asynchronous to workaround this issue.

I'm intending to dig deep when the time comes for this, but right now the user dot is syncing well and we just need the bug fix in https://github.com/mapbox/mapbox-gl-native/issues/1160 for a workable version.

From the POV of annotations, after shapes, this will be my next priority.

incanus commented 9 years ago

On this front, I've been thinking about looking into routes for capturing a view's layer's presentationLayer at a given point in time and rasterizing it to a GL render layer. That is, all tracked views atop the GL view would actually get rasterized into a single "layer" in the GL. I have no idea how feasible this is, but it's the sort of approach I've been mulling over.

incanus commented 9 years ago

We are ok here for b2 and b3 and beyond will feature much more robust point annotations.

friedbunny commented 9 years ago

Per WWDC and chat and Stack Overflow:

In iOS 9 CAEAGLLayer has presentsWithTransaction property that synchronizes the two.

incanus commented 9 years ago

As part of #1784, I'm looking into this again. While the iOS 9 API looks intriguing, we need a solid iOS 7 & 8 approach anyway.

Using what's in view-based-points works fairly well, however some lag is evident. Using the Time Profiler in Release configuration reveals our three main resource hogs:

  1. -[UIView setCenter:] at 7.5%. This is called across each annotation view to update its position every frame sync.

[screenshot: Time Profiler trace, 2015-06-24]

  2. CALayer animation commit internals at 2.1%. This improves the smoothness of syncing at a slight cost.

[screenshot: Time Profiler trace, 2015-06-24]

  3. GLKView internal GL rendering at 1.9%.

[screenshot: Time Profiler trace, 2015-06-24]
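One plausible mitigation for the -[UIView setCenter:] hotspot (a sketch, not necessarily what the codebase does) is to batch every per-frame move into a single transaction with implicit animations disabled, so Core Animation commits one update per frame instead of scheduling an implicit animation for each -setCenter: call. The `visibleAnnotationViews` collection and `-coordinateForView:` helper here are hypothetical.

```objc
// Hedged sketch: commit all annotation moves in one CATransaction and skip
// the implicit animations -setCenter: would otherwise trigger per view.
[CATransaction begin];
[CATransaction setDisableActions:YES];
for (UIView *view in visibleAnnotationViews) { // hypothetical collection
    view.center = [mapView convertCoordinate:[self coordinateForView:view]
                               toPointToView:mapView];
}
[CATransaction commit];
```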

Next steps are to learn from similar approaches for views/layers in our raster SDK and explore:

incanus commented 9 years ago

Tried a number of things in combination with reducing the number of UIView syncing calls (https://github.com/mapbox/mapbox-gl-native/commit/6987536957f5e5a3d8bf75307a6834ab891a6be6) such as:

Because of the reduction in the number of sync calls being made, we now have no theoretical lag between when a GL frame is drawn and when the annotation views are updated, but we still see a ghosting or stuttering effect due to the asynchronous nature of GL frame updates in relation to a normal CATransaction.

Now going to try a perhaps-crazy approach of:

Refs:

Might be prohibitive, but if we could avoid constantly uploading the vertices, re-uploading unchanged textures, and doing anything with non-viewport relevant annotations, it just might work. We could also eventually improve this to handle alpha blended annotation views, intermingled z-ordering of native views and GL symbols, and the like.

Will start with a single annotation first to prove the concept.

mb12 commented 9 years ago

Does the issue get fixed on iOS 9 beta with presentsWithTransaction?

incanus commented 9 years ago

The good news: I got a quad billboard drawing (within the iOS side) that updates its position based on the associated annotation. No texture yet, just a dumb colored quad.

The bad news: It still has jitter, just like a native UIView. So even drawing in GL atop the map view post-renderSync() isn't enough to prevent this. Sort of makes sense; Core Animation is also compositing at this step, so same adverse result.

The next thing I'm going to try is setting up a mechanism for communicating between GL and iOS about which annotations have iOS-side textures that will be provided (i.e. rasterized CALayer objects), changing those from normal Symbol rendering to a special quad, and communicating over the boundary what the actual texture is upon CALayer update iOS-side, uploading the texture to the GPU and applying it to the quad.

incanus commented 9 years ago

I think this meshes well with the MapKit-style annotation view reuse method.

  1. Rasterize a view's layer and upload it as a texture, making a note of the "clean" status.
  2. As long as the view/layer doesn't change appearance, don't re-upload the texture, even when instantiating new, reused annotation views with the same identifier.
  3. Upon view/layer change for each annotation, mark "dirty" status, then re-upload the texture from a new rasterized snapshot as needed. This forks the number of textures across views with identical identifiers, but only when absolutely necessary. As long as the views aren't visibly changing, the same single texture can be reused across multiple annotations.
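The clean/dirty bookkeeping above could be sketched as follows. All of the names here (`MGLTextureCacheEntry`, `-rasterizeLayer:`, `-uploadTexture:`) are hypothetical, not actual API.

```objc
// Hedged sketch of the reuse scheme; every name below is illustrative.
- (GLuint)textureForView:(UIView *)view reuseIdentifier:(NSString *)identifier
{
    MGLTextureCacheEntry *entry = self.textureCache[identifier];
    if (entry && !entry.dirty) {
        // Clean: every view with this identifier shares one texture.
        return entry.textureID;
    }
    // Dirty or missing: re-rasterize the layer and (re)upload, forking a
    // new texture only when a view has actually changed appearance.
    CGImageRef snapshot = [self rasterizeLayer:view.layer];
    GLuint textureID = [self uploadTexture:snapshot];
    CGImageRelease(snapshot);
    self.textureCache[identifier] =
        [MGLTextureCacheEntry entryWithTextureID:textureID dirty:NO];
    return textureID;
}
```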
incanus commented 9 years ago

Encapsulating both the GL render calls and the UIView-based updates in a CATransaction block while using [(CAEAGLLayer *)self.glView.layer setPresentsWithTransaction:YES] seems to have no effect. Trying to figure out if I'm using it right.
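For reference, the attempted usage looks roughly like this. setPresentsWithTransaction: is real CAEAGLLayer API (iOS 9+); the two helper methods are hypothetical stand-ins for the GL render call and the UIView-based updates.

```objc
// Hedged sketch of the attempt: opt the layer into transaction-synchronized
// presentation, then commit the GL frame and the view moves together.
[(CAEAGLLayer *)self.glView.layer setPresentsWithTransaction:YES];

[CATransaction begin];
[self renderGLFrame];         // hypothetical: drives Map::renderSync
[self updateAnnotationViews]; // hypothetical: moves native annotation views
[CATransaction commit];       // both should reach the screen in one commit
```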

incanus commented 9 years ago

I will push some code shortly to demo, but I am seeing some odd behavior.

Since renderSync is synchronous, I would think that in the first case it wouldn't matter that I'm drawing from two different threads, but that approach may be suspect.

Note that in the long run, I wouldn't be doing this sort of work on the main thread anyway, so as not to impact UI responsiveness, so maybe I should just skip to an approach that draws the quad in the latter way. I can next work on getting CALayer snapshots across the wire to use as textures for the quads.

incanus commented 9 years ago

It is also a possibility that GLKView is doing something smart around automatic buffer swapping based on the calling thread.

incanus commented 9 years ago

A method to compare the two approaches:

incanus commented 9 years ago

I've confirmed that -[EAGLContext presentRenderbuffer:] is only ever called by GLKView internals from the main thread, and thus is not called between the call to renderSync and the completion of any external drawing. I did this by subclassing EAGLContext to log the method.
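The subclass in question is minimal; a sketch (the class name is arbitrary):

```objc
// Minimal EAGLContext subclass that logs each present. GLKView calls
// -presentRenderbuffer: internally, so this traces the actual buffer swap.
@interface MGLLoggingContext : EAGLContext
@end

@implementation MGLLoggingContext

- (BOOL)presentRenderbuffer:(NSUInteger)target
{
    NSLog(@"=> presentRenderBuffer");
    return [super presentRenderbuffer:target];
}

@end
```

Each "=> presentRenderBuffer" line in the trace below comes from this override.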

2015-06-29 11:43:27.529 Mapbox GL[378:21959] => presentRenderBuffer
2015-06-29 11:43:27.537 Mapbox GL[378:21959] 1. start renderSync
2015-06-29 11:43:27.557 Mapbox GL[378:21959] 2. end renderSync
2015-06-29 11:43:27.559 Mapbox GL[378:21959] 3. start gl marker
2015-06-29 11:43:27.564 Mapbox GL[378:21959] 4. end gl marker
2015-06-29 11:43:27.566 Mapbox GL[378:21959] => presentRenderBuffer
2015-06-29 11:43:27.572 Mapbox GL[378:21959] 1. start renderSync
2015-06-29 11:43:27.594 Mapbox GL[378:21959] 2. end renderSync
2015-06-29 11:43:27.595 Mapbox GL[378:21959] 3. start gl marker
2015-06-29 11:43:27.599 Mapbox GL[378:21959] 4. end gl marker
2015-06-29 11:43:27.601 Mapbox GL[378:21959] => presentRenderBuffer
2015-06-29 11:43:27.607 Mapbox GL[378:21959] 1. start renderSync
2015-06-29 11:43:27.628 Mapbox GL[378:21959] 2. end renderSync
2015-06-29 11:43:27.630 Mapbox GL[378:21959] 3. start gl marker
2015-06-29 11:43:27.632 Mapbox GL[378:21959] 4. end gl marker
2015-06-29 11:43:27.633 Mapbox GL[378:21959] => presentRenderBuffer
2015-06-29 11:43:27.638 Mapbox GL[378:21959] 1. start renderSync
2015-06-29 11:43:27.658 Mapbox GL[378:21959] 2. end renderSync
2015-06-29 11:43:27.659 Mapbox GL[378:21959] 3. start gl marker
2015-06-29 11:43:27.662 Mapbox GL[378:21959] 4. end gl marker
2015-06-29 11:43:27.664 Mapbox GL[378:21959] => presentRenderBuffer
2015-06-29 11:43:27.674 Mapbox GL[378:21959] 1. start renderSync
2015-06-29 11:43:27.699 Mapbox GL[378:21959] 2. end renderSync
2015-06-29 11:43:27.700 Mapbox GL[378:21959] 3. start gl marker
2015-06-29 11:43:27.703 Mapbox GL[378:21959] 4. end gl marker
2015-06-29 11:43:27.703 Mapbox GL[378:21959] => presentRenderBuffer
incanus commented 9 years ago

Next up is comparing the TransformState between the use inside of renderSync and outside as provided by the getter. Perhaps the geography/pixel conversion routines differ.

incanus commented 9 years ago

Transform is not a factor as the transformed screen point for a given annotation is the same between iOS and GL. Checking next the height, width, and other factors that might vary (for some reason?) between frames.

incanus commented 9 years ago

Uploaded vertex positions are the same between the core-side and iOS-side GL markers, but the lag persists:

[animated GIF]

Black square is drawn inside of renderSync; green square is drawn in OpenGL immediately afterward on the iOS side.

I also checked that the vertex shaders were using the same basic procedure, as well as precision, and they are; it's just a simple:

```glsl
attribute vec2 a_pos;
void main() {
    gl_Position = vec4(a_pos, 0.0, 1.0);
}
```

The core-side shader applies u_matrix, but in the case of these annotations, we pass the painter->identityMatrix since we are doing the screen conversion ourselves.

One thing that does jump out at me in the above GIF is that finger-down drags seem to sync perfectly, but coasting/decelerated pans are where things get out of sync. Looking into that now.

jfirebaugh commented 9 years ago

One thing that does jump out at me in the above GIF is that finger-down drags seem to sync perfectly, but coasting/decelerated pans are where things get out of sync. Looking into that now.

That sounds telling. Please rebase on master and put the main-thread drawing here.

The Transform update for easing happens on the next line. In the code where your branch is currently, it happens on the rendering thread, so your code is racing against it.

incanus commented 9 years ago

:+1: Was closing out some hanging code from before the weekend; rebasing now to get up to speed.

incanus commented 9 years ago

Please rebase on master and put the main-thread drawing here.

FWIW this requires making the MapContext's painter public to Map so that we can use the shaders.

incanus commented 9 years ago

Disregard last comment. Just merged master.

Black square was custom GL drawing in MapContext, as a comparison with...

Green square, which was custom GL drawing in MGLMapView after calling Map::renderSync, which lagged.

After applying master, I just tried only the green square and the lag is gone. This is because we now drive rendering from the main thread, then do this custom GL code also in the main thread.

This should be enough for me to next:

  1. Test straight UIView / CATransaction, but I'm guessing we'll still see the problem that's purportedly solved by the new iOS 9 API.
  2. Regardless, move on to snapshotting CALayer and uploading as a texture in the MGLMapView custom GL code.
incanus commented 9 years ago

UIView tracking directly on the map view now works without jitter! Both the user dot and custom annotation views returned by the delegate track nicely.

Going to double-check that I'm not doing anything in particular in my branch by also testing master with the user dot, but if not, we can close here as solved by https://github.com/mapbox/mapbox-gl-native/pull/1624 and move back to building out the annotation view API in https://github.com/mapbox/mapbox-gl-native/issues/1784.

incanus commented 9 years ago

The user dot in master does behave smoothly and without jitter, but in both branches, I do notice a lag / movement of supposedly static markers during fast panning. I made sure that subtle user location changes weren't at fault by testing on my branch with a custom annotation view with a fixed coordinate.

Here's a device video capture: https://dl.dropboxusercontent.com/u/575564/floater.mov

Note how the fixed marker has varying locations, and you can see the subtle floating during a slo-mo playback of the video.

[screenshot: initial location]

[screenshot: frame capture during fast panning]

[screenshot: another frame capture]

So it's not just perception; it's really moving slightly. This may be what the OpenGL / CATransaction sync-up is about, in which case, we may still want GL-side snapshotting.

More understanding / definitely a better effect here after #1624, but...

incanus commented 9 years ago

The drift happens in main thread GL drawing code as well as just UIView syncing, so it's not the CATransaction thing.

We could live with the drifting if we had to for now, I think, but I'll keep digging on it a bit.

incanus commented 9 years ago

Procedure for replicating / testing the "green square hack" — a GL quad drawn in the main thread immediately after renderSync and inside of MGLMapView:

  1. Grab 1fb66068f35d5084315fdeb493191d5814773076.
  2. Center so that DC is visible.
  3. Add "100" annotations (adds just a single for now).
  4. Observe green quad lag/floating (but no jitter).
incanus commented 9 years ago

I can again pretty much isolate the lag (and what used to be jitter) to the pan deceleration / animation routine at the end of a pan gesture in iOS. Keeping your finger on the screen prevents the lag from appearing.

incanus commented 9 years ago

Currently trying to isolate whether it's specific to Transform or specific to gestures by writing up some animated transition code that tries to replicate the lag without gestures being involved (so, still eventually calling startTransition).

incanus commented 9 years ago

Using a patch like https://gist.github.com/incanus/41d85a875b032558a6ec proves that it's related to animated transitions and is not specific to gestures. Show the green square, long press the map, and watch it drift.

incanus commented 9 years ago

@jfirebaugh nailed this exactly in chat:

renderSync does two things in this order: 1. render, 2. update the transition for easing. If you draw after both of those, it will be out of sync; you need to draw in between those two things.

This is why the black square demo worked fine — that's exactly where I placed the draw code.

Refactoring things into a split between rendering and updating transitions lets this work. The green square, drawn in iOS code, doesn't lag!

Working on cleaning that up.
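Conceptually, the per-frame ordering after the refactor looks like the sketch below (from Objective-C++, as in MGLMapView.mm). The exact core-side method names are approximate, and `-drawGreenSquare` is a hypothetical stand-in for the custom iOS-side GL drawing.

```objc
// Hedged sketch of the split: draw the map, then the overlay quad while the
// transform still matches the frame just rendered, then advance the easing
// transition for the next frame. Method names are approximate.
_mbglMap->render();        // 1. render the map for the current transform state
[self drawGreenSquare];    // 2. custom iOS-side GL drawing, still in sync
_mbglMap->update();        // 3. advance transitions; affects the next frame only
```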

mb12 commented 9 years ago

Does this explain the difference in syncing behavior between downward finger drag and upward finger drag?

incanus commented 9 years ago

I’m not familiar with that @mb12 — can you elaborate?

mb12 commented 9 years ago

I was referring to this comment on the gif animation.

https://github.com/mapbox/mapbox-gl-native/issues/1125#issuecomment-116754125

incanus commented 9 years ago

Breaking things into two parts also fixes syncing issues with straight-up UIView annotations.

@mb12 Yes, there is now no lag.

incanus commented 9 years ago

While this will certainly be one of the weirder animations posted here, bear with me a moment.

I have a prototype of the ability to sample an animating UIView and upload a snapshot of it to a texture to be rendered by OpenGL each frame. This points at the future possibility of truly using reuse identifiers for annotation views and uploading them as reusable textures. It remains to be seen whether this is a performance gain, but I'm guessing it could have benefits over simply tracking UIView objects on top of the GL view directly. Plus, we could reuse annotation views (animating or not), trading duplication in the view hierarchy for a cheaper memory cost in GL textures.

[animated GIF]

(Full-res video capture of this here)

What is happening here is the following:

  1. There is a normal UIView (center right) with an image as its layer.contents which has a repeating rotation animation applied to it. Consider this a potential annotation view created in the expected way.
  2. When frames are being redrawn, i.e. when the map is being moved, every frame we are sampling the view.layer.presentationLayer.contents into a CGImage.
  3. We are also sampling the view.layer.presentationLayer.transform.
  4. We use the image and the transform to draw into a CGContext, capturing the raw pixel array, which can then be uploaded as an OpenGL texture.
  5. That texture is drawn on a simple quad (the weird tracking image).

In practice, the actual view would be hidden so that we only see the OpenGL one, incurring no resource usage to render the original UIView.
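Steps 2–4 above amount to something like the following sketch, assuming renderInContext: captures the presentation layer's current animated state. Error handling and texture setup are omitted.

```objc
// Hedged sketch: sample the animating presentation layer into raw RGBA
// pixels that can be handed to glTexImage2D each frame.
CALayer *presented = view.layer.presentationLayer ?: view.layer;
size_t width  = (size_t)view.bounds.size.width;
size_t height = (size_t)view.bounds.size.height;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                         colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);
[presented renderInContext:ctx];            // includes the animated transform
void *pixels = CGBitmapContextGetData(ctx); // raw bytes for the GL upload
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
             0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
CGContextRelease(ctx);
```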

The weird in-and-out-of-frame bit is because the animation's rotation origin and the context transform's point of reference aren't the same; with a little trial and error they could be aligned so that the texture appears to rotate in place just like the native view, instead of rotating in and out of the quad. Note that the rotation is backwards, too. But the general proof is there that a UIView can be sampled as a texture into GL frames, smoothly animating.

I think we should punt on animated annotation views for now, as well as this texture upload business, but it's promising for future use.

I'm next going to focus on just a bare-bones annotation view API (no actual reuse identifiers), which I should be able to wrap pretty easily now that we've solved the syncing issues.

incanus commented 9 years ago

We could further optimize by passing the view.layer.presentationLayer.transform into the shader as the projection matrix, avoiding the need to re-upload the texture during animations. Then we'd only have to sample the view to a texture after view.layer.contents changes (which should be KVO observable). That would also let us reuse the same texture for all views with the same reuse identifier, since the matrix would vary for each quad being drawn (depending on animation state) but the texture wouldn't.
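Sketched as a variation on the trivial quad shader shown earlier, that optimization might look like this. The `u_view_transform` uniform is hypothetical, fed per draw call from the sampled CATransform3D.

```glsl
attribute vec2 a_pos;
uniform mat4 u_view_transform; // sampled from view.layer.presentationLayer.transform
void main() {
    gl_Position = u_view_transform * vec4(a_pos, 0.0, 1.0);
}
```

The texture stays constant across all views sharing a reuse identifier; only the matrix varies with each view's animation state.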

incanus commented 9 years ago

Going the basic UIView route for now in #1784, but first we need to split the renderSync call per https://github.com/mapbox/mapbox-gl-native/pull/1813.

incanus commented 9 years ago

Per chat, @kkaefer is going to look into this & https://github.com/mapbox/mapbox-gl-native/pull/1813 to improve our current user dot.

incanus commented 9 years ago

I'm grabbing this for the b3 release branch.

incanus commented 9 years ago

We are good for release-0.5.0 now as of https://github.com/mapbox/mapbox-gl-native/commit/b8388168dd130c67c77254565cdb576df7905915 and https://github.com/mapbox/mapbox-gl-native/commit/a67e80219f90c409dfce8bfef91d9131dabd8d4f; moving off of milestone.

ljbade commented 8 years ago

Making note of #2384 as it looks like iOS 9 has some fixes that can be used.

incanus commented 8 years ago

We started looking at that in https://github.com/mapbox/mapbox-gl-native/issues/1125#issuecomment-111636947 @ljbade. No help currently.

1ec5 commented 8 years ago

#3683 finally synchronizes the user dot with the map. Keeping this issue open until we’re able to leave callouts open while the viewport moves.