Open · ckaran opened 1 year ago
Thanks for reporting this. This is a known limitation on GPU: if too many layers overlap in one pixel, it will create artefacts. Right now, we don't have a way to detect this, but we should add a flag and simply error out when it happens.
How many layers can overlap?
The number of layers that can overlap inside of a tile is not limited; only the layers that cross tile borders are. These are limited by the queue that's passed from one tile to the next. The queue currently holds 128 entries, but I think it makes sense to try to spill to global memory as well, raising the limit to e.g. 4,096. Even then, we might still have cases where that limit is reached, so we will still need a way to report back that the render is incorrect.
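The fixed-queue-plus-spill idea above can be modeled on the CPU side roughly like this. This is only an illustrative sketch of the mechanism being discussed, not forma's actual data structures: `LayerQueue`, the capacities, and all names here are hypothetical.

```rust
// Hypothetical model of the per-tile layer queue: a fixed queue of 128
// entries (the fast on-chip analogue), a growable "global memory" spill
// buffer capped at 4,096, and a flag recording when even the spill
// capacity is exhausted, i.e. when the render is known to be incorrect.

const QUEUE_CAPACITY: usize = 128;
const SPILL_CAPACITY: usize = 4096;

struct LayerQueue {
    queue: Vec<u32>,  // fast fixed-size queue
    spill: Vec<u32>,  // global-memory spill buffer
    overflowed: bool, // set once even the spill buffer is full
}

impl LayerQueue {
    fn new() -> Self {
        Self {
            queue: Vec::with_capacity(QUEUE_CAPACITY),
            spill: Vec::new(),
            overflowed: false,
        }
    }

    fn push(&mut self, layer_id: u32) {
        if self.queue.len() < QUEUE_CAPACITY {
            self.queue.push(layer_id);
        } else if self.spill.len() < SPILL_CAPACITY {
            self.spill.push(layer_id);
        } else {
            // Past this point the render is incorrect; record it so it
            // can be reported back to the application.
            self.overflowed = true;
        }
    }

    fn is_render_correct(&self) -> bool {
        !self.overflowed
    }
}

fn main() {
    let mut q = LayerQueue::new();
    for id in 0..200 {
        q.push(id);
    }
    // 200 cross-tile layers: 128 queued, 72 spilled, no overflow yet.
    assert_eq!(q.queue.len(), 128);
    assert_eq!(q.spill.len(), 72);
    assert!(q.is_render_correct());
}
```

The key point of the design is the last branch: once both buffers are full the renderer can no longer produce a correct result, so it must record that fact rather than silently dropping layers.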
Thank you for the explanation. I agree that it would make sense to report back a rendering error as it may not always be obvious when it could occur (e.g., if stuff is moving around the screen randomly). Do you know what the maximum queue length can be? Or if there is a way of dynamically checking and updating it? My thought is that if Forma can probe the GPU to determine what the maximum queue length is at program startup, then we can adjust our code using info that Forma provides (if that makes sense).
For now, the queue will always be 128. Even when not enough memory is available, the shader should be able to simply spill this to global memory. Once the error reporting is put in place, that should report back the maximum number. However, keep in mind that this number is quite hard to reach: even with very transparent objects, not much is comprehensible after so many blends.
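Once that error reporting exists, an application could consume it along these lines. Everything here is a hypothetical sketch: forma has no `RenderStats` or `check_render` today; this only illustrates how a reported maximum overlap might be compared against the limit so the application can react (e.g. fall back to the CPU renderer).

```rust
// Hypothetical per-frame statistics the renderer might report back.
struct RenderStats {
    max_overlap: u32, // highest cross-tile layer count seen this frame
    limit: u32,       // current queue limit (128 today, per the discussion)
}

// If the reported overlap exceeded the limit, the frame may contain
// artefacts and the application should not trust it.
fn check_render(stats: &RenderStats) -> Result<(), String> {
    if stats.max_overlap > stats.limit {
        Err(format!(
            "render may be incorrect: {} overlapping layers exceeded the limit of {}",
            stats.max_overlap, stats.limit
        ))
    } else {
        Ok(())
    }
}

fn main() {
    // Within the limit: the frame is trustworthy.
    assert!(check_render(&RenderStats { max_overlap: 90, limit: 128 }).is_ok());
    // Over the limit: e.g. fall back to the CPU renderer or reduce layers.
    assert!(check_render(&RenderStats { max_overlap: 200, limit: 128 }).is_err());
}
```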
You're right that 128 layers is a bit much. My concern was just about the artifacts that are produced when things are moving around randomly, like if you're doing something like this, or if you're rendering particles for a physics simulation of ideal gas particles in a box (the simulation is 3D; the rendering uses sprites or forma with an orthographic projection as a quick & dirty visualizer). Not exactly a major concern of yours, I know, but that is the kind of stuff I was thinking about.
That makes sense. This is definitely a use case we care about. I think the global memory spill approach would basically solve this issue almost completely and we should focus on it.
Sounds good to me; I don't have a graphics background, so I have to trust your judgement on this.
I just ran across forma and decided to see how well it performs by running the circles demo with larger and larger numbers of circles. At 100,000 circles, things get weird. The CPU renderer worked perfectly, but the GPU renderer (both high and low) had weird tearing artifacts. (If you have someplace I can upload a video to, I can capture one for you so you can see what I'm seeing.)
The commands I used were the following:
I'm not a graphics person, so I'm not sure if I'm doing something wrong, I'm just trying to use forma as an easy-to-use 2D vector drawing library. If I'm supposed to be doing something different, please let me know.