mellinoe / veldrid-samples

Sample projects for Veldrid
https://mellinoe.github.io/veldrid-docs/

OpenGL preferred backend makes sample stop working #19

Open bootzin opened 4 years ago

bootzin commented 4 years ago

Just downloaded the repo, tried the GettingStarted project, and everything worked fine. Then I changed it to simply pass GraphicsBackend.OpenGL as the preferred backend when creating the graphics device. When I tested it again, the rectangle is not drawn anymore and the screen just stays black. Any idea why this happens?
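For reference, the change is just the extra backend argument when creating the device (a sketch, assuming the sample uses Veldrid.StartupUtilities the way the docs show; `window` is whatever Sdl2Window the sample already creates):

```csharp
GraphicsDeviceOptions options = new GraphicsDeviceOptions();

// Forcing the preferred backend here is the only change from the
// working GettingStarted code:
GraphicsDevice graphicsDevice = VeldridStartup.CreateGraphicsDevice(
    window,
    options,
    GraphicsBackend.OpenGL);
```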

omegachysis commented 4 years ago

I'm seeing the same thing. I tested it on two different machines with the same result. Weirdly, RenderDoc shows the correct output (the rainbow quad), but my actual output is just a black screen.

bootzin commented 4 years ago

Setting PreferDepthRangeZeroToOne to true solved it for OpenGL, and setting PreferStandardClipSpaceYDirection to true solved it for Vulkan. I haven't dug into the source cause of this yet, but maybe it's something with how transformations are done in the sample?
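Concretely, the workaround is just two flags on GraphicsDeviceOptions (a sketch of what I changed, not a proper fix for the sample):

```csharp
GraphicsDeviceOptions options = new GraphicsDeviceOptions
{
    // Use a [0, 1] depth range everywhere instead of OpenGL's default [-1, 1]:
    PreferDepthRangeZeroToOne = true,

    // Use the standard clip-space Y direction; this is what fixed Vulkan for me:
    PreferStandardClipSpaceYDirection = true,
};
```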

omegachysis commented 4 years ago

Hmmm, yes, I just reproduced this and I'm seeing the same thing you are. Setting PreferDepthRangeZeroToOne gets OpenGL working. What weirds me out is that I don't remember this happening before. I might be wrong here, but from what I can see, the minimal sample sets the depth coordinate (z) to 0.0 in clip space. It shouldn't make one bit of difference then whether the depth range is 0 to 1 or -1 to 1, since 0.0 fits in both. Theoretically the preferred depth range when that flag is false is -1 to 1, so why is it not rendering?

I stumbled upon this because, as of the latest update, all my stuff is broken on OpenGL. I was hoping a solution to this would solve my issue, but sadly it did not. I'm just going to use Vulkan until I figure out what is going on with my own project in OpenGL. EDIT: actually my issue is with uniform buffers not binding in OpenGL, so it's probably a totally different issue.

bootzin commented 4 years ago

The problem is that we're setting gl_Position to vec4(Position, 0, 1). This means our Z value is 0, but when we render, OpenGL renders at Z value -1, so we can't see anything at Z value 0. That's why setting the preferred depth range to 0-to-1 works. It also means that setting gl_Position to vec4(Position, -1, 1) works for OpenGL (but breaks DirectX).
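For context, the shader in question looks roughly like this (paraphrased from the GettingStarted sample, not copied verbatim):

```glsl
#version 450

layout(location = 0) in vec2 Position;
layout(location = 1) in vec4 Color;
layout(location = 0) out vec4 fsin_Color;

void main()
{
    fsin_Color = Color;
    // Z = 0 here works with a [0, 1] depth range; changing it to -1
    // makes OpenGL's default [-1, 1] range work but breaks Direct3D:
    gl_Position = vec4(Position, 0, 1);
}
```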

omegachysis commented 4 years ago

Okay, I've been experimenting for a while and I think your conclusion is correct but your reasoning is not. In OpenGL, clip space for z ranges from -W to W of each vertex. Because we set the W component of gl_Position to 1, any Z value from -1 to 1 should not be clipped. Yet anything other than -1 seems to get clipped in OpenGL.

This has nothing to do with depth clipping (try turning depth clipping off; it still happens). It is actually the depth test against the depth buffer. Turn that off and the example works as expected. Another way around it is to attach a depth buffer, clear it, and proceed from there. It seems the reason simply changing the preferred backend breaks the example is that we never attach a depth buffer, and the depth test kills the fragments. I still have not found time to dig deeper, but I think this is what is going on.

In the end, I would say that to avoid confusion in the future it may be a good idea to disable the depth test in this sample and call it a day. Honestly, though, this is an expected complication that comes with varying backend behavior, and it is covered pretty well by the documentation; it just still catches me by surprise.
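Disabling the depth test would be a one-line change in the pipeline setup (a sketch, assuming the sample builds a GraphicsPipelineDescription the way the Veldrid docs show):

```csharp
// Skip depth testing entirely, since the sample never attaches a depth buffer:
pipelineDescription.DepthStencilState = DepthStencilStateDescription.Disabled;

// Equivalently, spelled out:
pipelineDescription.DepthStencilState = new DepthStencilStateDescription(
    depthTestEnabled: false,
    depthWriteEnabled: false,
    comparisonKind: ComparisonKind.Always);
```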

BinarySpike commented 4 years ago

I could not get PreferDepthRangeZeroToOne to stick. When I passed it in GraphicsDeviceOptions, it remained false.

Instead, I modified the vertex shader's gl_Position to:

gl_Position = vec4(Position, -1, 1);

And that worked.