I should add that this has been confirmed to work as expected in Firefox, and in
Chrome Beta/Canary with --use-gl=desktop as well. I have tested on both ATI and
nVidia cards.
Original comment by moo...@gmail.com
on 31 Aug 2012 at 12:43
Writing to gl_PointSize isn't mandated, but I wouldn't say that makes it
optional. The reason it's not mandated is because it obviously has no meaning
for non-point rendering. However when you are rendering points you'd better
write to it, because otherwise the value is undefined!
So while strictly speaking you shouldn't be getting a yellow point when not
writing to gl_PointSize, it would be perfectly valid for an implementation to
render nothing at all, huge points, or even randomly sized points. The only way
to avoid those far worse behaviors, and also to prevent ANGLE from rendering
yellow points, is to write to gl_PointSize.
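As a sketch of that workaround (the shader below is illustrative, not taken from the original report; attribute and varying names are made up), always writing gl_PointSize in the vertex shader gives the point a defined size and keeps varyings intact:

```glsl
// Hypothetical minimal vertex shader: when drawing GL_POINTS,
// write gl_PointSize even if the size is a constant.
attribute vec4 a_position;
varying vec4 v_color;

void main() {
    v_color = vec4(1.0, 0.0, 0.0, 1.0); // varying reaches the fragment shader intact
    gl_Position = a_position;
    gl_PointSize = 1.0; // defined size; avoids the undefined-value case entirely
}
```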
Ideally ANGLE shouldn't render yellow points at all, even though the spec permits
behavior worse than that when gl_PointSize isn't written, but fixing this turns
out to be quite complicated. The reason revision 1192 avoids the COLOR semantic
is because Direct3D 9 automatically assumes centroid sampling is desired for
this semantic. And centroid sampling on texture coordinates sometimes leads to
visible seams at polygon edges. Despite the fact that this is strictly speaking
completely within spec, even Google Earth suffered from this. Using the
TEXCOORD semantic fixes it, but it can't be used for point rendering because
that assumes you want automatically generated texture coordinates that vary
from 0 to 1 over the point sprite. That is exactly why you're getting (0.5,
0.5, 0.0) at the center of the point: by not writing gl_PointSize, the shader
never signals that it is rendering points.
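For context, GLSL ES exposes those automatically generated point-sprite coordinates as the built-in gl_PointCoord, which runs from 0.0 to 1.0 across the sprite and is (0.5, 0.5) at its center, matching the value observed leaking into the varying. A hedged illustration (not code from the report):

```glsl
// Hypothetical ES 2.0 fragment shader visualizing the auto-generated
// point-sprite coordinates directly instead of sampling a texture.
precision mediump float;

void main() {
    // At the center of the point sprite, gl_PointCoord is (0.5, 0.5),
    // so this outputs (0.5, 0.5, 0.0, 1.0), the same value the
    // varying was observed to take on in the bug.
    gl_FragColor = vec4(gl_PointCoord, 0.0, 1.0);
}
```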
In theory we could wait until a draw command is issued to determine whether
points are being rendered so the correct semantic can be used. But this means
each GLSL shader could require two HLSL shaders (or two binary shaders). This
adds a fair bit of complexity, which in my humble opinion is not worth it,
especially considering that you really should write gl_PointSize when drawing
points, to ensure you won't get any worse behavior.
So I'm inclined to mark this as WontFix and just instruct everybody to write to
gl_PointSize as a workaround. Unless you strongly disagree with that resolution?
Original comment by nicolas....@gmail.com
on 5 Sep 2012 at 9:42
I don't think rendering nothing, huge points or randomly sized points would be
much worse behaviour. If anything it would be more helpful as it would make the
point size an obvious avenue of investigation, rather than the red herring
symptom of varyings reading as yellow. Potentially it would have saved me some
time tracking down this issue. The spec simply says that the initial value of
gl_PointSize, and therefore the rendered point size, is undefined if it is not
written to; that doesn't make it okay for it to trigger incorrect behaviour in
unrelated areas.
As we've both said, generating two variants of the shader is not a good
solution. As an alternative it's not impractical to detect at draw call time
that a vertex shader is being used which doesn't write to gl_PointSize.
Flagging up a warning or even rendering nothing (zero sized points) might save
some time for the next person who runs into this issue.
Original comment by moo...@gmail.com
on 6 Sep 2012 at 7:44
Skipping the draw call when point primitives are being rendered but the shader
doesn't write gl_PointSize sounds like an excellent suggestion. It's simple and
makes ANGLE spec-compliant again.
Original comment by nicolas....@gmail.com
on 6 Sep 2012 at 1:23
As of r1277 this has been implemented.
Original comment by dan...@transgaming.com
on 17 Sep 2012 at 9:38
Original issue reported on code.google.com by
moo...@gmail.com
on 31 Aug 2012 at 9:22