Clarify behavior of POINTS draws when gl_PointSize is unassigned. #3370
Conversation
These are guaranteed by the WebGL API to produce no output.
Note @jdashg @lexaknyazev @greggman: per today's WebGL conference call. @lexaknyazev, I would appreciate it if you'd pull this spec text (after review and confirmation here) into your own pull request, which revises the associated tests. This PR can then be closed. Thanks for your help.
Happy to take this, I think!
I think my remaining nit here is that "undefined for later stages" means the vertex shader still runs, but the values it outputs are undefined. While this usually isn't observable, if we draw with transform feedback we should see writes into the transform feedback buffers, or at least an increment of the "used TF verts" counter. The draw still happens; it's just valid to treat all verts as if they are at Infinity. We can safely skip rasterization, but transform feedback happens before clipping, I believe. I think there is no way to emulate this GLES-compliant behavior except by literally restarting transform feedback to guarantee that we fake that the verts were drawn and produced transform feedback output. I believe this is possible to do, but I'd rather not add such fragile complexity to our implementations for such a narrow use case. I would rather deviate from the allowed GLES behavior and, as this spec change proposes, actually skip (but still check for errors on) POINTS draw calls without gl_PointSize.
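To make the distinction concrete, here is a minimal sketch in plain JavaScript of the two behaviors being contrasted above. The function name and return shape are hypothetical, purely a model of the semantics under discussion, not real WebGL API code:

```javascript
// Hypothetical model of the two behaviors discussed in this thread.
// "gles-allowed": the vertex shader still runs, so transform feedback
// captures outputs even though rasterization is skipped (verts treated
// as if at Infinity; TF happens before clipping).
// "webgl-proposed": the draw is validated for errors, then skipped entirely.
function pointsDrawOutcome({ writesPointSize, behavior }) {
  if (writesPointSize) {
    // gl_PointSize is assigned: the draw proceeds normally either way.
    return { vertexShaderRuns: true, tfOutput: true, rasterizes: true };
  }
  if (behavior === "gles-allowed") {
    return { vertexShaderRuns: true, tfOutput: true, rasterizes: false };
  }
  // behavior === "webgl-proposed": short-circuit after error checks.
  return { vertexShaderRuns: false, tfOutput: false, rasterizes: false };
}
```

The fragility mentioned above comes from the gap between the two middle rows: emulating "gles-allowed" on a backend that short-circuits would require faking the TF writes and vertex counts.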
@kdashg just to clarify, are you happy with this spec modification as-is, or would you like to see additional changes to clarify the behavior of transform feedback? I agree that it's infeasible to make transform feedback work if implementations short-circuit these POINTS draw calls.
I'm happy with the proposed language in the PR. I believe it matches the deviation I hope for.
FWIW, it looks like a number of our own tests run afoul of this new check when I add it to Firefox (e.g. http://localhost:8000/sdk/tests/conformance2/transform_feedback/unwritten-output-defaults-to-zero.html?webglVersion=2&quiet=0&quick=1). This makes me cautious here, but I think I would like to try to make this change if the rest of the WG is on board, and just keep my finger on the lever to revert it.
Here is our CI test run of the patch, with its failures: https://treeherder.mozilla.org/jobs?repo=try&author=jgilbert%40mozilla.com&selectedTaskRun=D8kKTit9TVmh-ThzYTRJzQ.0
I ran into this issue recently on a practical use case, when using a three.js override material on a scene that included points. On Windows, an undefined point size seemed to default to some small value, but on an Apple M1 Mac the rendering exhibited severe artifacts. I'm a bit concerned that the spec change proposed here would break some real transform feedback use cases, since using points without a defined size could be quite natural when using transform feedback for compute. Is there consensus that we'd want to do this spec change? Is setting a default point size a viable alternative?
I recently confirmed with multiple other tests that point rendering with unspecified point size on Apple silicon behaves as if the point size is uninitialized, i.e., its value is practically random. |
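Given that unassigned gl_PointSize can render with effectively random sizes, one defensive workaround available to applications today (independent of how this spec question resolves) is to always write gl_PointSize in vertex shaders used for POINTS draws. Below is a hypothetical lint-style helper, not part of any real API, that flags vertex shader sources lacking such a write:

```javascript
// Hypothetical helper (illustration only): returns true if the given GLSL
// vertex shader source assigns gl_PointSize somewhere outside of comments.
function writesPointSize(vertexShaderSource) {
  // Strip block and line comments so commented-out assignments don't count.
  const noComments = vertexShaderSource
    .replace(/\/\*[\s\S]*?\*\//g, "")
    .replace(/\/\/[^\n]*/g, "");
  // Look for an assignment to gl_PointSize.
  return /\bgl_PointSize\s*=/.test(noComments);
}

const safeVS = `
  void main() {
    gl_Position = vec4(0.0);
    gl_PointSize = 1.0; // explicit size avoids the undefined behavior
  }`;
const unsafeVS = `
  void main() {
    gl_Position = vec4(0.0);
    // gl_PointSize = 1.0;  (commented out: this shader would be flagged)
  }`;
```

An engine could run such a check on shaders it pairs with POINTS draws and either warn or patch in a default assignment, which is essentially the "setting a default point size" alternative raised above.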