Implement depth support #19
Info about the polygon offset calculation which I've found:
About the integer part of the offset:
About the decimal part of the offset:
If offset < 0, then result = integer part - fractional part.
Test data:
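A minimal C sketch of that negative-offset rule (the surrounding details and the test data from the original comment were lost, so the sign conventions here are an assumption):

```c
#include <math.h>

/* Hedged sketch of the rule "if offset < 0 then result =
 * integer part - fractional part". modff() keeps the input's sign
 * on both parts; how the hardware treats the signs is an
 * assumption, not confirmed by the original comment. */
static float polygon_offset_result(float offset)
{
    float int_part;
    float frac_part = modff(offset, &int_part);

    if (offset < 0.0f)
        return int_part - frac_part; /* e.g. -1.25 -> -1.0 - (-0.25) = -0.75 */

    return offset;
}
```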
TODO: find out where the polygon offset unit is stored.
@PabloPL are you still working on it?
@anarsoul If you want, feel free to continue this.
Also I had to add https://github.com/PabloPL/mesa-lima/commit/20ac8f025525bc3bc4b46cb68dcb0da5a9621d22 (so we have a format which supports depth).
So it looks like offset_scale is calculated like this:
According to limare, polygon_offset_units is used to adjust viewport->translate[2], but it does some weird cast from float to int and then just subtracts polygon_offset_units from viewport->translate[2]. Unfortunately, I don't understand how that's supposed to work.
The viewport transform will be applied to each vertex at the end of the GP shader:
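For context, here is a sketch of the standard viewport transform as gallium's pipe_viewport_state defines it (scale and translate per axis). This is the math being applied, not the actual GP shader code, which was not preserved here:

```c
/* Sketch: apply the viewport transform to a vertex in normalized
 * device coordinates (after the perspective divide). Indices follow
 * gallium's pipe_viewport_state: scale[0..2]/translate[0..2] cover
 * x, y, z respectively. */
struct viewport {
    float scale[3];
    float translate[3];
};

static void viewport_apply(const struct viewport *vp,
                           const float ndc[3], float win[3])
{
    win[0] = vp->translate[0] + vp->scale[0] * ndc[0]; /* window x */
    win[1] = vp->translate[1] + vp->scale[1] * ndc[1]; /* window y */
    win[2] = vp->translate[2] + vp->scale[2] * ndc[2]; /* window z (depth) */
}
```

Since window z is translate[2] + scale[2] * ndc_z, adjusting translate[2] shifts every interpolated depth value by a constant, which is what polygon offset needs to do.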
@yuq you misunderstood me, I don't understand how this is supposed to work: https://github.com/limadriver-ng/lima/blob/master/limare/lib/limare.c#L848 Note that state->viewport_transform[6] is a float. So basically the code casts a float pointer to an int pointer and then does some math on the int. The least significant bits (22 to 0) of a float contain the fraction part, but the code doesn't check the exponent value: https://en.wikipedia.org/wiki/Single-precision_floating-point_format
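For reference, a minimal sketch of the bit-cast pattern being described (hypothetical names, not the actual limare code). One plausible reading is that subtracting an integer N from a positive normal float's bit pattern steps the value down by N ULPs:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical illustration of the float-bits-as-int trick described
 * above; this is not the actual limare code. */
static float translate_z_minus_units(float translate_z, int units)
{
    uint32_t bits;

    memcpy(&bits, &translate_z, sizeof(bits)); /* reinterpret float as raw bits */

    /* Integer subtraction on the bit pattern: for a positive normal
     * float this lowers the value by 'units' ULPs, i.e. by units
     * times the smallest representable step at that magnitude. */
    bits -= (uint32_t)units;

    memcpy(&translate_z, &bits, sizeof(bits));
    return translate_z;
}
```

If that reading is right, the exponent never needs to be checked: the step size automatically scales with the magnitude of translate_z, which matches GL's definition of polygon offset units as multiples of the minimum resolvable depth difference. This is only a guess at the intent, though.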
Does anyone know why eglinfo shows now depth/stencil visuals?
You mean "no" or "now"?
I meant 'no', but I figured that out - I added formats to lima_screen.c, but used stale libraries, so eglinfo showed no depth/stencil visuals. |
@yuq do you know how to attach a depth/stencil buffer? Basically, if I just enable the depth test it works somehow, but I'm not sure where it stores Z values, because ctx->framebuffer.zsbuf isn't used anywhere.
One way I can think of is creating a gbm_bo and attaching it to an FBO's GL_DEPTH_ATTACHMENT.
@yuq I'm talking about lima. See lima_set_framebuffer_state() in lima_state.c; framebuffer->zsbuf is only used in this function and nowhere else.
Sorry, I don't know either. I just wrote the zsbuf there as a reminder and haven't actually used it. But looking at lima-ng, I also can't find a dedicated depth buffer attached for each draw. Maybe a reverse-engineering dump is needed for this case.
I checked the dump and I don't understand where it stores depth values. |
Could this be possible:
So maybe you can try to dump an app that uses an FBO GL_DEPTH_ATTACHMENT.
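A minimal GLES2 sketch of what such a test app's FBO setup could look like (buffer formats and sizes are assumptions, not taken from any existing dump tool):

```c
#include <GLES2/gl2.h>

/* Sketch: render into an FBO with an explicit depth attachment so a
 * command-stream dump shows how the driver handles GL_DEPTH_ATTACHMENT. */
static GLuint setup_fbo_with_depth(GLsizei width, GLsizei height)
{
    GLuint fbo, color_rb, depth_rb;

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    glGenRenderbuffers(1, &color_rb);
    glBindRenderbuffer(GL_RENDERBUFFER, color_rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, color_rb);

    glGenRenderbuffers(1, &depth_rb);
    glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depth_rb);

    return fbo;
}
```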
@yuq you're right - mali400 doesn't require an in-memory depth buffer, see http://www.highperformancegraphics.org/previous/www_2010/media/Hot3D/HPG2010_Hot3D_ARM.pdf page 9: "Z, stencil, MSAA samples never go off-chip". So it uses an on-chip 16x16 tile buffer for Z and stencil.