opengl32: Don't prioritize low bit depth formats with non-matching stencil.
Fixes ClaDun X2 missing rendering parts on AMD. The GL game uses stencil and requests a pixel format with depth 16, stencil 8. On Wine / AMD that ends up with a 16x0 format, since no 16x8 format is advertised (only 16x0) and the depth match takes absolute priority. On the same Windows machine I see both 16x0 and 16x8 formats, but when 16x8 is requested 24x8 is returned (and 16x0 when 16x0 is requested).
This currently happens to work under Wine / Nvidia because no 16-bit depth formats are advertised there at all, so the game ends up with 24x8 even without this patch.
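For readers unfamiliar with the WGL side, here is a minimal sketch of the request pattern described above, assuming a plain Win32 / WGL setup (this is not the game's actual code; the flags, color bits and the check_depth_stencil_choice() name are illustrative only):

#include <stdio.h>
#include <windows.h>

/* Minimal sketch, not the game's code: request depth 16 / stencil 8 and see
 * which format the driver actually picks. */
static void check_depth_stencil_choice( HDC hdc )
{
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1 }, got;
    int fmt;

    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 16;     /* what the game asks for */
    pfd.cStencilBits = 8;

    fmt = ChoosePixelFormat( hdc, &pfd );
    if (!fmt || !DescribePixelFormat( hdc, fmt, sizeof(got), &got )) return;
    /* Before this patch, Wine / AMD picks 16x0 here (stencil silently lost);
     * Windows / AMD picks 24x8. */
    printf( "got depth %u, stencil %u\n", got.cDepthBits, got.cStencilBits );
}

Nothing in such a request says the application can live without stencil, yet the pre-patch format choice drops it.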
Activity
assigned to @Mystral
350
351     pfd.cDepthBits = 16;
352     pfd.cStencilBits = 8;
353     ok( test_pfd(&pfd, &ret_fmt), "depth 16, stencil 8 failed.\n" );
354     ok( ret_fmt.cDepthBits >= 16, "Got unexpected cDepthBits %u.\n", ret_fmt.cDepthBits );
355     ok( ret_fmt.cStencilBits == 8, "Got unexpected cStencilBits %u.\n", ret_fmt.cStencilBits );
356     pfd.cDepthBits = 0;
357     pfd.cStencilBits = 0;
358
359     pfd.cDepthBits = 8;
360     pfd.cStencilBits = 8;
361     ok( test_pfd(&pfd, &ret_fmt), "depth 8, stencil 8 failed.\n" );
362     ok( ret_fmt.cDepthBits >= 16, "Got unexpected cDepthBits %u.\n", ret_fmt.cDepthBits );
363     ok( ret_fmt.cStencilBits == 8, "Got unexpected cStencilBits %u.\n", ret_fmt.cStencilBits );
364     pfd.cDepthBits = 0;
365     pfd.cStencilBits = 0;

Comment on lines +345 to +365
I have tweaked / extended the tests a little further, see gl-depth-stencil.txt. I only tested that on Nvidia for now; curious whether they also pass on AMD with those changes.
I ran the test on my AMD / Windows machine and added some traces / additional tests. I am attaching the diff to the test (which includes your patch as well) and the output from AMD / Windows.
Unfortunately the 32/8 test is a bit inconclusive here, as somehow the output pixel format is actually 32x8 (honestly not sure what that means, but that's what I see here; note there is no test failure on line 382, and see the trace output from line 381). A sketch of that case follows this comment.
The rest of the tests suggest that it prefers 24-bit depth whenever in doubt; see, e.g., the 8x8 test, trace at line 364: it could have chosen 16x8 but preferred 24x8. From what I see, the pattern is that whenever stencil is requested it returns a depth >= 24, which is what my current patch does. If we instead prioritize stencil match over depth match when stencil is requested, the logic will probably look more straightforward, but it would break all those tests if we tightened them to what is actually returned on AMD. Also, as far as games depending on the stencil choice are concerned, plainly prioritizing stencil presence would give them 16-bit depth on Wine while they get 24 on Windows, and that difference may matter (even if it doesn't break things as completely as returning a format without stencil does).
What do you think?
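To make the 32/8 observation above concrete, here is a hypothetical test-style sketch of that case in the style of the existing tests (this is not the attached diff; test_pfd() and ret_fmt are the helpers from the quoted test above, and the >= 24 expectation is a guess rather than taken from the diff):

/* Hypothetical sketch of the 32/8 case discussed above, not the attached diff;
 * test_pfd() fills ret_fmt via ChoosePixelFormat() + DescribePixelFormat(). */
pfd.cDepthBits = 32;
pfd.cStencilBits = 8;
ok( test_pfd(&pfd, &ret_fmt), "depth 32, stencil 8 failed.\n" );
trace( "depth 32, stencil 8 -> got %ux%u.\n", ret_fmt.cDepthBits, ret_fmt.cStencilBits );
/* Inconclusive for the priority question: Windows / AMD reports an actual
 * 32x8 format here, so the stencil vs. depth trade-off never comes into play. */
ok( ret_fmt.cDepthBits >= 24, "Got unexpected cDepthBits %u.\n", ret_fmt.cDepthBits );
ok( ret_fmt.cStencilBits == 8, "Got unexpected cStencilBits %u.\n", ret_fmt.cStencilBits );
pfd.cDepthBits = 0;
pfd.cStencilBits = 0;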
It seems pretty clear and reasonable to me, actually. It looks like ChoosePixelFormat() returns a pixel format with stencil bits whenever stencil is requested; matching the depth bits comes at a lower priority.
As it turns out, the only actual pixel format with stencil bits that's supported everywhere is D24S8. D32S8 is apparently supported on Windows AMD, but not on Windows Nvidia. FWIW Linux AMD doesn't return any D32 visual / fbconfig at all.
I'm attaching some more changes on top of yours: choosepixelformat.txt. Basically I swapped the two blocks for stencil and depth in wglChoosePixelFormat(), making sure we take care of stencil before checking depth (a simplified sketch of the idea is below).
Also adding the output of the tests on Windows Nvidia after my patch.
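For readers without the attachment, a simplified illustration of the reordering (this is not the actual opengl32 code, which compares more attributes; candidate_beats_best() and its comparison details are purely illustrative):

#include <stdlib.h>
#include <windows.h>

/* Illustrative only: when stencil bits were requested, a candidate format
 * that provides them wins over the current best before depth closeness is
 * even considered; previously the depth comparison came first. */
static int candidate_beats_best( const PIXELFORMATDESCRIPTOR *req,
                                 const PIXELFORMATDESCRIPTOR *cand,
                                 const PIXELFORMATDESCRIPTOR *best )
{
    if (req->cStencilBits)  /* stencil check now runs first */
    {
        int cand_has = cand->cStencilBits >= req->cStencilBits;
        int best_has = best->cStencilBits >= req->cStencilBits;
        if (cand_has != best_has) return cand_has;
    }
    if (req->cDepthBits && !(req->dwFlags & PFD_DEPTH_DONTCARE))  /* depth second */
    {
        int cand_diff = abs( cand->cDepthBits - req->cDepthBits );
        int best_diff = abs( best->cDepthBits - req->cDepthBits );
        if (cand_diff != best_diff) return cand_diff < best_diff;
    }
    return 0;
}

With that ordering, a depth 16 / stencil 8 request on a driver that only exposes 16x0 and 24x8 resolves to 24x8 rather than 16x0, matching what Windows / AMD returns.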
464 465     count = wglDescribePixelFormat( hdc, 0, 0, NULL );
465 466     if (!count) return 0;
466 467
    468     cDepthBits = ppfd->cDepthBits;
    469     if (ppfd->dwFlags & PFD_DEPTH_DONTCARE) cDepthBits = 0;
    470     else if (ppfd->cStencilBits && cDepthBits <= 16)
    471     {
    472         /* Even if, e. g., depth 16, stencil 8 is available Window / AMD may return 24x8 (and not 16x0).
    473          * Adjust to 24 as 24x8 is universally available and we won't end up without stencil. */
    474         cDepthBits = 24;
    475     }
    476

Comment on lines +468 to +476
So, as I mentioned to you privately before, this seems very ad hoc, i.e. I'd expect ChoosePixelFormat() generally, or at least usually, to return a pixel format with stencil bits when the PFD explicitly asks for it. I'd like to see more evidence that that's not the case, and an improved test (or a comment, if it can't be properly shown in a Wine test) to back it up.
Also, nitpick: typo "Window".
changed this line in version 2 of the diff
unassigned @Mystral
assigned to @gofman
requested review from @Mystral
added 559 commits
- bb784677...f6f66661 - 557 commits from branch wine:master
- de8318e7 - opengl32: Prioritize stencil check over depth check in wglChoosePixelFormat().
- 86b4fa38 - opengl32/tests: Add more tests for ChoosePixelFormat().
unassigned @Mystral
added 38 commits
- 86b4fa38...28ba7e41 - 36 commits from branch wine:master
- cc8d8b5e - opengl32: Prioritize stencil check over depth check in wglChoosePixelFormat().
- 83cbcdef - opengl32/tests: Add more tests for ChoosePixelFormat().