icculus / Serious-Engine

An open source version of a game engine developed by Croteam for the classic Serious Sam games.
GNU General Public License v2.0

(OSX) Rendering problems #37

Closed DanielGibson closed 8 years ago

DanielGibson commented 8 years ago

There are rendering problems on OSX, in both the 32-bit and 64-bit builds, and in both The First Encounter and The Second Encounter. This is on OSX 10.11.4 on an i7-4570R with Iris Pro graphics.

I also get a failed assertion (on startup in TFE, in the first level in TSE) from Gfx_wrapper.cpp:154: pglGetError()==0. Unfortunately I suck at using OSX and OpenGL, so it'd be cool if someone else could reproduce and debug the problem ;-)

Labels: tfe-bug, tse-bug

icculus commented 8 years ago

I think this is a missing depth buffer; I've seen this on Linux when changing from fullscreen to windowed.

rohit-n commented 8 years ago

I can reproduce the second screenshot on Linux when switching resolution before starting a new game.

DanielGibson commented 8 years ago

The problem is that, for some stupid reason, in CGfxLibrary::SetupPixelFormat_OGL() gap_iDepthBits is <12, so SDL_GL_DEPTH_SIZE gets set to 0. If I set gap_iDepthBits = 16; in the <12 case, it works.

Haven't really figured out why it is <12 in the first place, though.
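
In code, the path is roughly this (a sketch of what SetupPixelFormat_OGL() seems to be doing, not the literal engine source; names and control flow simplified):

    // sketch, inside CGfxLibrary::SetupPixelFormat_OGL():
    // gap_iDepthBits defaults to 0 ("as color depth"), which falls into the
    // <12 branch, so a context *without* a depth buffer gets requested
    extern INDEX gap_iDepthBits;  // persistent shell symbol: 0, 16, 24 or 32

    if (gap_iDepthBits < 12) {
      SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 0);  // no depth buffer at all
    } else {
      SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, gap_iDepthBits);
    }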

rohit-n commented 8 years ago

gap_iDepthBits is first set to zero in GfxLibrary.cpp:197. However, I've noticed that the game reads from Scripts/PersistentSymbols.ini later on (I can't find exactly where in the code this happens). If it doesn't already exist, one can add

persistent extern user INDEX gap_iDepthBits=(INDEX)16;

to the file and never see this issue again.
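
For context: this works because gap_iDepthBits is presumably registered as a persistent shell symbol, so the shell writes its value back to Scripts/PersistentSymbols.ini and restores it on the next startup. The registration probably looks roughly like this (a sketch following the _pShell->DeclareSymbol() pattern used elsewhere in the engine, not the exact declaration in GfxLibrary.cpp):

    // sketch: a persistent shell symbol is declared against a C++ variable;
    // its value is then saved to / restored from Scripts/PersistentSymbols.ini
    INDEX gap_iDepthBits = 0;  // 0 (as color depth), 16, 24 or 32

    _pShell->DeclareSymbol("persistent user INDEX gap_iDepthBits;", &gap_iDepthBits);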

DanielGibson commented 8 years ago

ok, /maybe/ gap_iDepthBits == 0 is supposed to mean "automatically choose suitable depth buffer bits" or "use the color depth" (the declaration INDEX gap_iDepthBits = 0; // 0 (as color depth), 16, 24 or 32 suggests the latter).

If I understand the documentation of Windows' ChoosePixelFormat() function correctly, it always creates a depth buffer unless explicitly told otherwise with a flag, see https://msdn.microsoft.com/en-us/library/windows/desktop/dd368826%28v=vs.85%29.aspx :

PFD_DEPTH_DONTCARE - The requested pixel format can either have or not have a depth buffer. To select a pixel format without a depth buffer, you must specify this flag. The requested pixel format can be with or without a depth buffer. Otherwise, only pixel formats with a depth buffer are considered.

So, what should we do when it's 0? Not call SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, gap_iDepthBits); at all and assume that the driver (or SDL or whoever) chooses a sane default? Just set 16? Set it depending on gl_dmCurrentDisplayMode.dm_ddDepth? If so, should we use 24 bit or 32 bit in the case of DD_24BIT?
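
For comparison, here's a minimal Win32 sketch (not engine code) of the behavior that MSDN text describes: with cDepthBits left at 0 but without PFD_DEPTH_DONTCARE, only formats that do have a depth buffer are considered, which would explain why the Windows path gets away with not caring about the value:

    #include <windows.h>

    // sketch: ask the driver to pick a pixel format
    int PickFormat(HDC hdc, bool bAllowNoDepth)
    {
      PIXELFORMATDESCRIPTOR pfd = {0};
      pfd.nSize      = sizeof(pfd);
      pfd.nVersion   = 1;
      pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
      pfd.iPixelType = PFD_TYPE_RGBA;
      pfd.cColorBits = 32;
      pfd.cDepthBits = 0;  // depth size is "don't care"...
      if (bAllowNoDepth) {
        pfd.dwFlags |= PFD_DEPTH_DONTCARE;  // ...but only this flag allows depth-less formats
      }
      // without PFD_DEPTH_DONTCARE, ChoosePixelFormat() still only considers
      // formats that have a depth buffer
      return ChoosePixelFormat(hdc, &pfd);
    }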

icculus commented 8 years ago

Just set it to 16 if it's zero. There's no way this game needs more than 16.

(dm_ddDepth is the color depth of the display, not the depth buffer, and is probably always 32 in modern times.)
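
i.e. something like this (a sketch of the suggested change, assuming the attribute is set where SetupPixelFormat_OGL() currently passes gap_iDepthBits through):

    // sketch: treat the default 0 ("as color depth") as a 16-bit depth buffer
    INDEX iDepthBits = gap_iDepthBits;
    if (iDepthBits == 0) {
      iDepthBits = 16;  // 16 bits is plenty for this game
    }
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, iDepthBits);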

icculus commented 8 years ago

Figuring out why this is zero in PersistentSymbols.ini is a worthwhile endeavor, too...has this always just been falling back to 0 on resolution change and no one ever noticed, or did we break it?

yamgent commented 8 years ago

I don't think we broke anything; people have spotted this bug before: https://bugs.winehq.org/show_bug.cgi?id=11970#c43

It could be a deliberate (albeit bad) decision to set it to 0. Since ChoosePixelFormat() is not obligated to return the exact pixel format requested, but just a format that is as close as possible to the request, that could be the intention behind setting it to 0 [see also this, although I cannot find the NVIDIA source that it is citing; in that article, Serious Sam "2" = TSE].

DanielGibson commented 8 years ago

Added a fix for this to PR #44.