libretro / beetle-psx-libretro

Standalone port/fork of Mednafen PSX to the Libretro API.

HW: Dithering is missing? (OpenGL+pure native) #441

Closed: m-7761 closed this issue 5 years ago

m-7761 commented 5 years ago

Just FWIW, I'm not seeing dithering, except that, oddly, the "software frame buffer" is dithered (when it works at all: #427).

I realize shaders can implement dithering, and it would be bad to dither on top of dither; still, I worry there is no way to see a PS game as its creators intended it to be seen.

hizzlekizzle commented 5 years ago

Is that the case even at 1x? I think the problem is that the dithering scales with the internal resolution.

rz5 commented 5 years ago

The Vulkan hardware renderer does not draw the PSX dither pattern like the software and OpenGL renderers do. If you insist on hardware-accelerated upscaling, you're stuck with the OpenGL renderer.

m-7761 commented 5 years ago

FWIW I am using OpenGL/GLSL as far as I know. The internal resolution is "native," the dithering is "native." Video Driver is "gl."

EDITED: Just out of curiosity, why shouldn't it just work, unless it's been disabled?

rz5 commented 5 years ago

@mick-p1982 - A quick side note: new issues on this repo now start with a template that should be filled in; it has fields that make it easier to diagnose what's up. Did you explicitly delete the template, or did this GitHub feature fail somehow?

If you're sure the OpenGL renderer is being used, then you will not see the PSX dithering pattern if the 'Internal color depth' core option is set to '32bpp', for example. Excluding that case, it should be present.

To illustrate what I'm seeing on my side, here's beetle-psx running Ridge Racer Type 4 with 'Renderer (restart)' set to 'hardware' (RA's video driver is set to 'gl'), 'Dithering Pattern' set to '1x (native)', and everything else at default settings:

[Screenshot: Ridge Racer Type 4 (USA), Dithering Pattern '1x (native)']

Now I set 'Dithering Pattern' to 'off' and here's what I see:

[Screenshot: Ridge Racer Type 4 (USA), Dithering Pattern 'off']

m-7761 commented 5 years ago

Ah yes, that must be it! I think this is very confusing, and a note should be provided, options disabled, etc. That said, 32bpp and dithering are not mutually exclusive: in 32bpp mode you are only picking up extra color information from blending between the original colors (mipmaps, etc.), not generating new colors, which dithering can still help with. Dithering maintains the original style of the game, and it can also help visual fidelity (albeit only if the dithering is independent of the internal resolution).

If I were responsible for this, I would implement dithering here just to avoid bug reports like this one. It's much easier to do that than to deal with confused users, and besides, disabling it changes how the game appears, which is a no-no by the prime directive.

(Yeah, I deleted the form. To save myself trouble.)

m-7761 commented 5 years ago

EDITED: You still have a bug, in that software frame buffer effects (when working, via #427) are dithered in 32bpp mode!

No offense, but this core is pretty sloppy. I know it's just a hobby project. But think of how many lives are enhanced/affected by it :)

m-7761 commented 5 years ago

FWIW, changing to native dithering (32bpp off) fixes #427, in which the Frame Buffer setting renders black when games are started. (It turns on only if you go into the options and toggle a setting on/off: any setting.)

m-7761 commented 5 years ago

_Off-topic: I've searched around for info on how the PS accomplished dithering, to no avail. Can anyone shed light on this for me? It's for a good cause: I'm preserving/porting a game to Windows (http://www.swordofmoonlight.net/bbs2/index.php?action=dlattach;topic=286.0;attach=948;image) with the company's own RPG Maker (Sword of Moonlight.)

Is the 32bpp output mode just the result of lighting/color values assigned to the primitives? Because typically textures are 5 bits per channel. Is the dither just added noise to reduce banding? Or is it approximating the 8:8:8 output after primitive coloring? Or both?_

rz5 commented 5 years ago

AFAIK, the dither pattern was there precisely to hide the obvious color banding caused by the use of 16-bit color. The reasoning behind the 32bpp setting disabling the dither mask is that, well, color banding doesn't happen anymore, so the dither pattern would just be noise at that point. To illustrate, compare this 16bpp un-dithered image:

[Screenshot: Ridge Racer Type 4 (USA), 16bpp, dithering off]

With the 32bpp equivalent:

[Screenshot: Ridge Racer Type 4 (USA), 32bpp, dithering off]

If you're fluent in OpenGL, search for the keyword "dither" in this file: https://github.com/libretro/beetle-psx-libretro/blob/master/rsx/shaders_gl/command_fragment.glsl.h

If you want more info on the nitty-gritty of the PSX dither pattern (what it's applied to, and whether different revisions of PlayStation hardware implement different patterns at different intensities), I am sure you will eventually stumble upon forum posts made by nocash, AmiDog, Shadow, among others.
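
For reference, here's a minimal sketch of what that dither code amounts to, per public documentation such as nocash's PSX spec (the 4x4 offset matrix is from that spec; the rest is illustrative, not the core's actual code):

```c
/* PSX-style ordered dithering, as described in nocash's PSX spec:
 * the GPU adds a per-pixel offset from a 4x4 matrix to each 8-bit
 * channel, clamps to 0..255, then keeps the top 5 bits when writing
 * to the 15-bit framebuffer. */
#include <stdint.h>

static const int dither_table[4][4] = {
    { -4,  0, -3,  1 },
    {  2, -2,  3, -1 },
    { -3,  1, -4,  0 },
    {  3, -1,  2, -2 },
};

/* value8: one 8-bit color channel; x, y: framebuffer coordinates. */
static uint8_t dither_to_5bit(int value8, int x, int y)
{
    int v = value8 + dither_table[y & 3][x & 3];
    if (v < 0)   v = 0;
    if (v > 255) v = 255;
    return (uint8_t)(v >> 3); /* keep the top 5 bits */
}
```

This also speaks to the earlier question: the pattern is applied at the 8-bit stage precisely so that the truncation to 5 bits per channel averages out to the intended color across neighboring pixels.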

And also, if you're fluent in C/C++, please consider sending PRs to this repo.

m-7761 commented 5 years ago

Your 32bpp screenshot doesn't look like mine; my 32bpp screenshot is banded exactly like the dithered one. I work with dithering a lot, and it sounds like maybe the PlayStation isn't really doing dithering, but is just adding noise to the picture using a dither pattern. Dithering works by taking, say, a 16-bit input signal and converting it into an 8-bit output signal. (What confuses me about the PlayStation is that I don't know what the input/output context is, or whether it's doing true dithering.)

I wonder if your 32bpp screenshot is simply doing linear interpolation across the bands, and so if the bands were larger than a 1px span, there would still be banding present.

I hate to beat a dead horse, but I am using the nearest-neighbor shader to do color matching, so perhaps I need to investigate other shaders. I think it looked the same with the NTSC shader, though. Maybe your blue-sky example is not the best example. Thanks for contributing.

> And also, if you're fluent in C/C++, please consider sending PRs to this repo.

I'm very fluent. When it comes to improving code, what really matters is familiarity with the code base in question. That is why it makes much more sense to explain problems: programmers who know the code base (mainly the maintainers) can implement fixes and changes 100 times faster than users who don't know where anything is. Outsiders contributing code not only waste a lot of their own personal time; their lack of familiarity with the code tends to breed problems that ultimately harm the project by a thousand cuts. Coders should either be dependent on the code, or it should be part of their lifestyle; otherwise they are better off alerting the devs to problems. If code lacks an invested maintainer, it's as good as dead.

m-7761 commented 5 years ago

EDITED: For what it's worth, I don't stand behind closing this matter, except that #427 can carry its water, as long as that issue is not closed. It would perhaps be better to resubmit the problem as the Options screen needing work, or as 32bpp mode needing a dither pattern.

rz5 commented 5 years ago

@mick-p1982 If you are using shaders in addition to the 32bpp option AND you're still seeing ugly color banding, and you're using RA as your frontend, do check the state of Settings > Video > Force-disable sRGB FBO.

Some time ago I was using this core, running crt-royale, and had the above option on. I was playing a game where a dark pre-rendered background looked very, very off from what I remembered (severe color banding, shadows looked burnt out). The culprit was the video option I'm talking about. So, heads up about that issue.

As for what the 32bpp option does in the OpenGL renderer, it changes the texture format of the output framebuffer from GL_RGB5_A1 to GL_RGBA8 (https://github.com/libretro/beetle-psx-libretro/blob/master/rsx/rsx_lib_gl.cpp#L1325-L1336)
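
In other words, a simplified sketch of what that linked code does (`depth_32bpp` here is a hypothetical stand-in for the core option, and a GL function loader is assumed):

```c
#include <GL/gl.h>

/* Allocate the output framebuffer texture. GL_RGB5_A1 matches the
 * console's native 15-bit framebuffer; GL_RGBA8 gives each channel
 * 8 bits, which is why banding (and the need for dithering) largely
 * disappears in 32bpp mode. */
void alloc_output_texture(GLuint tex, GLsizei width, GLsizei height,
                          int depth_32bpp)
{
    GLint internal_format = depth_32bpp ? GL_RGBA8 : GL_RGB5_A1;

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, width, height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
}
```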

m-7761 commented 5 years ago

@rz5 I will... actually, I was messing with Force-disable sRGB FBO at the same time. I was wondering if sRGB would change the color, or if people really know what it means. It seems to me that sRGB should not be on by default, so I turned it off. I still intend to investigate. And I am pretty interested in PlayStation operation, so it's possible at some point I will inadvertently get involved on the programming side, if I ever have an excuse to learn my way around the code bases.

FWIW: Code should not be using sRGB mode unless its shaders require it to implement very physical stuff (simulating photons, maybe inside a virtual display or something). I expect whoever added sRGB to the system doesn't understand what it is. I certainly would not want PS games to use sRGB for shading, because it requires very dense tessellation to look decent, and we all know PlayStation games are low-poly.

GL_RGBA8 should be used for everything, unless 10-bit color or more is offered for some reason. There's no reason to use GL_RGB5_A1, and it's very possible it won't always work; I'm surprised it's supported for more than backward compatibility's sake. I doubt any mobile devices use 5-bit color today. It is, however, a good pixel model for artists to work in: artists can see the difference between shades in 5-bit color, whereas LED displays usually can only show about 75% of 8-bit color. Old CRT displays actually had more color depth than LED displays, and even then the eye cannot see the difference between some shades.

hizzlekizzle commented 5 years ago

sRGB framebuffers are used by multipass shaders quite often to reduce re-linearizing/de-linearizing on each pass.
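
A minimal sketch of that mechanism, assuming a GL 3.x context (the function name here is illustrative, not from any particular shader preset):

```c
#include <GL/gl.h>

/* Render target for an intermediate shader pass. With an sRGB-format
 * attachment and GL_FRAMEBUFFER_SRGB enabled, the GPU gamma-encodes
 * values on write and linearizes them again when the next pass samples
 * the texture, so each pass can work in linear light without explicit
 * pow() conversions in the shader. */
void make_srgb_pass_target(GLuint tex, GLsizei width, GLsizei height)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Without this, writes to the sRGB attachment are stored raw. */
    glEnable(GL_FRAMEBUFFER_SRGB);
}
```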

m-7761 commented 5 years ago

Do you mean a floating-point frame buffer format? FWIW, an sRGB frame buffer is a regular frame buffer; it just automatically converts the result of the pixel shader from linear sRGB space to classical 0~255 pixel values. So you can't store linear values in it or anything.
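
Concretely, the conversion the hardware applies on write (and inverts when sampling) is the standard sRGB transfer function; as reference math:

```c
#include <math.h>

/* Standard sRGB transfer function (IEC 61966-2-1); c and s in [0,1]. */
float linear_to_srgb(float c)
{
    return (c <= 0.0031308f) ? 12.92f * c
                             : 1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
}

float srgb_to_linear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : powf((s + 0.055f) / 1.055f, 2.4f);
}
```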

The problem with linear sRGB is that real-life lighting is very harsh, and unless your mesh is very dense (which poses a lot of problems for level-of-detail), there aren't enough vertices to grab lighting information from, so the values interpolated between vertices typically aren't good enough to look nice, even with per-pixel lighting. IOW, if it weren't for the old nonlinear sRGB standard, the PlayStation would have looked really ugly. That's why no one really worried about the problem.

P.S. I hate to always go off-topic, but while I'm here (goofing off) I noticed that Beetle, playing the game I'm in the process of porting, looks a lot better with a basic point filter (nearest neighbor) than anything I can get on my workstation's chipset. The problem with a point filter is that it doesn't do well with standard mipmapping: all my tests with different settings in the samplers and shaders look pretty crappy under the minification filter, but Beetle looks good. I just wonder if you are aware of any technique it might be using? It may just be that the game has only one speed of movement, which is pretty fast, so it's always either stopped or moving fast, but it doesn't seem that way exactly to me. I can't think of anything except maybe changing how the mipmaps are generated. Sorry to ask; I just have an account here, and not elsewhere.
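
For what it's worth, one standard way to get a crisp point-sampled look up close while still letting mipmaps tame minification is to split the two filters; a sketch, assuming a GL 3.x context (not necessarily what Beetle does):

```c
#include <GL/gl.h>

/* Nearest-neighbor magnification keeps the chunky pixel look, while a
 * mipmapped minification filter avoids the shimmering "dirty" look when
 * textures shrink. GL lets you set the two filters independently. */
void set_point_filter_with_mipmaps(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glGenerateMipmap(GL_TEXTURE_2D); /* GL 3.0+ */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_NEAREST_MIPMAP_LINEAR);
}
```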

hizzlekizzle commented 5 years ago

I was referring (and I believe the force-disable option refers) to the render target formats for shaders: https://github.com/libretro/slang-shaders#8-bit

As for the mipmapping, I don't think we do anything special for that. Are you comparing against the GL renderer or Vulkan?

m-7761 commented 5 years ago

> As for the mipmapping, I don't think we do anything special for that. Are you comparing against the GL renderer or Vulkan?

GL. Sorry to continue this discussion; I hope you don't mind. BTW, just to spread ideas, I think Beetle would benefit, if possible, from an option to raise pixel value 0 to 1. This is because, on a lot of monitors, 0 is treated as pitch black, equal to the black mattes, and the difference between 0 and 1 is very stark and looks really bad. It does give the picture a faint glow, but that looks better than black voids. (Increasing brightness by 1 can also work, but that hits every pixel instead of only the problem pixels.)
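
Something like this, as a hypothetical post-processing step (not an existing core option):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical black-level lift: raise only full-black channel values
 * from 0 to 1 so displays that crush 0 to matte black don't show stark
 * voids. Every other pixel value is left untouched. */
void lift_black_level(uint8_t *rgba, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count * 4; i++) {
        if ((i & 3) == 3)
            continue;       /* leave alpha alone */
        if (rgba[i] == 0)
            rgba[i] = 1;
    }
}
```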

(I'm thinking about simplifying the code here https://github.com/castano/nvidia-texture-tools/blob/master/src/nvimage/Filter.cpp to produce mipmaps that are hopefully less smudgy. A point filter is an interesting look if a game's artwork suits it. I wouldn't be surprised if Beetle does something, because it looks really dirty if you do nothing.)
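
(For concreteness, the simplest such filter is a plain 2x2 box downsample; narrow kernels like this tend to smudge less than the wide kernels in Filter.cpp, at the cost of some aliasing. A sketch for a single 8-bit channel with even dimensions:)

```c
#include <stdint.h>

/* 2x2 box-filter downsample of one 8-bit channel: each destination
 * pixel is the rounded average of the four source pixels it covers. */
void downsample_2x2(const uint8_t *src, int w, int h, uint8_t *dst)
{
    for (int y = 0; y < h / 2; y++)
        for (int x = 0; x < w / 2; x++) {
            int sum = src[(2 * y) * w + 2 * x]
                    + src[(2 * y) * w + 2 * x + 1]
                    + src[(2 * y + 1) * w + 2 * x]
                    + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = (uint8_t)((sum + 2) / 4);
        }
}
```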

UPDATE: @hizzlekizzle I figured out the point filter. (My minification was set to a pseudo-filter called D3DTEXF_ANISOTROPIC, even without anisotropic filtering. My driver took it to mean: generate mipmaps / force mipmapping. It looked like the hardware was married to mipmaps!)