Do you mean that the plugin should apply color dithering to the 16-bit color buffer, which is copied from video memory to RDRAM?
for accuracy purposes, yes
Which matrix types does the original hardware use? Bayer matrix 3x3, 4x4 or 8x8? What does the ordered grid magic square pattern look like?
it's a 4x4 matrix for both, and the matrices are
static const uint8_t bayer_matrix[16] = { 0, 4, 1, 5, 4, 0, 5, 1, 3, 7, 2, 6, 7, 3, 6, 2 };
static const uint8_t magic_matrix[16] = { 0, 6, 1, 7, 4, 2, 5, 3, 3, 5, 2, 4, 7, 1, 6, 0 };
(taken from Angrylion RDP Plus)
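For reference, this is roughly how such a 4x4 matrix is looked up per pixel. It is only a minimal GLSL sketch, not code from Angrylion or GLideN64; the names are made up, a row-major layout is assumed, and it needs a GLSL version with array constructors and dynamic indexing (GL 3.3 / ES 3.0):

// hypothetical sketch: pick a cell of the 4x4 Bayer table from the pixel coordinates
const int bayerMatrix[16] = int[16](0, 4, 1, 5, 4, 0, 5, 1, 3, 7, 2, 6, 7, 3, 6, 2);

int bayerDitherValue(in mediump vec2 fragCoord) // pass gl_FragCoord.xy here
{
    // the low two bits of x and y select one of the 16 cells (row-major)
    int x = int(mod(fragCoord.x, 4.0));
    int y = int(mod(fragCoord.y, 4.0));
    return bayerMatrix[y * 4 + x];
}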
This looks good. Where is the right location to add dithering?
Add if( uColorDitherMode == 0) and if (uColorDitherMode == 1) here? https://github.com/gonetz/GLideN64/blob/0692abea4aa4ed33b7cd3e7def1f24560e513bcd/src/Graphics/OpenGLContext/GLSL/glsl_CombinerProgramBuilder.cpp#L701
Or here before color quantization? https://github.com/gonetz/GLideN64/blob/44ce554d5f9d22e49e60a74e39b73fbae991b733/src/BufferCopy/WriteToRDRAM.h
I implemented those in a shader, and my method requires it to be done before the quantization anyway. Here is example code I made which also does the 21-bit quantization (albeit without dithering this time), and it works in Unity just fine:
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag

    #include "UnityCG.cginc"

    struct appdata
    {
        float4 vertex : POSITION;
        float4 uv : TEXCOORD0;
    };

    struct v2f
    {
        float4 uv : TEXCOORD0;
        float4 vertex : SV_POSITION;
        float4 screenPosition : TEXCOORD1;
    };

    v2f vert (appdata v)
    {
        v2f o;
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.screenPosition = ComputeScreenPos(o.vertex);
        o.uv = v.uv;
        return o;
    }

    sampler2D _MainTex;
    sampler2D _DitherTexture;
    float4 _DitherTexture_TexelSize;
    float4 _MainTex_TexelSize;

    fixed4 frag (v2f i) : SV_Target
    {
        float2 screenPos = i.uv.xy / i.uv.w;
        float2 ditherCoordinate = screenPos * _MainTex_TexelSize.zw * _DitherTexture_TexelSize.xy;
        float gamma = 2.2;
        float orig_bpp = 32; // original bpp of image in RGBA
        float bpp = 15;      // target bpp in RGBA
        float bpc = bpp / 4; // this makes it bits per channel (RGBA)
        float dither = (tex2D(_DitherTexture, ditherCoordinate).r * (28 / bpc)) - ((orig_bpp * 16) / (orig_bpp * 32));
        dither = pow(dither, gamma);
        float dither2 = (tex2D(_DitherTexture, ditherCoordinate).g * (28 / bpc)) - ((orig_bpp * 16) / (orig_bpp * 32));
        dither2 = pow(dither2, gamma);
        float4 tex = tex2D(_MainTex, i.uv);
        float4 col = tex;
        col = round((col + clamp(dither2 / (bpp / 2), -1, 1)) * (bpc * 32)) / (bpc * 32);
        col = round(pow(col, gamma / gamma) * (7 * 32)) / (7 * 32);
        return col;
    }
    ENDCG
}
I'm not sure how accurate this is to actual N64 dither, but it uses the Bayer matrix values directly set as color values in a 4x4 texture (a 4 in the matrix remains a 4 in the R channel)
well, it's only emulating one dithering mode; different dither modes use a different mix of matrices and coordinate calculations
although, about the second round of col, which is the 21-bit quantization: you can just get rid of that line and replace it with col = pow(col, gamma / gamma);
Where is the right location to add dithering?
The plugin renders (or blits) the original hi-res image into a small one, which has the size of the frame buffer in RDRAM. Then this smaller image is copied from VRAM to RDRAM with color quantization. I see two options for adding the color dithering step:
@ToruTheRedFox Thanks for the code example. TBH, I never learned how color dithering works. Maybe your code is suitable for our needs. However, I don't understand some magic in your code:
pow(col, gamma / gamma)
Is not gamma / gamma == 1, and pow(col, gamma / gamma) == col?
bpc = bpp / 4
Is it correct? Target color has 5 bits for RGB and 1 bit for alpha.
28 / (bpc)
What is 28?
the code is kinda messy I know. The 28 I can't remember, and gamma / gamma was me being dumb. bpp is done in RGBA because the display usually doesn't need the alpha channel, which made me overlook accuracy on the alpha part. I just messed with values at the end so it doesn't lighten the image, which is why some of the numbers are seemingly random
documented it a little better and removed the pointless gamma/gamma lol
edit: hold on I messed up lol
fixed4 frag (v2f i) : SV_Target
{
    float2 screenPos = i.uv.xy / i.uv.w;
    float2 ditherCoordinate = screenPos * _MainTex_TexelSize.zw * _DitherTexture_TexelSize.xy;
    float gamma = 2.2;
    float orig_bpp = 24; // original bpp of image in RGB
    float bpp = 15;      // target bpp in RGB
    float bpc = bpp / 4; // this makes it bits per channel-ish
    float dither = (tex2D(_DitherTexture, ditherCoordinate).r * (28 / bpc)) - ((orig_bpp * 12) / (orig_bpp * 24));
    dither = pow(dither, gamma);
    float dither2 = (tex2D(_DitherTexture, ditherCoordinate).g * (28 / bpc)) - ((orig_bpp * 12) / (orig_bpp * 24));
    dither2 = pow(dither2, gamma);
    float4 tex = tex2D(_MainTex, i.uv);
    float4 col = tex;
    col = round((col + clamp(dither2 / (bpp / 2), -1, 1)) * (bpc * 32)) / (bpc * 32);
    col = round(col * (7 * 32)) / (7 * 32);
    col.w = round(col.w);
    return col;
}
ENDCG
}
@gonetz I have tested another example which can be ported to C++: https://gist.github.com/gizmo98/0097b55c96e93f84cfca84d2b4d75977 https://gist.github.com/gizmo98/e64064d41ae2344487b4907f6cf36ed5
I have used a 4x4 Bayer matrix. The values can be replaced with the values Angrylion is using.
Description: https://en.wikipedia.org/wiki/Ordered_dithering
In the picture posted here https://github.com/gonetz/GLideN64/issues/2178 color banding is quite obvious (grass). Angrylion is using dithering. https://user-images.githubusercontent.com/15708278/74653796-d0dc8c80-5180-11ea-9b85-127b3e521490.png
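For anyone following along, the core of the ordered dithering described on that Wikipedia page, combined with the 5-bit quantization done on copy to RDRAM, boils down to roughly this. It is a generic GLSL sketch, not the code from the gists above; it assumes a 16-level 4x4 matrix (with Angrylion's 8-level tables the divisor would be 8.0 instead of 16.0):

// generic sketch: ordered dithering before quantizing each channel to 5 bits
// ditherCell is the value taken from the 4x4 matrix for this pixel (0..15 assumed)
lowp vec3 ditherAndQuantize(in lowp vec3 color, in lowp float ditherCell)
{
    // map the matrix cell to a threshold centered around zero
    mediump float threshold = (ditherCell + 0.5) / 16.0 - 0.5;
    // spread the dither over roughly one 5-bit quantization step (1/32)
    lowp vec3 dithered = clamp(color + threshold / 32.0, 0.0, 1.0);
    // quantize each channel to 32 levels, as in the RGBA5551 frame buffer
    return floor(dithered * 31.0 + 0.5) / 31.0;
}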
If I may comment: totally necessary for native resolution and it should be the default, but it should be optional in the case of high resolution textures
@weinerschnitzel This is for the framebuffer effects only
Original
RGB555
Angrylion Bayer matrix
Angrylion Magic Square
Bayer Matrix 4x4 {0,8,2,10}, {12,4,14,6}, {3,11,1,9}, {15,7,13,5}
Real nasik Magic Square {0,14,3,13}, {11,5,8,6}, {12,2,15,1}, {7,9,4,10}
@ToruTheRedFox Could you try out this branch? https://github.com/gonetz/GLideN64/compare/master...gizmo98:dithering?expand=1
MK64: Jumbotron
Castlevania: Data selection screen. Magnifying glass looks different
Conker BFD: Pause screen looks different
Zelda MM: Intro full screen effects look different
'Real nasik Magic Square' looks the best IMO.
@oddMLan to prevent too many artifacts, a higher-order matrix could be used. Authentic or unobtrusive dithering?
Bayer Matrix 8x8
Bayer Matrix 16x16
imo there should be options for which matrices are used, if any: for those who for some reason like color banding, those who want accurate visuals with a good framerate (CPU too slow for Angrylion RDP), and those who want better dithering
I made some screenshots master vs my dithering branch:
the color banding looks absolutely horrendous in a lot of places
I see why nearly every early 3D console used it
Like Beetle-PSX-HW, I think there should be two dithering options: native (accurate) and matched with the output resolution (enhanced). I think it should be the same with the noise generator, but that's another subject.
@gonetz Do you know when dithering should be used for the color buffer to RDRAM copy? Should I check gDP.otherMode.colorDither and add Bayer, MagicSquare or Noise accordingly?
I have also found this test ROM which can be used to test dither modes: http://mattpierce.info/n64-dither
Do you know when dithering should be used for the color buffer to RDRAM copy?
No, but I suppose that you're right and dithering is controlled by gDP.otherMode.colorDither
Should I check gDP.otherMode.colorDither and add Bayer, MagicSquare or Noise accordingly?
Noise dithering is already implemented in shaders, so there is no need to add more noise on copy to RDRAM. So, either Bayer or MagicSquare should be used.
@gonetz is noise applied to the display resolution color buffer or to the shrunk version for RDRAM? Noise can also be used to prevent color banding a little bit.
well, it's mainly to make the RDRAM framebuffer more accurate, but it could possibly also be used for the main buffer drawn onscreen when the "Render N64 frame buffer to output" option is disabled
is noise applied to the display resolution color buffer or to the shrunk version for RDRAM?
It is applied to the display resolution color buffer. You may see it in the Zelda MM intro - backgrounds have some random noise.
If the picture is shrunk to native res, noise is certainly no longer visible and cannot hide color banding.
I have added shaders for Magic Square and Bayer dithering. I have not done quantization because we would really lose 3 bits of precision: https://github.com/gizmo98/GLideN64/blob/dithering2/src/Graphics/OpenGLContext/GLSL/glsl_CombinerProgramBuilder.cpp#L701
If the picture is shrunk to native res, noise is certainly no longer visible and cannot hide color banding.
It is not so. Noise dithering is scaled with resolution. Here is how it looks in native res:
I have added shaders for Magic Square and Bayer dithering.
The results look cool, thanks! Dithering adds an authentic look to native-res mode. What is good, it is not noticeable at resolutions above native. However, I personally would prefer to see the image without a dithering pattern. So, color dithering should be optional, imo. Currently it is controlled by config.generalEmulation.enableNoise. It is not right. Noise dithering is a kind of special effect, and I prefer to have it enabled. Other dithering modes are for people who prefer an authentic-looking image. So, I suggest adding a new setting: enableDithering with possible values:
- none
- only noise (default)
- only Magic Square and Bayer
- all
When enableDithering is 'none' or 'only noise', color dithering should be performed when the image is copied to RDRAM. That is, the image in RDRAM should always be dithered if gDP.otherMode.colorDither is G_CD_MAGICSQ or G_CD_BAYER.
If there is a dither shader for G_CD_MAGICSQ and G_CD_BAYER, the image in RDRAM should not be dithered if nativeResFactor = 1. But if the displayed res is higher and the dithered image is resized to the lower RDRAM fb resolution, the dithering gets lost and color banding is back again:
Since all of you are improving things related to noise, can anyone take a look at Silicon Valley?
That game uses many noise effects, and some are currently broken (screens).
So, I suggest adding a new setting: enableDithering with possible values:
- none
- only noise (default)
- only Magic Square and Bayer
- all
It is not so. Noise dithering is scaled with resolution.
If you're grouping them together, it's kinda confusing to have one kind of dithering (noise) "scale up" (pixels becoming bigger) with screen resolution while the other kind of dithering remains high-def (pixels remain small), isn't it? I believe there should at least be an option to make noise match the output resolution as well.
Current Master vs Dithering3 branch:
I also modified noise dithering a little bit so the test ROM looks more like the reference pictures. It works like Bayer and Magic Square dithering now.
void colorNoiseDither(in lowp float _noise, inout lowp vec3 _color)
{
    mediump float threshold = 0.03125 * (_noise - 0.5);
    _color = clamp(_color + threshold, 0.0, 1.0);
}
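For what it's worth, 0.03125 is 1/32, i.e. one step of a 5-bit channel, so this shifts each channel by at most half a quantization step in either direction. A hypothetical call site (fragColor and noiseValue are placeholder names, not GLideN64 variables):

// hypothetical usage: noiseValue is a per-pixel pseudo-random value in [0.0, 1.0]
lowp vec3 ditheredColor = fragColor.rgb;
colorNoiseDither(noiseValue, ditheredColor);
fragColor.rgb = ditheredColor;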
And the worst looking game is ...
Wow, noise dithering looks so much more faithful to the real console in those screenshots.
@gizmo98 Can you compare the Super Mario 64 invisible cap effect across GLideN64 master, the Dithering3 branch and Angrylion (VI filters disabled) please?
Another good example would be the Paper Mario prologue when you meet the Star Spirits at the beginning of the game, the text has noise dithering.
@oddMLan Do you mean this text? (Last picture shows alpha dithering with bayer or magic square?!)
If there is a dither shader for G_CD_MAGICSQ and G_CD_BAYER image in RDRAM should not be dithered if nativeResFector = 1. But if displayed res is higher and dithered image will be resized to lower RDRAM fb resolution, dithering gets lost and color banding is back again
True.
Wow, noise dithering looks so much more faithful to the real console in those screenshots.
An improvement indeed. alphaNoiseDither should be changed that way too.
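A minimal sketch of what the analogous change could look like, assuming alphaNoiseDither keeps the same kind of signature as colorNoiseDither above (this is not the actual patch):

// sketch only: apply the same centered threshold to the alpha channel
void alphaNoiseDither(in lowp float _noise, inout lowp float _alpha)
{
    mediump float threshold = 0.03125 * (_noise - 0.5);
    _alpha = clamp(_alpha + threshold, 0.0, 1.0);
}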
If you're grouping them together, it's kinda confusing to have one kind of dithering (noise) "scale up" (pixels becoming bigger) with screen resolution while the other kind of dithering remains high-def (pixels remain small), isn't it?
Actually, no. Noise dithering looks the same at every resolution. All dithering modes look consistent in native res. Other kinds of dithering are not visible at higher resolutions, so no confusion here.
I believe there should at least be an option to make noise match the output resolution as well.
Noise dithering is a kind of special effect: the Super Mario 64 invisible cap, dust in Zelda OOT, the text in the Paper Mario prologue. All these effects look incorrect with noise matching the output resolution. You may check them with early versions of GLideN64, which used non-scaled noise. I remember complaints about low-res looking noise, but I don't understand them.
@gizmo98 Your work looks great, thanks! I think the only thing left to do is adding the enableDithering option.
Some effects cannot be seen without alpha dithering (no dithering vs dithering):
What's the right color dithering (same noise value for RGB vs different noise values for R, G and B)? I prefer the look of the second one.
And finally I added a noise function which doubles the native noise resolution but does not use the output resolution. It does not look low res but is still rough.
What's the right color dithering (same noise value for RGB vs different noise values for R, G and B)? I prefer the look of the second one.
I too prefer the second one (it looks more organic), but what's supposedly more accurate to the N64? How does AL create dithering noise?
And finally I added a noise function which doubles the native noise resolution but does not use the output resolution. It does not look low res but is still rough.
Yes, I like that. Maybe can be put in a checkbox: "High resolution noise".
Some effects cannot be seen without alpha dithering (no dithering vs dithering):
I'm not sure that this is an intended effect, it looks weird. Maybe it looks like this because noise is missing? It looks different with AL.
What's the right color dithering (same noise value for RGB vs different noise values for R, G and B)? I prefer the look of the second one.
If I correctly understand AL RDP sources, it depends on other_modes.rgb_dither_sel:
if (other_modes.rgb_dither_sel != 2)
    rcomp = gcomp = bcomp = dith;
else
{
    rcomp = dith & 7;
    gcomp = (dith >> 3) & 7;
    bcomp = (dith >> 6) & 7;
}
see rgb_dither and get_dither_noise functions.
Some effects cannot be seen without alpha dithering (no dithering vs dithering):
I'm not sure that this is an intended effect, it looks weird. Maybe it looks like this because noise is missing? It looks different with AL.
Is there no effect with AL, or a different one? Because it looks like a text limitation and a planet?!
What's the right color dithering (same noise value for RGB vs different noise values for R, G and B)? I prefer the look of the second one.
If I correctly understand AL RDP sources, it depends on other_modes.rgb_dither_sel:
if (other_modes.rgb_dither_sel != 2) rcomp = gcomp = bcomp = dith; else { rcomp = dith & 7; gcomp = (dith >> 3) & 7; bcomp = (dith >> 6) & 7; }
see rgb_dither and get_dither_noise functions.
Looks like r, g, b get different noise values if noise (value 2) is selected. https://github.com/gonetz/GLideN64/blob/04736b44c5b4299365e9bb331368ee1ac1a93ab7/src/GBI.h#L346
Is there no effect with AL, or a different one? Because it looks like a text limitation and a planet?!
AL with VI filtering off:
Looks like r, g, b get different noise values if noise (value 2) is selected.
Yes. One note: the AL code does not call the irand function 3 times for r, g, b. irand returns a 16-bit pseudo-random value. Only the 3 least significant bits of the 8-bit color components are randomized, so only 9 random bits are needed. Thus, one irand call is enough.
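In shader terms that could look roughly like this sketch (not GLideN64 code; it assumes the 16-bit pseudo-random value is available as an integer and a GLSL version with integer bit operations):

// sketch: split one pseudo-random value into three 3-bit dither components,
// mirroring the rcomp/gcomp/bcomp case of rgb_dither_sel == 2 in AL
lowp vec3 noiseDitherComponents(in highp int randomValue)
{
    lowp float r = float(randomValue & 7);        // bits 0..2
    lowp float g = float((randomValue >> 3) & 7); // bits 3..5
    lowp float b = float((randomValue >> 6) & 7); // bits 6..8
    // one unit corresponds to one step of an 8-bit channel
    return vec3(r, g, b) / 255.0;
}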
Test build: https://drive.google.com/file/d/1S97j1zYHfDb8iZ1La4NGVF39y60vXteI/view?usp=sharing
It is for mupen64plus 32-bit only. GUI control is not implemented yet.
@ToruTheRedFox Could you test the build?
Screenshots branch dithering ditherMode = 4 + FXAA:
Test build for Project64 32bit: https://drive.google.com/file/d/14sYmdFwLJ4mac7PyZg-6PhWHGxWl8BfR/view?usp=sharing
I did a quick test. Defaults, Native Res, Render N64 frame buffer to output. As you can see, the dithering hides the banding from the 16-bit conversion quite nicely.
Noise and ordered grid dithering, Native Res
I'm a little confused: since the buffer is already 32-bit, I thought there would be no dithering, because what benefit does it give other than "simulating" the dithered look? There is no banding to hide.
Then I found out that despite setting the dithering mode to "Disabled", anything that displays the framebuffer will always show it dithered (either through a game framebuffer effect, like the reset animation, or using "Render N64 frame buffer to output"). I assume this is intended, since @gonetz stated earlier
When enableDithering is 'none' or 'only noise', color dithering should be performed when the image is copied to RDRAM. That is, the image in RDRAM should always be dithered if gDP.otherMode.colorDither is G_CD_MAGICSQ or G_CD_BAYER.
So let me understand this: the default settings already add dithering only when it's absolutely necessary to avoid banding artifacts due to the 16-bit color conversion that happens with framebuffer effects? And any other selection under Dithering mode would be "forcing" the dithering in 32-bit color mode, despite it not being necessary? I think the naming should be a little clearer then, perhaps adding a "Force" prefix.
Also, the "Enable Noise" setting under the Emulation tab seems kind of redundant and should be removed from the UI. OR it could be substituted by a "hi-res" noise setting (with 2x being the upper limit, since @gizmo98 said it still looks hi-res without looking wrong).
I now tested using this test ROM.
Native Res, Render N64 frame buffer to output
Dithering > "Disable" doesn't actually disable the dithering in the framebuffer, even when the test ROM requests it. However, it does disable noise dithering. It seems to always force Bayer dithering.
Dithering > "Only noise dithering (default)". Enables noise dithering. Like above, it seems to force Bayer dithering.
Dithering > "noise dithering with 5bit quantization". Noise looks a bit noisier. Otherwise forces Bayer dithering, even when requesting "Disabled" in the test ROM.
Dithering > "noise and ordered grid dithering". Noise looks more homogeneous. Dithering follows what's currently selected in the test ROM (Bayer or Magic Square). Selecting "Disabled" in the test ROM actually disables the dithering, as it should.
Dithering > "dithering with 5bit quantization". Noise still works (it shouldn't, judging by the name). Dithering follows what's currently selected in the test ROM (Bayer or Magic Square) and the colors are darker. Selecting "Disabled" in the test ROM actually disables the dithering, as it should.
The settings are extremely confusing as they currently are. To me, there should be only the following settings controlling the 16-bit frame buffer:
EDIT: I now realize some games like Mario Kart 64 for some reason don't enable dithering in the RDP; it must be because the other VI filters seem to take care of the banding. Then my proposal could be like this, to give more granularity to the user:
16-bit dithering mode:
Checkboxes:
☑ Enable noise
☑ Enable quantization
☑ Force dithering pattern in 32-bit output
After some meditation, this is the best I could come up with. Conflating dithering & noise actually makes things more complicated.
My theses and explanations:
Regarding options, I'd make them as follows:
Thus, one option removed, two added.
Dithering is honestly necessary to produce a correct (and actually somewhat good-looking) N64 framebuffer; as it stands, the framebuffer is quantized with no dithering, which deviates from hardware, which does in fact have dithering