NeccoWafer opened 5 years ago
This is using the latest commit 8363a02.
well, thanks for the capture thing
this is going to be a typical issue with no perfect solution as far as upscaling is concerned.
basically, viewport transform on the DS transforms clip coordinates (20:12 fixed-point, -1.0..1.0) into screen coordinates, which range from 0..255/0..191 and are integer. the rasterizer runs with zero subpixel precision. while your typical GPU computes colors/texcoords for positions that are about +0.5 inside each pixel, the DS computes them for the exact pixel position. which, in itself, already causes some issues with texture alignment. a prime example would be the wavy intro logo in New Touch Party Game: note that every OpenGL renderer fails at rendering it.
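a rough sketch of that transform, assuming a full-screen viewport (the function name and the exact rounding are my guesses for illustration, not verified hardware behaviour):

```python
# hypothetical sketch of the DS-style transform described above: 20.12
# fixed-point clip coordinates in, integer screen coordinates out, with
# zero subpixel precision and no +0.5 pixel-center offset.

FRAC_BITS = 12
ONE = 1 << FRAC_BITS  # 1.0 in 20.12 fixed-point

def ds_viewport_transform(x_clip, y_clip, w_clip, width=256, height=192):
    # perspective divide in fixed-point: result lies in -ONE..ONE (-1.0..1.0)
    x_ndc = (x_clip * ONE) // w_clip
    y_ndc = (y_clip * ONE) // w_clip
    # map -1..1 to 0..width-1 / 0..height-1 and truncate: the fractional
    # position inside the pixel is simply thrown away
    sx = ((x_ndc + ONE) * (width - 1)) // (2 * ONE)
    sy = ((ONE - y_ndc) * (height - 1)) // (2 * ONE)
    return sx, sy
```

note how everything that lands between two integer positions collapses onto the same pixel: that's the zero-subpixel-precision part.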
upscaling is another layer of issues for which there's no perfect solution. basically, two ways to get upscaled screen coordinates:
1. take original screen coordinates and multiply them by the scaling factor. this is the least likely to cause alignment issues, however it will result in more jittery graphics due to the limited coordinate range, which becomes more apparent the higher you crank the resolution.
2. calculate separate screen coordinates with more precision. this is what melonDS does. this covers the jitter problem well enough, but may cause problems on its own depending on the positions the game inputs. it's not rare that things on the DS work by sheer luck, or by some developer going "just keep fuzzing it until it fits". which means that by attempting to exploit precision the screen coords weren't intended to have, we may run into all sorts of alignment issues.
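a toy illustration of the two options (the names and the 4x figure are mine, not actual melonDS code):

```python
SCALE = 4  # e.g. 4x internal resolution

def upscale_coords_method1(sx, sy):
    # option 1: scale the final integer screen coordinates. alignment-safe,
    # but every vertex still snaps to the original 256x192 grid, so motion
    # stays jittery -- more visibly so the higher SCALE gets.
    return sx * SCALE, sy * SCALE

def upscale_coords_method2(x_ndc, y_ndc, width=256, height=192):
    # option 2 (what melonDS does): redo the viewport transform at the
    # higher resolution, keeping precision the DS would have discarded.
    # smooth, but geometry the game only lined up by luck can drift apart.
    sx = (x_ndc + 1.0) * 0.5 * (width * SCALE - 1)
    sy = (1.0 - y_ndc) * 0.5 * (height * SCALE - 1)
    return sx, sy
```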
in the future melonDS's GL renderer might make more clever choices (for example, preferring method 1 when rendering 2D elements like UI elements, and applying a +0.5 offset so textures are aligned correctly...)
also, worth noting that DS games might have their own polygon alignment issues, but the way the DS rasterizer fills polygons tends to make up for it (esp if you turn on antialiasing). for example, SM64DS's course models have gaps in their geometry, and those are more noticeable without antialiasing. running these games in a different rasterizer (a PC GPU) may make those issues more apparent, and rendering at higher resolutions will only exacerbate them.
anyway, have you tried running the GL renderer at 1xIR? see if the problem still happens.
Thanks for the detailed explanation. Yes the problem still occurs at 1xIR, I’ll post some more info at the native resolution later.
Okay so I checked it out using 1xIR, and it actually looks a bit worse than at 8xIR.
edgemarking is exacerbating the issue. heh
but, at 1x it uses the original (non-hires) polygon coords, so I'd have to understand how shit works and why it's not lining up correctly
and I can't load that file because you've been using some cutting-edge version of RenderDoc
Oh, sorry about that; I was using the nightly build. I'll post another one later today.
I used RenderDoc 1.4 for this one. capture.zip
thanks
I'm sure I'm late to this, but I've also confirmed this to occur on Pokemon Platinum & Pokemon Soulsilver. Was running MelonDS 0.8.2 with a GTX 1060 3GB, driver 23.21.13.9101. Seems to be a Gen IV issue specifically.
I gave it a quick look, and it seems to stem from polygon coordinates that are off by one. like, on hardware, they can be off by one and still result in the same polygon, due to how the hardware calculates edge slopes.
I just noticed this problem (which is obvious on the first run) and based on Arisotura's detailed description of the problem I can add a few remarks:
The problem may be fixed by just enabling conservative rasterization. However, if the NDS does actually follow the top-left rule, then conservative raster will cause double shading in adjacent triangles (this would mostly cause artifacts with transparent tris, and possibly some Z-fighting-like artifacts). Note conservative raster is only supported on GeForce 9xx and newer, and AMD Vega GPUs and newer. It's not a widely supported feature.
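For reference, the top-left rule works roughly like this (a sketch of the standard GPU convention, not of whatever the DS actually does):

```python
def edge(ax, ay, bx, by, px, py):
    # signed area test: >= 0 when p is on the interior side of edge a->b
    # (for triangles wound so that all three edge values are positive inside)
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def is_top_left(ax, ay, bx, by):
    # with y growing downward and the winding used below, a "top" edge is
    # horizontal and points right, and a "left" edge points up
    return (ay == by and bx > ax) or (by < ay)

def covers(tri, px, py):
    # a pixel lying exactly on an edge belongs to the triangle only if that
    # edge is a top or left edge, so two triangles sharing an edge shade
    # each pixel exactly once -- the guarantee that naive conservative
    # rasterization would break (double shading on the shared seam)
    (x0, y0), (x1, y1), (x2, y2) = tri
    for ax, ay, bx, by in ((x0, y0, x1, y1), (x1, y1, x2, y2), (x2, y2, x0, y0)):
        w = edge(ax, ay, bx, by, px, py)
        if w < 0 or (w == 0 and not is_top_left(ax, ay, bx, by)):
            return False
    return True
```

Running two triangles that share a diagonal through `covers` shows every pixel on the shared edge getting exactly one owner.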
I stumbled upon conservative rasterisation a few days ago; in fact, none other than Nvidia describes a workaround for GPUs which don't support it natively.
The ultimate solution is to use a compute shader rasterizer instead of the current VS/PS + HW rasterizer combo. From what I can see, melonDS barely uses the vertex shader, as it is mostly a passthrough for data that has already been processed on the CPU. Compute shader rasterizers are usually slower; however, considering the limited power of the NDS and how powerful desktop GPUs are, a CS-based rasterizer would run fast enough while allowing pixel-exact accuracy, and would also enable higher resolutions (with tradeoffs, because as Arisotura said, at higher resolutions you either get jitter or potential holes in some games).
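The core of such a rasterizer is small. A single-threaded sketch of what each compute workgroup would effectively do (plain Python standing in for GLSL; all names are mine):

```python
def rasterize(tri, width=256, height=192):
    # naive bounding-box rasterizer: evaluate three edge functions at every
    # pixel in the triangle's bounding box. a compute shader would run this
    # per tile in parallel, with full control over the fill rule and the
    # fixed-point precision -- control the HW rasterizer never gives you.
    (x0, y0), (x1, y1), (x2, y2) = tri
    min_x, max_x = max(min(x0, x1, x2), 0), min(max(x0, x1, x2), width - 1)
    min_y, max_y = max(min(y0, y1, y2), 0), min(max(y0, y1, y2), height - 1)
    covered = []
    for py in range(min_y, max_y + 1):
        for px in range(min_x, max_x + 1):
            w0 = (x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)
            w1 = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
            w2 = (x0 - x2) * (py - y2) - (y0 - y2) * (px - x2)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                covered.append((px, py))
    return covered
```

The point being: with the fill rule in software, you can replicate the DS's exact edge-slope behaviour instead of approximating it.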
If I recall correctly, Arisotura once said she tried that in the beginning, though I can't remember why she didn't go with this approach.
What also speaks against this approach is that it would severely limit the supported set of graphics cards. Currently we only require OGL 3 with some widely available extensions; in contrast, compute shaders would require at least OGL 4.3 (or ARB_compute_shader) and a graphics card with enough brute force.
> I stumbled upon conservative rasterisation a few days ago; in fact, none other than Nvidia describes a workaround for GPUs which don't support it natively.
The workaround is very tricky to get right (too many details). For example, attribute interpolation (e.g. UVs) needs adjusting, since the vertex positions have been changed.
Also, the vertex shader needs to pass the data from all 3 vertices so that the pixel shader gets access to barycentrics and can alpha-test the edges; or use a geometry shader, which normally involves a major slowdown (though I doubt that would be an issue, considering the NDS's low triangle throughput).
It's not something I'd recommend because it isn't 3 lines of code to change, it requires a lot of work to get right and even then it may not look correct.
The workaround is already hard to implement properly assuming you are in full control of the data (which an emulator isn't)
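To make the barycentrics point concrete, the discard step of that workaround might look like this in sketch form (hypothetical names; the real thing would live in the pixel shader):

```python
def barycentric(tri, px, py):
    # barycentric coordinates of (px, py) w.r.t. the ORIGINAL, un-dilated
    # triangle -- this is why the pixel shader needs all three vertex
    # positions passed through, not just the interpolated attributes
    (x0, y0), (x1, y1), (x2, y2) = tri
    area = (x1 - x0) * (y2 - y0) - (y1 - y0) * (x2 - x0)
    w0 = ((x1 - px) * (y2 - py) - (y1 - py) * (x2 - px)) / area
    w1 = ((x2 - px) * (y0 - py) - (y2 - py) * (x0 - px)) / area
    return w0, w1, 1.0 - w0 - w1

def keep_fragment(tri, px, py):
    # "alpha test the edges": discard fragments that only exist because the
    # triangle was dilated, i.e. that fall outside the true triangle
    return all(w >= 0.0 for w in barycentric(tri, px, py))
```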
> What also speaks against this approach is that it would severely limit the supported set of graphics cards. Currently we only require OGL 3 with some widely available extensions; in contrast, compute shaders would require at least OGL 4.3 (or ARB_compute_shader) and a graphics card with enough brute force.
I'd argue any GPU manufactured since 2010 has support for it. Any DX11-level HW can do it. GpuInfo lists 69% coverage.
macOS does not support compute shaders via OpenGL, though, so 3D acceleration on Mac would be out unless Metal is used (which wouldn't be too hard... melonDS barely uses 3D APIs to render; there's not much complexity).
But I admit, if mobile enters the fray, then it's an entirely different landscape (though HW that cannot do compute cannot run melonDS at decent speed anyway...).
Regarding performance, I wouldn't worry too much, as even current CPUs can run a SW rasterizer at full speed using just one core for rasterizing. Beating The GPU At Its Own Game lists a 2x slowdown for depth-buffer-only rendering at 18k triangles, and the gap shrinks as the triangle count grows. Note: a colour rasterizer is expected to be slower due to order dependence. Rasterizer code.
If the NDS were more powerful I'd be worried, but given the host HW and the system's oddities being emulated, I think a Compute Shader-based rasterizer makes the most sense for this case.
Not sure if this is related, but something similar also happens in Black and HeartGold:
Switching to the software render then switching back solves the issue though:
When the game is booted with the software renderer there are no artifacts.
Curiously, it doesn't happen if the OpenGL renderer is set to the native resolution. It also appears that switching the OpenGL renderer to native and then back to x16 fixes the issue.
all gen 4 Pokemon games have this problem, as they run using the same engine
The artifacts I've been experiencing are worse in Black, imo. But that's probably due to it having more 3D models.
Seems to also present itself in Ace Attorney Investigations. In Turnabout Airlines with OpenGL, when Gumshoe or Edgeworth wave their hands on the upper floor of the aircraft after it lands, a random black line appears below their arm.
Any fix for this in the future or some settings to mitigate it?
as far as settings are concerned, switching to software renderer fixes it.
still happening on opengl
Any Updates?
What about a hacky fix, where you just fill those holes based on neighboring pixels?
> What about a hacky fix, where you just fill those holes based on neighboring pixels?
I suppose that might be possible, using the Stencil buffer to tag pixels that have been touched; and then a postprocessing step applies a fill filter.
However, such a hack could break in any number of ways, especially when a game deliberately wants to produce gaps or leave pixels untouched.
And that is assuming the black pixels are pixels that have never been touched, which may not actually be true.
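As a sketch of what that fill pass could look like (assuming untouched pixels really can be tagged, e.g. via the stencil buffer; everything here is hypothetical):

```python
def fill_untouched(color, touched):
    # replace each untouched pixel with the average of its touched
    # 4-neighbours; pixels deep inside a deliberately untouched region
    # have no touched neighbours and are left alone
    h, w = len(color), len(color[0])
    out = [row[:] for row in color]
    for y in range(h):
        for x in range(w):
            if touched[y][x]:
                continue
            near = [color[ny][nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w and touched[ny][nx]]
            if near:
                out[y][x] = sum(near) // len(near)
    return out
```

Even then, this only patches one-pixel seams; wider gaps, or games that legitimately leave pixels black, would still look wrong.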
I hope this issue gets fixed at some point. It's basically the fastest way to tell someone's playing their Pokémon game on an emulator.
Random black lines using OpenGL:
(This is not an issue using Software 3D Renderer.)
GPU & Driver:
AMD Radeon VII, Adrenalin 2019 Edition 19.6.2 (Windows 10)
RenderDoc of a single frame:
(Not sure if this helps.) capture.zip