romulasry opened 4 years ago
AI upscaling is typically slower than realtime. The NNEDI3 shader, however, shows that neural-net-derived weights can be baked into a shader that does run in realtime.
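As a rough illustration of that idea (not the actual NNEDI3 implementation), here is a minimal numpy sketch: interpolated pixels are predicted from a small neighbourhood by a tiny network with fixed, precomputed weights. The weights and layer sizes are random placeholders, not the real trained NNEDI3 weights.

```python
# Sketch of the NNEDI3 idea: new pixels are predicted from a local window by
# a small network whose weights are trained ahead of time and then fixed.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(16, 8))  # placeholder layer 1: 4x4 window -> 8 units
W2 = rng.normal(size=(8,))     # placeholder layer 2: 8 units -> 1 pixel

def predict_pixel(window_4x4: np.ndarray) -> float:
    """Predict one interpolated pixel from a 4x4 neighbourhood."""
    h = np.tanh(window_4x4.reshape(16) @ W1)  # fixed, precomputed weights
    return float(h @ W2)

def double_height(img: np.ndarray) -> np.ndarray:
    """Insert a predicted row between every pair of source rows."""
    h, w = img.shape
    out = np.zeros((h * 2, w), dtype=float)
    out[0::2] = img
    padded = np.pad(img, ((1, 2), (1, 2)), mode="edge")
    for y in range(h):
        for x in range(w):
            out[y * 2 + 1, x] = predict_pixel(padded[y:y + 4, x:x + 4])
    return out
```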
How about running it ahead of time and caching the results before the game runs?
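For illustration, a hypothetical offline pass along those lines: a slow AI model upscales every dumped texture once into a cache, and later runs only load the cached files. The `upscale()` stand-in, directory layout, and file naming are all made up here; this is not an existing RetroArch feature.

```python
# Hypothetical pre-upscale-and-cache pass over a directory of dumped textures.
from pathlib import Path
from PIL import Image

def upscale(img: Image.Image, factor: int = 4) -> Image.Image:
    # Placeholder: a real pipeline would invoke a neural upscaler here.
    return img.resize((img.width * factor, img.height * factor), Image.LANCZOS)

def build_cache(dump_dir: str, cache_dir: str) -> None:
    out_root = Path(cache_dir)
    for src in Path(dump_dir).rglob("*.png"):
        dst = out_root / src.relative_to(dump_dir)
        if dst.exists():  # already cached by an earlier run
            continue
        dst.parent.mkdir(parents=True, exist_ok=True)
        upscale(Image.open(src)).save(dst)

build_cache("dumps/some-game", "cache/some-game")
```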
That only works in cores that can dump game textures, and each console and core dumps them with its own quirks and limitations. There is no standard that would recognize a 'texture' across all platforms, and there isn't even a need for one, since curated projects that are not automatic will always be better.
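To make that concrete: replacement packs commonly identify a texture by hashing its raw pixel data, roughly like the hypothetical sketch below, but the hash choice, file naming, and pixel formats differ per emulator/core, which is why no single scheme works everywhere.

```python
# Hypothetical hash-based replacement lookup of the kind individual emulators
# implement themselves; the sha1 choice and ".png" naming are assumptions.
import hashlib
from pathlib import Path

def texture_key(raw_pixels: bytes) -> str:
    return hashlib.sha1(raw_pixels).hexdigest()

def find_replacement(raw_pixels: bytes, pack_dir: str) -> Path | None:
    candidate = Path(pack_dir) / f"{texture_key(raw_pixels)}.png"
    return candidate if candidate.exists() else None
```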
Just go to the pages where you can find texture-replacement projects for particular emulators/cores (like the recent Resident Evil 2 and 3 project). There is no possible general solution here that isn't realtime, realtime is impossible right now (and probably forever, the way the world is going), and it would be inferior anyway for various reasons.
If it were standardized, then there wouldn't be a problem. Good point, though.
That's the point: it can't be standardized, because the input differs too much from console to console (and sometimes from game to game on certain older consoles). Other runtime filters can be standardized because they only have to operate on the final image that goes to the screen. AI upscaling is too slow to run there, so it needs to work on dumped textures instead.
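The contrast is easy to show in code. An output-stage filter only needs the final framebuffer, which has the same shape and meaning no matter which core produced it. A toy sketch (not RetroArch's actual video filter API):

```python
# Why output-stage filters can be standardized: they only ever see the final
# framebuffer, an HxWx3 array that is core-independent. A texture-level
# upscaler has no such common input.
import numpy as np

def runtime_filter(frame: np.ndarray) -> np.ndarray:
    """Toy 2x nearest-neighbour scaler, run once per output frame."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

# Works identically on any core's output:
frame = np.zeros((240, 320, 3), dtype=np.uint8)
assert runtime_filter(frame).shape == (480, 640, 3)
```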
Would be nice if it included them by default: https://github.com/ZironZ/NNEDI3-Slang-Shaders
This could be done in real time with neuromorphic computing.
Description
AI upscaling as a video setting (there is currently no AI upscaling).
Expected behavior
Nice AI upscaling for all games.
Actual behavior
Not implemented yet.