Frooxius opened this issue 1 year ago
Possibly related issue
They’re independent of one another.
I wouldn't quite say that. One way to achieve them is by having the building blocks for each in ProtoFlux and blending each shader type in code, which should be something you're able to do.
I'll expand upon it in #654 once I've collected my notes (some of them are a couple of years old, and I've only just started recompiling them along with some newer stuff around this) - but the tl;dr is that material layers are intended to be mostly independent from shaders. Shaders become inputs into material layers, which has the natural extension of ProtoFlux shaders "just working" for material layers. Whether or not material layers will ship before custom shaders is a toss-up, however - they can be done in Unity with similar levels of complexity.
From a broader ProtoFlux standpoint and how they'd interact with material layers once we have custom shaders - there would, I'd say, be some evolution of the feature that enables more optimized blending and compositing of layers, up to limits inherent in how shaders work - but that's something that would come as an incremental improvement on top of material layers themselves, which you can optionally make use of depending on the specific visual complexity that you need for a given effect. This assumes material layers come first - and that decision hasn't been made yet.
As a graphics programmer, I have several questions surrounding this feature. I do not expect it to even be possible to answer most of them at this point in time, but I would like to at least raise them for your consideration:
You describe making it easy to set up common shading types, but will it also be possible to create free-standing shaders with entirely custom fragment and vertex shader code? And what about shaders that modify or even inject into the vertex stream dynamically (e.g. mesh deformations)?
Will it be possible to render a shader’s output to an intermediary pixel buffer (a texture), instead of onto a mesh’s fragments?
Will it be possible to access engine-internal buffers (e.g. the depth buffer)?
What about doing "grab passes" (or anything to access the pixel data of the render target)?
It is a little bit unclear from your wording, but I assume you are intending to create a system that transpiles ProtoFlux to code in some common shading language (HLSL or GLSL, I assume). If this process involves creating complete shading language source code as an intermediary step, would it be possible to allow users to skip past ProtoFlux entirely and simply import shading language source code from a text asset directly? This would particularly benefit experienced graphics programmers who have most (if not all) of their experience working directly in a shading language, and it would allow them to use techniques that may not be supported, or not easily supported, by a ProtoFlux -> shader transpiler.
Lastly, I assume this will end up with there being ProtoFlux nodes that can only be used in CPU code, and nodes that are specific to shaders. This raises some concerns about UI, such as whether these types of nodes will be visually marked differently and can be filtered for in the node browser, or whether the term "ProtoFlux" should even be used identically for both CPU and shader code (it's already confusing me a little bit just reading this issue).
Right now we're looking at the mesh and fragment pipelines, with compute on the side. The mesh pipeline will be emulated in a compute shader on older hardware, and will have some built-in caps on primitive emission. This means you'll be able to effectively generate new geometry, modify existing geometry, and so on within some generally sane limits.
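For anyone unfamiliar with how such an emulation tends to look, here's a minimal WGSL sketch of capped primitive emission from a compute shader into a storage buffer. Every name, layout, and the cap value here is an assumption for illustration; it is not the actual Resonite interface.

```wgsl
// Illustrative only: the real pipeline interface will differ. This just shows the
// general shape of emulating primitive emission in a compute shader, with a hard
// cap on how many primitives a dispatch may emit.

struct Vertex {
    position: vec3<f32>,
    normal:   vec3<f32>,
};

struct EmitState {
    count: atomic<u32>,
};

const MAX_PRIMS: u32 = 4096u; // assumed per-dispatch cap on emitted primitives

@group(0) @binding(0) var<storage, read_write> out_vertices: array<Vertex>;
@group(0) @binding(1) var<storage, read_write> emit_state: EmitState;

@compute @workgroup_size(64)
fn emit_main(@builtin(global_invocation_id) gid: vec3<u32>) {
    // Reserve a slot for one triangle (3 vertices), respecting the cap.
    let prim = atomicAdd(&emit_state.count, 1u);
    if (prim >= MAX_PRIMS) {
        return;
    }

    let base = prim * 3u;
    let x = f32(gid.x);
    let n = vec3<f32>(0.0, 0.0, 1.0);
    out_vertices[base + 0u] = Vertex(vec3<f32>(x, 0.0, 0.0), n);
    out_vertices[base + 1u] = Vertex(vec3<f32>(x + 1.0, 0.0, 0.0), n);
    out_vertices[base + 2u] = Vertex(vec3<f32>(x + 0.5, 1.0, 0.0), n);
}
```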
You'll be able to render into a texture like you currently can with the RenderTexture asset and a camera. This functionality already exists today.
You'll have access to common buffers we'd use in other parts of the pipeline - for example depth and normals. You'll also have access to other information such as the tile ID (since the new renderer is a tile-based forward renderer), and the lights assigned to the current tile for that fragment.
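As a rough sketch of what that per-fragment access could look like in WGSL (all binding layouts, the tile math, and the normal encoding below are assumptions, not the actual renderer's interface):

```wgsl
struct Light {
    position:  vec3<f32>,
    range:     f32,
    color:     vec3<f32>,
    intensity: f32,
};

const TILE_SIZE: u32 = 16u;        // assumed tile size in pixels
const TILES_X: u32 = 120u;         // assumed tile count per row
const LIGHTS_PER_TILE: u32 = 32u;  // assumed fixed per-tile light budget

@group(0) @binding(0) var scene_depth: texture_depth_2d;
@group(0) @binding(1) var scene_normals: texture_2d<f32>;
@group(1) @binding(0) var<storage, read> tile_light_indices: array<u32>;
@group(1) @binding(1) var<storage, read> lights: array<Light>;

@fragment
fn fs_main(@builtin(position) frag_pos: vec4<f32>,
           @location(0) world_pos: vec3<f32>) -> @location(0) vec4<f32> {
    let pixel = vec2<i32>(frag_pos.xy);
    let depth = textureLoad(scene_depth, pixel, 0);
    // Assumes normals are stored 0..1 and need unpacking.
    let normal = normalize(textureLoad(scene_normals, pixel, 0).xyz * 2.0 - 1.0);

    // Which tile this fragment falls into, and where its light list starts.
    let tile = vec2<u32>(frag_pos.xy) / TILE_SIZE;
    let first = (tile.y * TILES_X + tile.x) * LIGHTS_PER_TILE;

    // Accumulate only the lights binned to this tile.
    var lit = vec3<f32>(0.0);
    for (var i = 0u; i < LIGHTS_PER_TILE; i = i + 1u) {
        let l = lights[tile_light_indices[first + i]];
        let to_light = l.position - world_pos;
        let atten = max(0.0, 1.0 - length(to_light) / l.range);
        lit += l.color * l.intensity * atten * max(dot(normal, normalize(to_light)), 0.0);
    }

    // Crude depth-based fade, just to show the depth read being used.
    return vec4<f32>(lit * (1.0 - depth), 1.0);
}
```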
Things like grab passes - that's a big maybe due to the performance of grab passes. There aren't many "fast" ways to do those. What we may do is just cycle the framebuffer back through as needed and rely on render ordering to make it look right, and enable stacking of "grab pass" shaders and objects.
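A hedged sketch of what the "cycle the framebuffer back through" variant could look like from a shader's point of view: the previous render target contents bound as an ordinary texture and offset-sampled to fake refraction. The binding names and the uv derivation are assumptions; whether this mechanism exists at all is undecided per the above.

```wgsl
// scene_color stands in for the previous contents of the render target.
@group(0) @binding(0) var scene_color: texture_2d<f32>;
@group(0) @binding(1) var scene_sampler: sampler;
@group(0) @binding(2) var<uniform> screen_size: vec2<f32>;

@fragment
fn fs_main(@builtin(position) frag_pos: vec4<f32>,
           @location(0) normal: vec3<f32>) -> @location(0) vec4<f32> {
    // Offset the sample position along the surface normal to fake refraction,
    // then tint whatever was already rendered behind this fragment.
    let uv = (frag_pos.xy + normal.xy * 8.0) / screen_size;
    let behind = textureSample(scene_color, scene_sampler, uv);
    return vec4<f32>(behind.rgb * vec3<f32>(0.8, 0.9, 1.0), 1.0);
}
```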
In effect ProtoFlux -> WGSL intermediary -> SPIR-V/MSL (when we support Apple, that is). The closest you'll ever get to raw shading language support, if we ever allow it, is WGSL snippets similar to Unreal Engine's HLSL snippets. That's not a promise that you'll ever have access to that - we'll be evaluating that one over time.
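Purely speculative, but if WGSL snippets ever happened, they would presumably be a small function whose parameters and return value map onto node connectors, in the spirit of Unreal's custom HLSL node. The signature convention below is an assumption:

```wgsl
// Speculative shape of a WGSL snippet: inputs/outputs would map onto node connectors.
fn user_snippet(base_color: vec3<f32>, uv: vec2<f32>, time: f32) -> vec3<f32> {
    // Animated scanline tint over the incoming color.
    let scan = 0.5 + 0.5 * sin(uv.y * 400.0 + time * 4.0);
    return mix(base_color, base_color * vec3<f32>(0.2, 1.0, 0.4), scan * 0.3);
}
```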
You'd be correct that there will be some ProtoFlux nodes that only work on the CPU, but most nodes that work on the GPU will also work on the CPU. It's mainly specific stuff like writing to a fragment output that won't work on the CPU and will result in an error when you set everything up. The same goes for things like uniform inputs.
Is your feature request related to a problem? Please describe.
Resonite offers a fixed set of shaders for building visual content. While it's possible to achieve a wide variety of effects with those, particularly by scripting them with ProtoFlux and other mechanisms, there is a wide array of effects that cannot be achieved.
Describe the solution you'd like
ProtoFlux has been designed to support a wide variety of scenarios and "runtimes". Once we have a custom rendering engine, we will implement a shader runtime, which will be able to generate shader code from the visual node setups.
The specifics of how the shaders will be modeled (particularly how they will fit into lighting models and so on) will be determined closer to when this feature is worked on. One of the primary design goals, however, is long-term compatibility of shaders created with ProtoFlux - as we make changes to the rendering pipeline in the future, we need to retain the ability to regenerate the shader code to match the changes in the rendering pipeline and lighting model - in other words, prevent engine upgrades from breaking shaders.
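One way such regeneration is commonly made possible (a sketch under assumed names, not a statement of how Resonite will do it) is to emit the user's graph as a self-contained function over a stable surface description, while the entry point and lighting model around it stay engine-owned and can be re-emitted whenever the pipeline changes:

```wgsl
// Hypothetical layout of generated code. The Surface struct and function names are
// assumptions; the point is the split between user-generated and engine-owned parts.

struct Surface {
    albedo:    vec3<f32>,
    normal:    vec3<f32>,
    roughness: f32,
    metallic:  f32,
    emission:  vec3<f32>,
};

// Generated from the user's ProtoFlux graph: depends only on its declared inputs
// and the Surface contract, so it can be re-emitted unchanged when the pipeline moves.
fn user_surface(uv: vec2<f32>) -> Surface {
    var s: Surface;
    s.albedo    = vec3<f32>(uv, 0.5);
    s.normal    = vec3<f32>(0.0, 0.0, 1.0);
    s.roughness = 0.6;
    s.metallic  = 0.0;
    s.emission  = vec3<f32>(0.0);
    return s;
}

// Engine-owned wrapper: regenerated whenever the lighting model or pipeline changes.
// A real version would feed the Surface into the engine's lighting code; this sketch
// just returns an unlit color.
@fragment
fn fs_main(@location(0) uv: vec2<f32>) -> @location(0) vec4<f32> {
    let s = user_surface(uv);
    return vec4<f32>(s.albedo + s.emission, 1.0);
}
```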
Whatever form this takes, the library of nodes will include nodes that make it easy to set up common shading types, such as PBS or cartoon shaders, allowing users to make modifications to those or only change which inputs are fed into those models, without having to recreate the PBS shading model themselves within the shader.
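For example, a "cartoon shading" building block could be as small as a ramp applied to a lighting term, so creators only swap what feeds into it rather than rebuilding a lighting model. The WGSL below is a hypothetical illustration of that idea, not an actual node:

```wgsl
// Quantize a 0..1 lighting term into discrete bands.
fn toon_ramp(n_dot_l: f32, bands: f32) -> f32 {
    return floor(clamp(n_dot_l, 0.0, 1.0) * bands) / bands;
}

// Hypothetical "toon diffuse" helper - roughly the scale of logic involved.
fn toon_diffuse(albedo: vec3<f32>, normal: vec3<f32>,
                light_dir: vec3<f32>, light_color: vec3<f32>) -> vec3<f32> {
    let n_dot_l = dot(normalize(normal), normalize(light_dir));
    // Creators would typically only change what feeds albedo or the band count.
    return albedo * light_color * toon_ramp(n_dot_l, 3.0);
}
```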
Thanks to ProtoFlux's node overloading ability, any piece of code (including nested nodes) that only uses nodes which exist in both the execution runtime and the shader runtime will be usable in both, allowing code to be shared between the runtimes as well.
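In practice that means a graph built purely from math nodes - like the remap below - could compile to a WGSL function for the shader runtime and to ordinary CPU code from the same nodes. The WGSL form here is just one illustrative target:

```wgsl
// Pure math - nothing here is GPU-specific, so the same node graph could also
// run in the regular execution runtime on the CPU.
fn remap(value: f32, in_min: f32, in_max: f32, out_min: f32, out_max: f32) -> f32 {
    let t = clamp((value - in_min) / (in_max - in_min), 0.0, 1.0);
    return mix(out_min, out_max, t);
}
```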
Describe alternatives you've considered
We have considered using the model that some other platforms use, where shaders are built inside the Unity Editor. However, this has a number of issues:
Additional Context
Currently, this feature is blocked by our dependency on Unity, which does not allow for compiling and uploading shaders at runtime in any sane way.
Before this feature is worked on, we will need to complete a switch to our own rendering engine. I have created this issue to make sure this is properly communicated.