starwing opened this issue 10 years ago
Quick note on implementation: this is best handled purely in the shader, with the blending mode set to glBlendFunc(GL_ONE, GL_SRC_ALPHA). The shader can then take a constant parameter and compute the colour and alpha outputs needed to achieve the required blend mode. This would reduce the number of state changes needed.
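For concreteness: with that blend state, the fixed-function stage computes `dst' = src + dst * srcAlpha` per channel, and the shader's choice of output alpha selects the effective mode. A minimal CPU-side sketch of that equation (the function name is mine, not nanovg's):

```c
/* Simulates one channel of fixed-function blending with
 * glBlendFunc(GL_ONE, GL_SRC_ALPHA):
 *   result = src * 1 + dst * srcAlpha
 * The fragment shader picks src and srcAlpha so that one blend
 * state can emulate several composite modes. */
static float blend_one_srcalpha(float src, float srcAlpha, float dst)
{
    return src + dst * srcAlpha;
}
```

For example, outputting srcAlpha = 1 gives pure additive blending (src + dst), while outputting premultiplied colour with srcAlpha = 1 - coverage gives source-over.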
A requirement came up today that would make purely additive blending highly desirable: for procedural textures, I need to render roughness / metallic / emissive / weight parameters into the RGBA channels separately. (Not sure how I would actually write to A exclusively, now that I think about it. Any ideas?)
Although I could probably render to 4 separate textures in a pre-step and then mix those into a single picture with an extra shader. That's probably the cleanest way to do it.
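The mixing step amounts to packing four single-channel buffers into one interleaved RGBA image. A CPU-side sketch of that pre-pass, assuming 8-bit buffers (names and layout are illustrative, not nanovg API):

```c
/* Packs four separately rendered single-channel buffers
 * (roughness, metallic, emissive, weight) into one interleaved
 * RGBA image of npixels pixels. */
static void pack_rgba(const unsigned char *rough,
                      const unsigned char *metal,
                      const unsigned char *emis,
                      const unsigned char *weight,
                      unsigned char *rgba, int npixels)
{
    for (int i = 0; i < npixels; i++) {
        rgba[i * 4 + 0] = rough[i];
        rgba[i * 4 + 1] = metal[i];
        rgba[i * 4 + 2] = emis[i];
        rgba[i * 4 + 3] = weight[i];
    }
}
```

On the GPU the same mix would be a trivial fragment shader sampling the four textures once each.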
I think this could be done by modifying the code to take a five-component colour (RGBXA) and using pure additive blending with premultiplied alpha, i.e. outputting RGBX * A.
so:
RGBXd = RGBXd + RGBXs * As
if that's what you're asking for?
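The update rule above could be sketched in C like this (NVGXcolor is a hypothetical struct for illustration, not part of nanovg):

```c
/* Five-component (RGBXA) premultiplied additive blend:
 *   dst = dst + src * srcA
 * applied to each of the four colour-like channels. */
typedef struct { float r, g, b, x, a; } NVGXcolor;

static NVGXcolor blend_rgbx_add(NVGXcolor dst, NVGXcolor src)
{
    dst.r += src.r * src.a;
    dst.g += src.g * src.a;
    dst.b += src.b * src.a;
    dst.x += src.x * src.a;
    return dst;
}
```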
I don't think this is general enough to add to the library, though it could be done with a custom shader, as in https://github.com/memononen/nanovg/issues/57.
Finally, I implemented this feature. The composite operation works between nanovg frames. I've just opened a pull request: https://github.com/memononen/nanovg/pull/298. This PR also solves another issue of mine: https://github.com/memononen/nanovg/issues/88.
Hi,
I hope NanoVG could add compositing support, i.e. the ability to set how a source pixel is composited with the destination pixel. In the HTML5 canvas, these compositing modes are used:
see here
A function can be added, along with a field in NVGparams, so that the backend can set the correct mode before any drawing.
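Conceptually, each canvas-style composite operation just selects a different per-channel blend equation on premultiplied colours. A sketch of three of them (the enum and function names are illustrative, not the names the library ended up using):

```c
/* A few HTML5-canvas composite operations on premultiplied
 * colours, per channel. s/d are source/destination channel
 * values; as/ad are source/destination alpha. */
typedef enum {
    COMP_SOURCE_OVER,      /* s + d * (1 - as) */
    COMP_DESTINATION_OVER, /* s * (1 - ad) + d */
    COMP_LIGHTER           /* s + d            */
} CompOp;

static float composite(CompOp op, float s, float as, float d, float ad)
{
    switch (op) {
    case COMP_SOURCE_OVER:      return s + d * (1.0f - as);
    case COMP_DESTINATION_OVER: return s * (1.0f - ad) + d;
    case COMP_LIGHTER:          return s + d;
    }
    return d;
}
```

A backend would translate the selected operation into the matching glBlendFunc factors once, before issuing the frame's draw calls.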