godotengine / godot-proposals

Godot Improvement Proposals (GIPs)

Add "Alpha from GrayScale" texture import setting #5247

Open · Exyde opened this issue 2 years ago

Exyde commented 2 years ago

Describe the project you are working on

I'm working on a project heavily driven by VFX and particles.

Describe the problem or limitation you are having in your project

I want to use my grayscale textures as the transparency of my particles, but I can't without using a custom shader.

Describe the feature / enhancement and how it helps to overcome the problem or limitation

A simple checkbox to choose whether or not to use the grayscale value as alpha.

Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams

There should be a boolean flag in the texture import settings named "Alpha from Grayscale".

If this enhancement will not be used often, can it be worked around with a few lines of script?

It can be worked around using a custom shader (spatial or canvas_item) like this:

shader_type canvas_item;

// Threshold: pixels darker than this in all three channels are treated as transparent.
uniform float _t = 0.5;

void fragment(){
    vec4 tex = texture(TEXTURE, UV);
    // Hard cutoff: discard dark pixels entirely (no smooth alpha gradient).
    if (tex.r < _t && tex.g < _t && tex.b < _t){
        discard;
    }
    COLOR = tex;
}
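
The shader above gives a hard cutoff rather than smooth transparency. As an alternative "few lines of script", the grayscale value could also be baked directly into the image's alpha channel from GDScript. A minimal, untested sketch, assuming Godot 4's Image/ImageTexture API (the helper name is made up):

# Hypothetical helper: writes each pixel's brightness (HSV value) into its alpha channel.
func load_texture_with_alpha_from_grayscale(path: String) -> ImageTexture:
    var image := Image.new()
    var err := image.load(path)
    if err != OK:
        push_error("Failed to load image: %s" % path)
        return null
    image.convert(Image.FORMAT_RGBA8)
    for y in image.get_height():
        for x in image.get_width():
            var c := image.get_pixel(x, y)
            c.a = c.v  # use the HSV value (brightness) as alpha
            image.set_pixel(x, y, c)
    return ImageTexture.create_from_image(image)

The resulting texture could then be assigned to a Sprite2D or particle material at runtime, at the cost of doing the conversion on load instead of at import time.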

Is there a reason why this should be core and not an add-on in the asset library?

In my opinion it should be core, because every project uses texture assets.

Calinou commented 2 years ago

Out of curiosity, why not edit the image's alpha channel in an image editor instead? GIMP has a Color To Alpha filter for this (which Photoshop lacks a built-in equivalent of, unfortunately). If you set it to make black fully transparent on a grayscale image, then it will effectively convert the image's value to an alpha channel.

The only use case I can see for writing a grayscale image's value to an alpha channel on import is for JPEG images, which can't store transparency. That said, it's unlikely that you'll be using JPEG images for VFX (especially for a transparency mask).

Exyde commented 2 years ago

It's faster in the workflow to be able to toggle a property inside the game engine instead of opening external software, applying a filter, and re-exporting the texture. I play around a lot with different texture properties when I'm blocking out a VFX or searching for something specific, so it's a powerful tool in the creative process :) !

aXu-AP commented 2 years ago

Maybe this could be implemented as an editor import plugin? I haven't tried to make one, but it sounds feasible.
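
For reference, a rough, untested sketch of what such a plugin could look like with the Godot 4 EditorImportPlugin API (the importer name, option name and save format are placeholders, and error handling is minimal):

@tool
extends EditorImportPlugin
# Hypothetical import plugin: re-imports PNGs with the grayscale value
# written into the alpha channel, exposed as an "alpha_from_grayscale" option.

func _get_importer_name() -> String:
    return "example.alpha_from_grayscale"

func _get_visible_name() -> String:
    return "Texture (Alpha from Grayscale)"

func _get_recognized_extensions() -> PackedStringArray:
    return PackedStringArray(["png"])

func _get_save_extension() -> String:
    return "res"

func _get_resource_type() -> String:
    return "ImageTexture"

func _get_preset_count() -> int:
    return 1

func _get_preset_name(preset_index: int) -> String:
    return "Default"

func _get_import_options(path: String, preset_index: int) -> Array[Dictionary]:
    return [{"name": "alpha_from_grayscale", "default_value": true}]

func _get_option_visibility(path: String, option_name: StringName, options: Dictionary) -> bool:
    return true

func _get_priority() -> float:
    return 1.0

func _get_import_order() -> int:
    return 0

func _import(source_file: String, save_path: String, options: Dictionary,
        platform_variants: Array[String], gen_files: Array[String]) -> Error:
    var image := Image.new()
    var err := image.load(source_file)
    if err != OK:
        return err
    if options.get("alpha_from_grayscale", true):
        image.convert(Image.FORMAT_RGBA8)
        for y in image.get_height():
            for x in image.get_width():
                var c := image.get_pixel(x, y)
                c.a = c.v  # brightness becomes alpha
                image.set_pixel(x, y, c)
    var texture := ImageTexture.create_from_image(image)
    return ResourceSaver.save(texture, "%s.%s" % [save_path, _get_save_extension()])

The importer would still need to be registered from an EditorPlugin via add_import_plugin(), and whether it should replace or sit alongside the stock texture importer is an open question.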

Exyde commented 2 years ago

I don't know about Editor Import Plugins yet, but I trust you if you say so ^^

aXu-AP commented 2 years ago

I can't promise anything yet, but I could try making a simple plugin. However, can you provide some concrete examples of how this would be used? To me it seems like quite a specific use case. Usually, when brighter areas should be more visible, additive blend mode is the desired solution (in which case transparency comes automatically). In other cases such a workflow would easily introduce unwanted artifacts (a dark rim) or unnecessary limitations (what if I need dark areas in the texture?).

FyiurAmron commented 7 months ago

@aXu-AP take a look at how smoke/fire/explosions etc. have been handled in 3D games since at least Unreal 1 (probably even before that), and how Unity handles it, even inside its core (company-provided) primer asset pack. Tl;dr: the assumption is that the particle is overlaid on top of a black background, and the intensity implies both how bright and how translucent the pixel will be. This generates a very distinct and realistic effect just from frame-by-frame photography of the particle in near-total darkness. It also allows such assets to be manipulated easily. Our expectation is that the edges of the smoke/fire/explosion have lower intensity and are partially transparent, and that the center actually obstructs the visual field.

Sure, you can always calculate the alpha from grayscale yourself and add it in an external editor... which both increases the size of your assets and adds lots of manual labour. OTOH, doing so in a shader increases the amount of code and processing the shader needs to do (i.e. it will affect FPS etc., even if only slightly).

(image attachment)

So yes, this is "specific" in the sense that it's mostly used for things that mimic lights, and for things that are often handled by particle systems. It's however not "specific" in the sense of being uncommonly used or not being an ordinary, well-known technique. It is.

Also, it's sometimes used for UI elements, especially those which are intended to be partially transparent by design but still easy to recolour etc., like https://lpc.opengameart.org/content/64-crosshairs-pack ; using additive blend will give a different result from just having the alpha channel autogenerated.

As to

(in which case transparency comes automatically)

... well, not really. The equation for additive blending doesn't give you alpha of any kind; rather, it emulates light. E.g. you can't reach 100% intensity by normal alpha-blending white with 50% grey. However, if you use additive blending, you easily can. The same goes for HDR/glow: selecting between alpha, addition and alpha-addition gives different results.

E.g. even with additive blending, you can still additionally multiply the input by the alpha derived from grayscale; you actually get visually stronger results (the brightness gets applied quadratically: for pure grayscale, a 1.0 input still gives 100% intensity in the result, whereas 0.5 gets multiplied by 0.5 and only 0.25 is added), which is why this is often used for artistic effect.
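
To make the comparison concrete, these are the blend equations being contrasted here (src is the incoming pixel value, dst the existing framebuffer value, a the alpha):

alpha blend:               out = a * src + (1 - a) * dst
additive blend:            out = dst + src
additive, value as alpha:  out = dst + src * src      (since a = src for a grayscale pixel)

So a mid-grey pixel (src = 0.5) contributes 0.5 under plain additive blending, but only 0.25 once it is premultiplied by its own value, which is the quadratic falloff described above.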

That aside, in some situations you just want to have alpha. It wasn't folly on Unity's side to implement it; it was, and still is, quite a standard feature, I'd say.