teadrinker opened this issue 8 months ago
The Blend operator behaves oddly because it uses GetTextureSize in a way that only considers one of its input images:
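To make the symptom concrete, here is a minimal sketch of the difference between sizing a blend from a single input and considering both inputs. The function names are hypothetical, this is not actual TiXL code, and the "expected" convention (larger extent per axis) is an assumption for illustration, not necessarily what TiXL intends:

```python
def blend_size_reported(size_a, size_b):
    """Mirrors the reported behaviour: only one input's size is consulted."""
    return size_a  # size_b is silently ignored

def blend_size_expected(size_a, size_b):
    """One plausible convention (an assumption, not TiXL's spec):
    take the larger extent per axis so neither input is cropped."""
    return (max(size_a[0], size_b[0]), max(size_a[1], size_b[1]))
```

With inputs of 256x256 and 1024x1024, the reported behaviour yields 256x256 regardless of the second input, while the per-axis maximum would yield 1024x1024.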
It's tempting to try to use context.RequestedResolution (like the new SetRequestedResolution does), but I see two problems with this approach:
Thanks for the feedback. You're correct that, ideally, all ops using [GetTextureSize] should publish this parameter to allow an override. In practice, I have to admit I'm hesitant to implement everything consistently, because I'm afraid many operators would end up with too many parameters. There is also a small performance overhead from the added parameter.
On the other hand, I recently stumbled over the same issue myself. So I added [SetRequestedResolution] after our discussion on Discord.
If I understand the video (https://www.youtube.com/watch?v=f9E7lwUXfBM) correctly:

- for -1, use the output resolution
- for 0, use the resolution from the source if available, and fall back on the output resolution (resolution flows from left to right)
- otherwise, use the fixed width/height values
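The fallback rule above could be sketched roughly like this (the function name and parameters are hypothetical; the real logic lives inside GetTextureSize, and this only restates the behaviour described in the video):

```python
def resolve_resolution(requested, source_size, output_size):
    """Resolve an effective texture resolution along one axis.

    requested:   -1 (use output), 0 (use source if available),
                 or a fixed pixel value.
    source_size: size of the connected source in pixels, or None
                 if no source is available.
    output_size: the downstream/output resolution in pixels.
    """
    if requested == -1:
        return output_size  # -1: always the output resolution
    if requested == 0:
        # 0: prefer the source resolution, fall back on the output
        return source_size if source_size is not None else output_size
    return requested  # fixed width/height value
```

Width and height would each be resolved independently with this rule, which is what lets the resolution "flow" left to right through a chain until an operator overrides it.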
Internally, this seemed to be handled through GetTextureSize. Inside CustomPixelShader, however, the texture input is not connected to GetTextureSize, which breaks the expected behaviour (is this a bug or a deliberate exception?).
I feel the concept outlined in the video is great, but for the user to have control over resolution in this intuitive way, every operator that internally uses a resolution-aware operator should also expose a resolution input and handle it consistently.