While investigating https://github.com/o3de/o3de/issues/13625, I stumbled onto the possibility that options with min != 0 have never actually worked.
Here is the basis for the claim. Take this input program:
The getter functions look like this:
I think we can mathematically prove that the values coming out of GetShaderVariantKey_o_stuff are 0 or 1, because the result is AND'd with 1.

On the C++ counterpart of the option management system, we see a lot of this:

EncodeBits(group.GetShaderVariantKey(), valueIndex.GetIndex() - m_minValue.GetIndex());

This hints that the min is NOT encoded in the key bits, which is logical since we are doing compression. But if we encode that way, we need to add the min back at reconstruction; otherwise the contract about the range is not respected, and we destroy the values by flooring them back to a 0-based range.

I believe this is a bug, even though I have not attempted an empirical (observed) reproduction yet.