Closed: jlstevens closed this issue 9 years ago
Unless I'm missing something fundamental, there's no need for a standard_deviation parameter if we already have a scale parameter. Doesn't multiplying a zero-mean Gaussian by a scalar change its variance?

The scale and offset parameters are shared by all PatternGenerators, and so those are the right parameter names to use for any pattern for which they are meaningful, including this one. You may want to add a note to the docstring explaining this, though.
That's right... I remember now!
If you have an offset such that the output is mostly positive, multiplying by a scalar isn't the same as increasing the spread. The only case when the scale parameter does match the standard deviation is when the distribution has zero mean.
As you noticed, I was rather grumpy when filing this issue - looking at all the code in ImaGen that I really want to improve but can't fix sometimes does that to me!
Right. I now agree that it does work because scale is applied before offset.
That said, the fact that the scale matches the standard deviation (but not the variance!) is pretty far from evident, mainly because there is no useful documentation given with the scale parameter.
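For the record, the point about scale matching the standard deviation (but not the variance) is easy to check numerically. A sketch using plain NumPy rather than ImaGen's actual code path, modelling the pattern value as offset + scale * z with z a unit Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)  # zero-mean, unit-variance draws

scale, offset = 3.0, 10.0
pattern = offset + scale * z  # scale is applied before offset

# The offset only shifts the mean; the standard deviation equals
# scale, while the variance is scale**2 -- so scale matches the
# standard deviation, not the variance.
print(round(pattern.std(), 1))   # ~3.0 (= scale)
print(round(pattern.mean(), 1))  # ~10.0 (= offset)
```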
Ok, I've deleted the crazy comments we were making in the middle there, leaving only the bits that make sense, to avoid spreading confusion. :-) But yes, it does need a better docstring, probably for the scale parameter itself.
What is the point of a Gaussian distribution if I can't change the standard deviation or variance?!
Here is the current definition:
Why would you do this when you can make something useful?
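As a purely hypothetical sketch of the contrast being discussed (the function names and signatures below are invented for illustration and are not ImaGen's actual code):

```python
import numpy as np

def gaussian_pattern_old(shape, rng=None):
    # Original style: hard-coded unit variance via standard_normal;
    # the spread can only be changed indirectly via the shared
    # `scale` parameter applied downstream.
    rng = rng if rng is not None else np.random.default_rng()
    return rng.standard_normal(shape)

def gaussian_pattern_new(shape, mean=0.0, std=1.0, rng=None):
    # More general style: mean and standard deviation are explicit,
    # with the old behavior as the special case mean=0, std=1.
    rng = rng if rng is not None else np.random.default_rng()
    return rng.normal(mean, std, shape)
```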
That wasn't so hard now, was it? I can't fathom the logic behind the original implementation; why use standard_normal when normal is more general and actually useful?! Anyway, I am happy to commit the sane version to master, except that I am now worried about breaking backward compatibility. I suppose I need to see whether standard_normal behaves like normal with unit standard deviation.
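One quick way to check, at least for NumPy's own generators (which may not be exactly what ImaGen uses internally): normal with loc=0 and scale=1 is computed as 0 + 1 * (a standard-normal draw) from the same underlying stream, so with the same seed the two sequences should agree draw for draw:

```python
import numpy as np

# Seed both generators identically so the underlying streams match.
a = np.random.default_rng(42).standard_normal(1000)
b = np.random.default_rng(42).normal(loc=0.0, scale=1.0, size=1000)

# normal(0, 1) reduces to a standard-normal draw per element here,
# so the sequences should be identical.
print(np.allclose(a, b))
```

If this holds, switching from standard_normal to normal with default loc and scale should be backward compatible for seeded runs as well.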