drvdputt opened this issue 6 years ago
Is there a way to tune the normalization constant based on the flux values (with dust and bias) in the full model grid? E.g., automatically try different constants and pick the one that produces the "flattest" histogram?
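A minimal sketch of what that auto-tuning could look like: try a grid of candidate constants and score each one by the Shannon entropy of the resulting histogram (a perfectly flat histogram maximizes the entropy). The function names `symlog` and `pick_flattest_C` and the candidate grid are hypothetical, not part of the BEAST code.

```python
import numpy as np

def symlog(x, C):
    """Symmetric log with a normalization constant C: sign(x) * log10(1 + |x| / C)."""
    return np.sign(x) * np.log10(1.0 + np.abs(x) / C)

def pick_flattest_C(flux, candidates, nbins=50):
    """Return the candidate C whose symlog histogram is 'flattest',
    measured here by the Shannon entropy of the bin counts (higher = flatter)."""
    best_C, best_entropy = None, -np.inf
    for C in candidates:
        counts, _ = np.histogram(symlog(flux, C), bins=nbins)
        p = counts / counts.sum()
        p = p[p > 0]
        entropy = -(p * np.log(p)).sum()
        if entropy > best_entropy:
            best_C, best_entropy = C, entropy
    return best_C

# Synthetic fluxes spanning several decades below 1, log-uniform:
rng = np.random.default_rng(0)
flux = 10.0 ** rng.uniform(-8, -2, size=10000)
C = pick_flattest_C(flux, candidates=10.0 ** np.arange(-10, 0))
```

For log-uniformly distributed fluxes the flattest histogram is obtained when C sits near (or below) the smallest flux, so the whole sample lands in the log regime.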
The symlog normalization constant is typically chosen such that the derivative stays smooth across the transition between the linear and logarithmic regimes. But I agree, it's inconvenient that most of the fluxes land in the linear regime. Scaling the fluxes somehow (by an arbitrary constant, by changing their units, or similar) to make their values larger would probably help.
I've created a function to split bins for symlog https://gist.github.com/artoby/0bcf790cfebed5805fbbb6a9853fe5d5
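For readers who don't want to follow the link, the idea of symlog bin splitting can be sketched roughly as follows: linearly spaced edges below the linear threshold, logarithmically spaced edges above it. This is an independent illustration under assumed parameter names (`linthresh`, `n_linear`, `n_log`), not the gist's actual code, and it only shows the positive side for brevity.

```python
import numpy as np

def symlog_bins(vmax, linthresh, n_linear=5, n_log=10):
    """Bin edges for a symlog axis (positive side only):
    linearly spaced from 0 up to linthresh, then log-spaced up to vmax."""
    lin = np.linspace(0.0, linthresh, n_linear, endpoint=False)
    log = np.logspace(np.log10(linthresh), np.log10(vmax), n_log + 1)
    return np.concatenate([lin, log])

edges = symlog_bins(vmax=100.0, linthresh=1.0)
```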
@artoby - can you add a bit more info on how what you've done helps solve this issue? I am not seeing the connection, and this is likely because I'm missing something obvious. Thanks in advance.
See, for example, these 1D PDFs of the phat small example, for the symlog of the F814W flux.
Almost everything falls in the leftmost bins (lowest symlog).
I think the following happens: normally we take the log, and then put the bins on a linear scale, so the bins represent log flux. Now we take the symlog, but because the flux values are all smaller than 1, we are still in the linear regime of log(1+|x|), where log(1+|x|) ≈ |x|/ln 10. That gives trouble because of the large dynamic range of the fluxes.
So I guess we need some normalization constant C in there that takes us out of the linear regime: log(1+|x|/C).
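A quick numerical illustration of the effect, with made-up flux values well below 1 (the constant C = 1e-7 is chosen for the example, near the smallest flux):

```python
import numpy as np

# Hypothetical fluxes well below 1, i.e. inside the linear regime of log10(1 + |x|):
flux = np.array([1e-7, 1e-6, 1e-5, 1e-4])

# Without a constant, all values are squeezed near zero and barely distinguishable:
plain = np.log10(1.0 + flux)

# With C near the smallest flux, log10(1 + x/C) ~ log10(x/C) for x >> C,
# so the same values spread out over several decades:
C = 1e-7
scaled = np.log10(1.0 + flux / C)
```

Here `plain` stays below about 1e-4 for all four fluxes, while `scaled` spans roughly 0.3 to 3, recovering the dynamic range that a plain log would have shown.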