Calling `plot_spectrum()` after a fit with `ism_red` that yielded a best value below unity fails; the output contained:
[…]/species/phot/syn_phot.py:227: UserWarning: Calculation of the mean flux for 2MASS/2MASS.H is not possible because the wavelength array is empty. Returning a NaN for the flux.
The filter profile of 2MASS/2MASS.H (1.4180-1.8710) lies outside the wavelength range of the spectrum. The returned synthetic flux is therefore set to NaN
[…]
Residuals (sigma):
[…]/species/plot/plot_spectrum.py:1299 […]
-> 1299     max_tmp = np.max(np.abs(residuals.photometry[item][1][finite]))
[…]
ValueError: zero-size array to reduction operation maximum which has no identity
Now the "out of range" warning was because the model was being stored only up to 0.87 µm or so (maybe printing the spectrum range in the message "lies outside the wavelength range of the spectrum" would be helpful for debugging in the future, by the way), but just from reading the line the residuals array does contain at least one finite value (the SDSS ones), so `residuals.photometry[item][1][finite]` should not be zero-sized, no? I might be misinterpreting the error message, though. Mentioning just in case this is an issue too.
Sorry, I should have guessed that a polynomial could become negative… For R_V < 1, the Cardelli et al. (1989) extinction does indeed become negative at some wavelengths. Therefore this was not buggy behaviour but effectively bad input on my part! My apologies.
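Just to make that concrete (my own quick check, using the published CCM infrared coefficients a(x) = 0.574 x^1.61 and b(x) = -0.527 x^1.61 for x = 1/λ in 0.3-1.1 µm⁻¹): the relation A(λ)/A(V) = a(x) + b(x)/R_V goes negative in that regime whenever R_V < 0.527/0.574 ≈ 0.92, i.e. for any R_V sufficiently below unity.

```python
import numpy as np

# CCM (1989) infrared regime, 0.3 <= x <= 1.1 micron^-1 (roughly 0.9-3.3 micron)
x = np.linspace(0.3, 1.1, 5)       # inverse wavelength in micron^-1
a = 0.574 * x**1.61
b = -0.527 * x**1.61

for r_v in (3.1, 1.0, 0.3):
    # A(lambda)/A(V); all values are negative for R_V = 0.3
    print(r_v, a + b / r_v)
```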
The points above still remain, though. Thanks again!
With the `data_folder` on an external hard drive 😊, `get_wavelengths()` from `ReadModel` works, and `species` behaves perfectly, even when I set `'ism_ext': -2, 'ism_red': 0.3` (testing with `read_model.get_model()` and with fitting). So, thank you for how it is now, including the note in the documentation!

There is no extinction applied for wavelengths outside the range, which indeed requires some clarification. Or I could fix it to the extinction at the last wavelength, but I am not sure if that would be better...
Hmm… Are there measurements of the ISM opacity below 0.125 µm in the literature? Or one could "just" reproduce the ISM curve with OpTool, for instance, and then compute the actual extinction down to smaller wavelengths, if the optical constants have been measured there… At larger wavelengths, Chiar & Tielens (2006) could be used up to 8 µm, and there must be data at longer wavelengths. Also, maybe Wang & Chen (2019) could be added, but this is a separate thing.
For below 0.125 µm it doesn't matter, since there aren't any measurements of planets/stars there, but at MIR wavelengths it may impact fit results, e.g. when including L/M band photometry. I will leave it as it is, but I should implement a more generic extinction approach at some point that allows for other relations.
Ok! Just for safety when plotting, fix the extinction, especially at low wavelengths, to the extrapolation, and document this anyway just for clarity?
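In case a sketch helps: one way to "fix the extinction to the extrapolation" would be to clip the wavelengths to the validity range of the relation before evaluating it, so everything outside is held at the boundary value. This is purely illustrative; the function name, the assumed 0.125-3.33 µm range, and the callable interface are not species code:

```python
import numpy as np

def clamped_extinction(wavel_um, ext_relation, wavel_valid=(0.125, 3.33)):
    # Hold wavelengths outside the assumed validity range at the nearest
    # boundary, so the extinction there equals the edge value instead of
    # not being applied at all.
    wavel_clipped = np.clip(np.asarray(wavel_um), *wavel_valid)
    return ext_relation(wavel_clipped)
```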
Three somewhat linked problems:

1. When setting `'ism_red'` to less than 1.0 in a model parameter dictionary for a `model_box`, the wavelength range of the model gets very restricted (up to ca. 0.8 µm, but not sure how universal this is), which seems buggy and then leads to problems when computing residuals, for instance. According to the example in the documentation ("for example `bounds={'ism_ext': (0., 10.), 'ism_red': (0., 20.)}`") and from a quick look at the formula, R_V < 1 should not be a problem mathematically [Edit: uh, it actually will! See below], so I do not know what is happening.

2. [x] It might be good to warn the user or throw an error if he or she sets `ism_red` (R_V) but not `ism_ext` (A_V); logically, A_V = 0 is the default, so in practice it does not matter, but it might indicate confusion on the user side, which can easily occur because of the number of parameters. When mini-debugging to prepare this "issue", I was not getting the problem in 1.) when not setting `ism_ext`, which confused me at first. (A sketch of such a check is below, after this list.)

3. Maybe you noticed that I set the upper wavelength to 39.0 µm. The reason is that the spectrum files go up not to 40 µm, as claimed in `model_data.json`, but to 3.999999999999999289e+01 :slightly_smiling_face:. A few points here:
   - ~~Some wavelength minima are affected too, by the way, but not for all grids. Maybe a hot fix for now would be to limit the ranges in `model_data.json` (e.g. starting at 0.601 or ending at 39.9 µm), before you remake all the affected grids (to make them match the `.json`).~~
   - [x] Now about solving the problem: since we are working with doubles, the last three digits are actually meaningless. Secondly, there are way too many significant digits, leading to files that are really a factor of several too large, and we are talking hundreds of MBs to GBs, not the inelegance of e.g. 7.1 kB vs. 1.7 kB :nerd_face:. I know we were discussing this a bit somewhere else, but certainly five or six significant digits would be enough. Many people have big hard drives, but many are a bit limited, and there is really no gain in having huge files for nothing. There are grids that I could maybe run on my laptop, which would be much more convenient, if they were smaller. (The RAM is a separate issue, of course.)
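To make the suggestion in point 2 concrete, a hypothetical sketch of such a consistency check (the function name and warning text are my own illustration, not existing species code; it only assumes the parameters are passed around as a plain dict):

```python
import warnings

def check_ism_parameters(model_param: dict) -> None:
    """Warn if the reddening R_V ('ism_red') is set without an extinction
    A_V ('ism_ext'), which has no effect and may indicate a mix-up
    between the two parameters."""
    if "ism_red" in model_param and "ism_ext" not in model_param:
        warnings.warn("The 'ism_red' (R_V) parameter is set but 'ism_ext' "
                      "(A_V) is not, so no extinction will be applied. "
                      "Did you mean to also set 'ism_ext'?")

check_ism_parameters({'teff': 1500., 'logg': 4.0, 'ism_red': 3.1})  # -> emits the warning
```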
Thanks a lot!