karllark opened this issue 7 years ago
The basic method is a weighted average of the dust-extinguished model spectra for each star that has been fit. The weights are given by the likelihood that each model spectrum fits the photometric data, as stored in the "lnp" file. The "lnp" file is the saved version of the likelihood function and contains only a sparse sampling to save disk space. This sparse sampling should provide a good approximation, but that will have to be tested for a few stars by re-running the BEAST and outputting the full lnp file.
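A minimal sketch of the weighted average described above, assuming the sparse lnp file has been read into a log-likelihood array and the corresponding dust-extinguished model spectra into a 2D array (the function name and array layout here are illustrative, not the BEAST API):

```python
import numpy as np

def weighted_average_spectrum(model_spectra, lnp):
    """Likelihood-weighted average of dust-extinguished model spectra.

    model_spectra : (n_models, n_wave) array of model spectra for the
                    models saved in the sparse lnp file
    lnp           : (n_models,) array of the sparse-sampled log-likelihoods
    """
    # subtract the max log-likelihood before exponentiating to avoid
    # numerical underflow/overflow
    weights = np.exp(lnp - lnp.max())
    weights /= weights.sum()
    # weighted average over the model axis, one value per wavelength
    return np.dot(weights, model_spectra)
```

With equal log-likelihoods this reduces to a plain mean; models with much lower likelihood contribute negligibly.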
The model spectra will need to be created on the fly from the saved grid of dust-free model spectra, using the fitted dust extinction parameters and the BEAST dust extinction code. These model spectra only need to be calculated for the models in the sparse lnp file (many, many fewer than in the full model grid).
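As a sketch of the on-the-fly extinction step: for each model in the sparse lnp file, the dust-free spectrum is dimmed by the fitted A(V). The crude power-law A(lambda)/A(V) below is only a stand-in for the actual BEAST dust extinction code, and the function name is hypothetical:

```python
import numpy as np

def extinguish(flux, wave_micron, av):
    """Apply a dust extinction law to a dust-free model spectrum.

    flux        : (n_wave,) dust-free spectrum from the saved model grid
    wave_micron : (n_wave,) wavelengths in microns
    av          : fitted V-band extinction for this model (magnitudes)

    NOTE: the power-law A(lambda)/A(V) here is a rough placeholder;
    the real tool would call the BEAST dust extinction code with the
    full set of fitted dust parameters.
    """
    alav = (1.0 / wave_micron) ** 1.6  # crude A(lambda)/A(V) approximation
    return flux * 10 ** (-0.4 * av * alav)
```

Running this per model in the sparse lnp file keeps the cost proportional to the number of saved samples rather than the full grid size.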
I can provide the dust-free model spectral grid and sparse lnp files for a few stars for testing. If these were stars for which we have ground-based spectral observations, that would make them more interesting.
The code for this task will be a stand-alone tool that should go in the tools subdirectory. It should be written to run in Python 3 (with Python 2.7 compatibility as well, though that is not the first priority).
@eteq : might be of interest to you. This would be a start of having a beast version that includes spectra in the fitting.
It would be useful to have code that uses the saved sparse lnp files to generate a model spectrum plus an uncertainty range (or a set of model spectra) for the BEAST fits. This would allow comparisons of the BEAST fits to existing spectra (SPLASH), as well as predictions of spectra at all wavelengths for other purposes, including predicting fluxes in bands not included in the BEAST fits themselves.
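One way to get a spectrum-plus-range from the sparse lnp samples is to compute likelihood-weighted percentiles at each wavelength. This sketch (function name and percentile choices are assumptions, not BEAST conventions) builds a weighted empirical CDF per wavelength and interpolates:

```python
import numpy as np

def spectrum_with_range(model_spectra, lnp, percentiles=(16, 50, 84)):
    """Likelihood-weighted percentiles of model spectra per wavelength.

    model_spectra : (n_models, n_wave) dust-extinguished model spectra
    lnp           : (n_models,) sparse-sampled log-likelihoods
    Returns an (n_percentiles, n_wave) array, e.g. median plus a
    "1-sigma" band for the default percentiles.
    """
    weights = np.exp(lnp - lnp.max())
    weights /= weights.sum()
    # sort the models independently at each wavelength and carry the
    # weights along, then build a weighted CDF per wavelength column
    order = np.argsort(model_spectra, axis=0)
    sorted_spec = np.take_along_axis(model_spectra, order, axis=0)
    cdf = np.cumsum(weights[order], axis=0)
    out = np.empty((len(percentiles), model_spectra.shape[1]))
    q = np.asarray(percentiles) / 100.0
    for j in range(model_spectra.shape[1]):
        out[:, j] = np.interp(q, cdf[:, j], sorted_spec[:, j])
    return out
```

The median row is the "model spectrum" and the outer rows give the range; integrating these over any bandpass would give the predicted fluxes in bands not used in the fit.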