PAHFIT / pahfit

Model Decomposition for Near- to Mid-Infrared Spectroscopy of Astronomical Sources
https://pahfit.readthedocs.io/

PAHFIT Model Component Extraction #19

Closed jdtsmith closed 2 years ago

jdtsmith commented 6 years ago

One of the most useful (and under-used) features of the original PAHFIT is the pahfit_components function, which looks like this:

function pahfit_components,lambda,decoded_params, $
                           DUST_CONTINUUM=dust_continuum, $
                           TOTAL_DUST_CONTINUUM=dc_tot,STARLIGHT=stars, $
                           TOTAL_CONTINUUM=tot_cont, $
                           DUST_FEATURES=dust_features, $
                           TOTAL_DUST_FEATURES=df_tot, $
                           LINES=lines,TOTAL_LINES=lines_tot, $
                           EXTINCTION_FAC=ext, $
                           EXTINCTION_CURVE=ext_curve, $
                           DISABLE_EXTINCTION=de,LAMBDA_REST=lam_rest

This code lets you take a decoded_params structure, which is output from a PAHFIT run, and selectively pull pieces from it: the total continuum, all dust features, each feature separately, etc. It is used to produce the canonical multi-colored plots we all know and love, but it's also essential for using model results more deeply than just examining the output parameters. We need something similar.

karllark commented 6 years ago

One option that is currently in the prototype base class is that each model (e.g., bb_cont, h2_model, etc.) is saved in the class as a separate compound model. We will need code to update the parameters after the fitting is done, but otherwise you can just use the separate component models for this. We will also need to provide an example of this (docs!).

jdtsmith commented 6 years ago

Interesting. But for many components, you need to blend them together to be useful. For example: total dust continuum (vs. the per-T continua, which are not very physically meaningful).

Two separate but related parts of original PAHFIT pertain to this:

;+
; NAME:
;
;   PAHFIT_FEATURE_STRENGTH_COMBINE
;
; PURPOSE:
;
;   Combine feature strengths as produced by PAHFIT, taking care of
;   covariance among the features being combined when computing the
;   uncertainty.
;+
; NAME:
;
;    PAHFIT_MAIN_FEATURE_POWER
;
; PURPOSE:
;
;    Compute the power of the main dust features from a decoded PAHFIT
;    structure.

These together combine PAH band strengths for PAH "Complexes". Since they are all correlated, propagating errors is tricky.
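The covariance-aware combination those IDL routines perform can be sketched as follows. For a sum of feature powers, the variance is the sum over the full covariance block (equivalently, 1ᵀC1), not just the diagonal. The numbers here are toy values, not real fit output:

```python
import numpy as np

def combine_strengths(powers, cov):
    """Sum feature strengths; propagate uncertainty via the covariance block."""
    powers = np.asarray(powers, dtype=float)
    cov = np.atleast_2d(np.asarray(cov, dtype=float))
    ones = np.ones(len(powers))
    total = powers.sum()
    sigma = np.sqrt(ones @ cov @ ones)  # includes off-diagonal terms
    return total, sigma

# Two partially correlated features (toy covariance block):
cov = np.array([[0.04, 0.02],
                [0.02, 0.09]])
total, sigma = combine_strengths([1.0, 2.0], cov)
# Naive quadrature would give sqrt(0.13) ~ 0.361;
# the covariance-aware value is sqrt(0.17) ~ 0.412.
```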

karllark commented 6 years ago

Good point.

jdtsmith commented 5 years ago

In our discussion of flexible input formats I’m reminded we need a general approach to correlated uncertainties. Worth discussing at some point.

karllark commented 5 years ago

Correlated uncertainties are a good topic.

Do you mean on the input spectra? If so, we could use multi-variate Gaussians and a large covariance matrix (# wavelengths by # wavelengths) to encode the correlated observational uncertainties. The likelihood function can then use the covariance matrix directly in the chisqr calculation (I've done this in other contexts - e.g., the BEAST).
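A minimal sketch of that likelihood term, assuming a full covariance matrix C on the observed fluxes: the chi-square becomes rᵀC⁻¹r for the residual vector r, rather than sum(r²/σ²). The data, model, and covariance values below are toy numbers:

```python
import numpy as np

def chisqr(data, model, cov):
    """Chi-square with correlated uncertainties: r^T C^-1 r."""
    r = np.asarray(data, dtype=float) - np.asarray(model, dtype=float)
    # Solve the linear system rather than forming an explicit inverse.
    return float(r @ np.linalg.solve(cov, r))

data = np.array([1.0, 2.0, 3.0])
model = np.array([1.1, 1.9, 3.2])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.04, 0.01],
                [0.00, 0.01, 0.04]])
chi2 = chisqr(data, model, cov)
```

With a diagonal covariance this reduces exactly to the usual sum(r²/σ²) form.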

Or do you mean on the fitted model parameters? MCMC can capture this information. And we could assume multi-variate Gaussians again to give a measure of the covariance between parameters. Although the full sampler chains could also be saved to give the full correlations between parameters (w/o assumptions on the functional form).
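The chain-based option above can be sketched directly: given saved sampler chains (rows = draws, columns = parameters), `np.cov` gives the parameter covariance with no assumption about the functional form of the posterior. The "chain" here is synthetic, just to show the shape of the calculation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "chain" of 10000 draws for two strongly correlated parameters.
x = rng.normal(size=10000)
chain = np.column_stack([x, 0.9 * x + 0.1 * rng.normal(size=10000)])

cov = np.cov(chain, rowvar=False)        # (n_param, n_param) covariance
corr = np.corrcoef(chain, rowvar=False)  # correlation matrix
```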

jdtsmith commented 5 years ago

I mean correlated uncertainties for the fitted parameters. So that if you want to add 5 parameters together, but they are 100% correlated, you don't add their uncertainties in quadrature. In IDL PAHFIT I had this hard coded for the case of combining feature powers for PAH Complexes, but if there is a more general approach, I'd be in favor of building it in.
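A worked illustration of that point: for five fully correlated parameters with σ = 0.1 each, quadrature badly underestimates the uncertainty of their sum, while the full-covariance calculation recovers the correct linear scaling (n·σ):

```python
import numpy as np

sigma, n = 0.1, 5
cov_full = np.full((n, n), sigma ** 2)  # 100% correlated parameters
ones = np.ones(n)

sigma_correct = np.sqrt(ones @ cov_full @ ones)  # n * sigma = 0.5
sigma_quadrature = np.sqrt(n) * sigma            # ~0.224: wrong here
```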

els1 commented 5 years ago

I agree that such functionality is very useful and I'd be in favour of including it.