PAHFIT / pahfit

Model Decomposition for Near- to Mid-Infrared Spectroscopy of Astronomical Sources
https://pahfit.readthedocs.io/

Trimmed model tool and PAHFITBase changes #165

Closed drvdputt closed 2 years ago

drvdputt commented 2 years ago

The part that creates param_info (originally part of read()) has been split off into a new function called parse_table().

Functionality of read() has not changed, so nothing should break by doing this.

Main use case for parse_table() is adjusting science packs on-the-fly in Python:

  1. Read in science pack into astropy Table
  2. Edit Table (e.g. remove lines based on wavelength range)
  3. Create param_info using parse_table()
  4. Continue as usual, by passing the customized param_info to PAHFITBase
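The trimming step at the heart of this workflow is just a row filter on the feature table. A minimal stand-alone sketch (plain dicts with hypothetical field names; in PAHFIT the rows would come from an astropy Table read from a science pack):

```python
# Stand-alone illustration of step 2 above: drop features outside the
# wavelength range of the data. Field names and values are hypothetical.

def trim_rows(rows, wmin, wmax):
    """Keep only features whose central wavelength falls in [wmin, wmax]."""
    return [r for r in rows if wmin <= r["wavelength"] <= wmax]

pack = [
    {"name": "DF_6.2", "kind": "Drude1D", "wavelength": 6.22},
    {"name": "DF_17.0", "kind": "Drude1D", "wavelength": 16.45},
    {"name": "[NeII]", "kind": "Gaussian1D", "wavelength": 12.81},
]

# Fitting only a 5-14 micron segment drops the 16.45 micron Drude.
trimmed = trim_rows(pack, 5.0, 14.0)
print([r["name"] for r in trimmed])  # → ['DF_6.2', '[NeII]']
```

The filtered rows would then go through parse_table() to produce the param_info passed to PAHFITBase.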

I already have a tool that makes use of this, to create "trimmed" PAHFITBase models based on the wavelength range of the input data. It could be added to PAHFIT later, so that science packs do not have to be adjusted when only one spectral segment needs to be fit. This will also be relevant for #158.

Other small changes: read() and plot() are now static methods.

drvdputt commented 2 years ago

I have added the model trimming tool mentioned above to helpers.py, in another branch. There are some more changes to parse_table too, since it crashed if there were 0 Gaussians or Drudes in the table.

Should I push those changes too?

karllark commented 2 years ago

> I have added the model trimming tool mentioned above to helpers.py, in another branch. There are some more changes to parse_table too, since it crashed if there were 0 Gaussians or Drudes in the table.
>
> Should I push those changes too?

Yes please.

drvdputt commented 2 years ago

While trying out my new tool, I found some ways in which PAHFIT can crash for certain science pack configurations.

The main failure mode is when there are zero entries of a certain feature type in the pack file, e.g. no Drude1D or no Gaussian1D components. This also happens when there are, for example, no dust features in the given wavelength range, and the trimming tool consequently removes all Drude1D entries.

Is this a use case we want to support? The crash happens at model construction, and fixing it should be straightforward at least for that step. But of course, there might be some other hiccups in other parts of the program (e.g. feature strengths, plotting...).

Writing a set of tests for this would be straightforward I think. Just create some science packs where all the features of one type are removed.
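The tests suggested above could look something like the following stand-alone sketch (hypothetical row fields and helper names; the real tests would build astropy Tables and run parse_table() and PAHFITBase on them):

```python
# Sketch of the proposed tests: build a pack, drop all features of one
# type, and check that downstream construction can cope with the gap.

def drop_kind(rows, kind):
    """Remove every feature of the given functional form."""
    return [r for r in rows if r["kind"] != kind]

def component_or_none(rows, kind):
    """Mimic the parse_table fix: absent components become None, not []."""
    selected = [r for r in rows if r["kind"] == kind]
    return selected or None

pack = [
    {"name": "DF_6.2", "kind": "Drude1D"},
    {"name": "[NeII]", "kind": "Gaussian1D"},
    {"name": "cont_300K", "kind": "BlackBody1D"},
]

for kind in ("Drude1D", "Gaussian1D", "BlackBody1D"):
    trimmed = drop_kind(pack, kind)
    # The removed component should come back as None ...
    assert component_or_none(trimmed, kind) is None
    # ... while the other components survive.
    assert len(trimmed) == 2
```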

jdtsmith commented 2 years ago

@drvdputt my MBB PR #176 should fix the crashing behavior you observed. In terms of this PR, I like the concept of on-the-fly adjustments (which will be necessary for our active tuning on JWST spectra). But given our joint interest in moving to a more flexible & composable "pack" setup (instrument (PAHFit.packs.instrument class) + science (.yaml files)), perhaps we should discuss these changes in that context? We can chat about it Monday.

drvdputt commented 2 years ago

I have written a test (see pahfit/tests/test_model_trimming) which runs the construction of PAHFITBase (parse_table() and __init__()) with different model components removed from the table. This covers: no blackbodies, no Gaussians, no Drudes, and no Drude attenuation.

Then I fixed the crashes that this caused. #176 already added safeguards against param_info[i] is None in __init__(), but they did not take effect yet, because param_info was never actually set to None. Therefore, parse_table will now set the components of the param_info tuple to None when applicable. Following this addition, I added similar safeguards to estimate_init.

This has stopped the crashing, at least for the construction of PAHFITBase. We can address any other issues down the line, when people (me) actually start using the models with blackbody/Gaussian/Drude components removed.
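The None-handling pattern described above can be sketched in miniature as follows (illustrative names only, not PAHFIT's actual variables: the parser returns None for a missing component type, and the consumer skips None entries when assembling the model):

```python
# Minimal sketch of the None-guard pattern: parse step emits None for
# absent component types; build step skips them instead of crashing.

def parse_components(rows):
    """Group rows by functional form; absent forms become None."""
    grouped = {}
    for kind in ("BlackBody1D", "Drude1D", "Gaussian1D"):
        selected = [r for r in rows if r["kind"] == kind]
        grouped[kind] = selected or None  # None, not [], when empty
    return grouped

def build_model(param_info):
    """Combine only the components that are actually present."""
    terms = []
    for kind, rows in param_info.items():
        if rows is None:  # safeguard: skip absent component types
            continue
        terms.extend(f"{kind}({r['name']})" for r in rows)
    return " + ".join(terms)

pack = [{"name": "[NeII]", "kind": "Gaussian1D"}]  # no Drudes, no blackbodies
model = build_model(parse_components(pack))
print(model)  # → Gaussian1D([NeII])
```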

So this is close to mergeable.

jdtsmith commented 2 years ago

> I have written a test (see pahfit/tests/test_model_trimming) which runs the construction of PAHFITBase (parse_table() and __init__()) with different model components removed from the table. This covers: no blackbodies, no Gaussians, no Drudes, and no Drude attenuation.

Probably fine to merge now, but just be aware some details of the features table will be changing slightly, so the tests which target it will need a few updates. Hopefully nothing too drastic.

[Edit] BTW, here's the nomenclature I'm suggesting we adopt. So the new table-thingy will be a feature table, and each feature (like say a [NeII] line) will have various parameters (like wavelength, fwhm, power), and each parameter will have attributes (like value, bounds).
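One way to picture that feature/parameter/attribute hierarchy is as nested mappings (purely illustrative values; the eventual Features table will have its own concrete layout):

```python
# Illustration of the suggested nomenclature: a feature table holds
# features, each feature has parameters, each parameter has attributes.

feature_table = {
    "[NeII]": {                        # a feature
        "wavelength": {                # a parameter ...
            "value": 12.81,            # ... with attributes
            "bounds": (12.78, 12.84),
        },
        "fwhm": {"value": 0.1, "bounds": (0.05, 0.2)},
        "power": {"value": 1.0, "bounds": (0.0, None)},
    },
}
```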

drvdputt commented 2 years ago

Ok, I would like to get this merged today. I will rebase or merge to integrate the latest changes. We can then adjust this code to the new YAML science packs whenever that system is put to use.

jdtsmith commented 2 years ago

Going to merge this now, on the notion that @drvdputt will help @Ameek-Sidhu with connecting in the YAML-based Features table, and with moving fit/read/etc. from helpers into the model object.