PAHFIT / pahfit

Model Decomposition for Near- to Mid-Infrared Spectroscopy of Astronomical Sources
https://pahfit.readthedocs.io/

Update base.py for YAML-based sci-pack fitting with astropy.model #222

Closed Ameek-Sidhu closed 1 year ago

Ameek-Sidhu commented 2 years ago

Updated the read() and parse_table() functions to properly read in the yaml file.

Ameek-Sidhu commented 2 years ago

@jdtsmith, can you please have a look at the base.py file? This PR is by no means complete; I just want your review to be sure that I am headed in the right direction. Once I have the instrument pack corresponding to classic.yml, I will update the code to incorporate the FWHMs of the lines. Following that, I will move the initialize_model and fit_spectrum methods from helper.py into base.py. What do you think?

jdtsmith commented 2 years ago

The key for this PR, I think, is to get it testing correctly with classic.yml and an IRS instrument pack. I'll work on the latter if you and @drvdputt can get things set up for the former.

jdtsmith commented 2 years ago

OK, see #226 for an IRS instrument pack. Test with this PR to see how it performs?

Ameek-Sidhu commented 2 years ago

This PR can now read in the YAML-based science pack and the instrument pack. However, it will only work for the instrument segments in iso.yaml and spitzer.yaml, not for a stitched spectrum. @jdtsmith, can you test it with the IRS segment spectrum you have?

Since I also updated run_pahfit.py, the following command is now used to run PAHFIT: `run_pahfit spectrumfile sciencepackfile instrumentname`

Ameek-Sidhu commented 2 years ago

Attached plots are for M101 and NGC 4051 in the SL2 range. These are fit using classic.yaml.

[Attached plots: M101_Nucleus_irs_cut, galaxy_spitzer_sl_2]

jdtsmith commented 2 years ago

> Attached plots are for M101 and NGC 4051 in the SL2 range. These are fit using classic.yaml.

Great. Did you include redshift in the fit?

Ameek-Sidhu commented 2 years ago

> > Attached plots are for M101 and NGC 4051 in the SL2 range. These are fit using classic.yaml.
>
> Great. Did you include redshift in the fit?

No, I did not correct the spectra for redshift. I just wanted to test whether the YAML file is read correctly.

drvdputt commented 2 years ago

Just a caveat I would like to write down here: the fitting output is still written in the old format. This will break the ability to read the output file back in as the starting point for a model. To solve this, the new setup should accept either a YAML file or a table file, but that will only work if the table file is in exactly the same format as the output of the YAML-to-table conversion.
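As an illustration of the either-YAML-or-table idea, here is a minimal dispatch sketch. The suffix sets and the `pack_format` helper are hypothetical, not the actual PAHFIT API:

```python
from pathlib import Path

# Hypothetical suffix sets; the real reader may accept other formats.
YAML_SUFFIXES = {".yaml", ".yml"}
TABLE_SUFFIXES = {".ipac", ".ecsv", ".tbl"}

def pack_format(pack_file):
    """Classify a science-pack file by extension so a single read()
    entry point can dispatch to either the YAML parser or the
    saved-fit-table parser."""
    suffix = Path(pack_file).suffix.lower()
    if suffix in YAML_SUFFIXES:
        return "yaml"
    if suffix in TABLE_SUFFIXES:
        return "table"
    raise ValueError(f"unrecognized science pack format: {pack_file}")
```

The caveat above still applies: the table branch only works if the saved table matches the YAML-to-table conversion format exactly.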

In the IPAC-table version of the code, the function PAHFITBase.save() goes over all the components and painstakingly puts the table together in the right format, so this function will need to be adjusted.

If, under the new class/API concept, we store both a model object and the table and keep them in sync, then this becomes trivial. But we would still need a similar function to update that table.

jdtsmith commented 2 years ago

Definitely need to round-trip the fitting parameters between input and output via the Features table. Remember that any line in the table which has a FWHM set on input should also be updated on output (but of course not for any fixed or absent line FWHMs). Also, some of the tests are failing, presumably because the IPAC table format is hard-coded.
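A sketch of that round-trip rule, using plain dicts as a stand-in for the Features table (the `update_fwhms` helper and its keys are hypothetical, not the real PAHFIT code):

```python
def update_fwhms(features, fitted_fwhms):
    """Copy fitted FWHM values back into the Features rows, but only for
    lines whose FWHM was free in the fit: rows with a fixed or absent
    FWHM on input are left untouched.

    `features` is a list of dicts with 'name', 'fwhm', and 'fwhm_fixed'
    keys; `fitted_fwhms` maps line name -> fitted FWHM value.
    """
    for row in features:
        if row.get("fwhm") is None or row.get("fwhm_fixed", False):
            continue  # absent or fixed FWHM: do not overwrite
        if row["name"] in fitted_fwhms:
            row["fwhm"] = fitted_fwhms[row["name"]]
    return features
```

With an astropy Table the loop would iterate over rows and masked columns instead, but the fixed/absent guard is the essential part.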

drvdputt commented 2 years ago

I was able to fix the test that I wrote, which simply checks that the table parser doesn't crash when there are no blackbodies, no Gaussians, etc. The other tests, however, have problems:

  1. They depend on the Spitzer science packs, for which we don't have YAML files yet.
  2. They load in a merged spectrum, for which we don't have instrument models yet.
  3. The test for the classic pack loads the pack from a file and also constructs one in a hardcoded manner. Since the new code is adjusting the line widths, the assertions at the end fail. Also, the new code does not allow the obsdata to be outside the instrument wavelength range, but the test data covers the range of a merged spectrum.

jdtsmith commented 2 years ago

Please see PR #229 for an attempt to facilitate fitting of stitched spectra by allowing multiple instruments to be specified per input spectrum, including a convenient glob-style.

Feel free to test together with this PR.

jdtsmith commented 2 years ago

Thanks @drvdputt for working on the tests.

> They depend on the Spitzer science packs, for which we don't have YAML files yet.

Maybe I'm missing something, but the classic.yaml pack supersedes this, and exists now.

> They load in a merged spectrum, for which we don't have instrument models yet.

See #229 above (untested with this PR as of yet... @jancami if you are looking for something ;).

> The test for the classic pack loads the pack from a file and also constructs one in a hardcoded manner. Since the new code is adjusting the line widths, the assertions at the end fail.

Makes sense. Perhaps we can convert this to test a single hand-input FWHM, which should be preserved on output.
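A sketch of what that single hand-input FWHM test might look like, with a dummy fit standing in for the real one (all names here are hypothetical):

```python
def fake_fit(features):
    """Stand-in for the real fit: returns a perturbed FWHM for every
    line whose FWHM was free, and nothing for fixed lines."""
    return {row["name"]: row["fwhm"] * 1.1
            for row in features
            if row["fwhm"] is not None and not row["fwhm_fixed"]}

def test_hand_input_fwhm_preserved():
    # One line with a hand-input, fixed FWHM.
    features = [{"name": "H2_S3", "fwhm": 0.053, "fwhm_fixed": True}]
    fitted = fake_fit(features)
    # The fixed line must not be among the updated values, and its
    # hand-input FWHM must survive the fit unchanged.
    assert "H2_S3" not in fitted
    assert features[0]["fwhm"] == 0.053
```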

> Also, the new code does not allow the obsdata to be outside the instrument wavelength range. But the test data covers the range of a merged spectrum.

Let's fix this with #229. But there's no need to throw an error: simply trim out lines that aren't within range of a segment. In newmodel I'm using 2.25 FWHMs on each side as the "impact" range for the Gaussians.
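A minimal sketch of that trimming rule, assuming each line is a (center, fwhm) pair in micron (the `lines_in_segment` helper is hypothetical):

```python
def lines_in_segment(lines, wave_min, wave_max, impact=2.25):
    """Keep only lines whose Gaussian has any impact inside a segment's
    wavelength range, taking `impact` FWHMs on each side of the line
    center as its footprint (2.25 as suggested above).

    `lines` is a list of (center, fwhm) pairs, a stand-in for rows of
    the Features table.
    """
    kept = []
    for center, fwhm in lines:
        lo = center - impact * fwhm
        hi = center + impact * fwhm
        if hi >= wave_min and lo <= wave_max:  # footprint overlaps segment
            kept.append((center, fwhm))
    return kept
```

Lines falling outside every segment are silently dropped rather than raising an error, per the comment above.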

jdtsmith commented 1 year ago

Hi @Ameek-Sidhu, can you roll back your most recent changes to the last clean state and re-target this PR to the new dev branch named "dev_yaml-scipack+api"? Then @drvdputt can submit his API PR to this dev branch, and on top of that we can figure out how to conveniently populate the Features table on guess/fit return.

jdtsmith commented 1 year ago

You don't want to merge that dev branch here, you want to re-target the PR to the new dev branch.

jdtsmith commented 1 year ago

Let's go ahead and merge this into the dev trunk, with the assumption that @drvdputt will pick up the pieces in his API cleanup PR.