Hello! Thanks for all your work on this project, it's a great resource for the community.
I'm opening this pull request to share an addition to the package that we think could be beneficial to others, but we also understand if it's not something you're interested in adding to the official project. It's a fairly large feature, but almost all of the code is duplicated from pre-existing parts of the codebase, so I'm hoping it's fairly easy to review should you choose to do so.
Motivation
I am interested in fitting for the stellar intensity profile alongside the actual parameters of interest when analyzing transit data. The overall goal is to remain as Bayesian as possible and to propagate our uncertainty in the stellar intensity profile through to the final results. To do this, I could fit for the coefficients of a given limb-darkening law directly, either with an "uninformative" prior (e.g. the Kipping 2013 $q$-parameterization), or with a more complicated prior derived from stellar models (maybe a KDE based on many profiles derived from sampling the stellar uncertainties and using StellarLimbDarkening? Maybe using the covariances returned by any of the StellarLimbDarkening fitting methods?). This is a little clunky though, and leaves open the possibility that I'll explore a region of parameter space that is not physical.
Alternatively, instead of fitting for the limb-darkening coefficients themselves, I could fit for the physical parameters that underpin them: the star's effective temperature, surface gravity, and metallicity. These often have well-constrained uncertainties from previous measurements, and working in this parameterization guarantees that each resulting coefficient vector corresponds to a physically plausible stellar model.
I could use StellarLimbDarkening to do this, and simply compute a new set of limb-darkening coefficients for every likelihood evaluation. However, since StellarLimbDarkening needs to open a saved stellar spectrum file and integrate over the provided bandpass before it gets to actually fitting the coefficients, this is a little slow. For transit models like batman or jaxoplanet, computing the limb-darkening coefficients this way takes much longer than a single transit model evaluation, meaning the limb-darkening computation becomes the bottleneck in an MCMC or nested sampling run.
For example, the following snippet simulates how long it might take to compute 100 samples of limb-darkening coefficients in something like an MCMC run:
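(The grid, data path, and instrument mode below are just illustrative choices; swap in whatever your analysis uses.)

```python
import numpy as np
from exotic_ld import StellarLimbDarkening

# Draw 100 effective-temperature samples, as an MCMC might propose.
rng = np.random.default_rng(42)
teff_samples = rng.normal(5500., 100., size=100)

for teff in teff_samples:
    # Each construction re-reads a stellar spectrum from disk...
    sld = StellarLimbDarkening(
        M_H=0.0, Teff=teff, logg=4.5,
        ld_model="mps1", ld_data_path="exotic_ld_data",
    )
    # ...and re-integrates it over the bandpass before fitting.
    u1, u2 = sld.compute_quadratic_ld_coeffs(
        wavelength_range=[10000., 20000.],  # angstroms
        mode="JWST_NIRSpec_Prism",
    )
```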
This takes 1.36 s on my laptop, meaning each evaluation already costs >10 ms before we even get to the transit model.
Proposed addition
I've added a new class, PrecomputedLimbDarkening, which is very similar to StellarLimbDarkening but differs in one key way: before computing any coefficients, it first loops through every stellar model in a given grid and pre-computes the $I(\mu)$ profile for a given bandpass. It then keeps these profiles in memory, so that when asked to compute coefficients for a new set of stellar parameters, it can avoid any I/O and the expensive integration step. That means each evaluation is much faster, and the bottleneck is now the transit model evaluation itself:
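Schematically, with the same illustrative grid and bandpass as above (the exact signatures are in the diff; this sketch just mirrors the StellarLimbDarkening API):

```python
import numpy as np
from exotic_ld import PrecomputedLimbDarkening

# One-off setup: loop over the grid and integrate every model's
# spectrum over the bandpass, keeping the I(mu) profiles in memory.
pld = PrecomputedLimbDarkening(
    ld_model="mps1", ld_data_path="exotic_ld_data",
    wavelength_range=[10000., 20000.],  # angstroms
    mode="JWST_NIRSpec_Prism",
)

rng = np.random.default_rng(42)
teff_samples = rng.normal(5500., 100., size=100)

for teff in teff_samples:
    # Per-sample cost is now just a profile lookup/interpolation
    # plus the coefficient fit: no I/O, no bandpass integration.
    u1, u2 = pld.compute_quadratic_ld_coeffs(M_H=0.0, Teff=teff, logg=4.5)
```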
This whole snippet now takes 77 ms on my laptop, meaning each sample now costs <1 ms.
In addition to the PrecomputedLimbDarkening, I've also included a new file in tests to verify that the new class is working as expected, and a new tutorial notebook to demonstrate how to use it.
Misc notes
When using interpolation_type="nearest", the results of PrecomputedLimbDarkening are identical to those of StellarLimbDarkening. However, when using interpolation_type="trilinear", the results will be slightly different, since StellarLimbDarkening interpolates across the stellar spectra themselves, while PrecomputedLimbDarkening interpolates across the pre-computed $I(\mu)$ profiles.
Each time a grid is pre-computed, it is stored in the local directory alongside the spectra and throughput files.
Since the phoenix and mps grids are quite large, by default each spectrum is deleted after its $I(\mu)$ profile has been computed for the provided bandpass. This can be changed by setting save_stellar_grid=True, which keeps the stellar grid in the local directory as usual.
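For example (again mirroring the constructor sketched above):

```python
# Keep the downloaded stellar spectra on disk after pre-computation,
# rather than deleting each one once its I(mu) profile is stored.
pld = PrecomputedLimbDarkening(
    ld_model="mps1", ld_data_path="exotic_ld_data",
    wavelength_range=[10000., 20000.], mode="JWST_NIRSpec_Prism",
    save_stellar_grid=True,
)
```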
Lastly, to provide an example of why this Bayesian approach might be useful for some science cases, here is a figure from the tutorial notebook that illustrates how a light curve might be affected by uncertainty in the stellar parameters:
The effect of a slight temperature change is clearly small, but it is comparable in magnitude to (and unfortunately at the same transit phase as) the expected effect of planetary oblateness, which in principle is now measurable with facilities like JWST.
Thanks again for this package, and for considering this addition! I'm happy to make any changes you think are necessary, and am looking forward to hearing your thoughts.