rice-solar-physics / pydrad

Python tools for setting up HYDRAD runs and parsing output
https://pydrad.readthedocs.io
MIT License

Added function to calculate column emission measure #110

Closed · jwreep closed this 4 years ago

jwreep commented 4 years ago

In parse.py, in the Profile class, I've added a function to calculate the column emission measure for a profile snapshot.

from pydrad.parse import Strand
from matplotlib import pyplot as plot
s = Strand('/path/to/sim')
p = s[100]
EMc, bins = p.column_emission_measure()
plot.step(bins, EMc)
plot.yscale('log')
plot.show()

and so on. The function uses a default binning range of log T = [3.0, 8.0), in steps of dlog T = 0.05.
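For reference, a rough sketch of the kind of calculation being added (the function and variable names here are illustrative, not necessarily what the PR uses; each grid cell contributes n_e * n_H * ds to the bin containing its temperature):

import numpy as np

def column_emission_measure_sketch(n_e, n_H, T_e, ds, bins=None):
    # Default bin edges span log T = 3.0 to 8.0 in steps of dlog T = 0.05
    if bins is None:
        bins = 10**np.linspace(3.0, 8.0, 101)  # K
    em = np.zeros(len(bins) - 1)
    for i in range(len(bins) - 1):
        # Sum n_e * n_H * ds over all cells whose temperature falls in this bin
        in_bin = (T_e >= bins[i]) & (T_e < bins[i + 1])
        em[i] = np.sum(n_e[in_bin] * n_H[in_bin] * ds[in_bin])
    return em, bins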

coveralls commented 4 years ago

Pull Request Test Coverage Report for Build 190


Changes Missing Coverage    Covered Lines    Changed/Added Lines    %
pydrad/parse/parse.py       1                9                      11.11%

Totals
Change from base Build 184: -0.7%
Covered Lines: 311
Relevant Lines: 579

wtbarnes commented 4 years ago

So in this case, what is the assumed geometry? Is it that the loop is entirely contained within a single pixel and oriented completely along the LOS?

jwreep commented 4 years ago

Yes, single pixel, and the LOS depth is set equivalent to the length of a grid cell. This can be thought of as being oriented along the LOS, or simply that we've set the loop width equivalent to the grid cell width (done in e.g. Bradshaw et al 2012). Alternatively, we could take the loop width or cross-section as an input and calculate it that way.

I would prefer to assume the loop fits within a pixel (spatially unresolved); otherwise we would need to somehow specify a pixel size. Not terribly hard to do, but it requires a bit more work.
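In other words, with the LOS depth set to the grid cell width, each cell contributes n_e * n_H * ds to the bin containing its temperature, so the column EM in a given temperature bin is roughly EM_col = sum(n_e * n_H * ds) over the cells in that bin, in cm^-5.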

wtbarnes commented 4 years ago

I think a slightly more efficient way of doing this might be to use the numpy.histogram2d function and then sum along the spatial dimension, e.g.

weights = self.electron_density * self.ion_density * self.grid_widths
H, _, _ = np.histogram2d(self.grid_centers, self.electron_temperature, bins=(self.grid_edges, bins), weights=weights)
return bins, H.sum(axis=0)

This avoids a loop over the coordinates, which could potentially be slow for very long loops or in cases where the resolution is really high.
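For anyone reading later, here is a self-contained toy version of that pattern (all of the arrays below are made up for illustration; in the actual method the densities, grid, and temperatures come from the Profile attributes):

import numpy as np

# Fake uniform grid of 1000 cells along the loop (illustrative only)
grid_edges = np.linspace(0, 1e9, 1001)                 # cm
grid_centers = 0.5 * (grid_edges[1:] + grid_edges[:-1])
grid_widths = np.diff(grid_edges)                      # cm
n_e = 1e9 * np.ones(grid_centers.size)                 # cm^-3
n_i = 1e9 * np.ones(grid_centers.size)                 # cm^-3
T_e = 10**np.random.uniform(4, 7, grid_centers.size)   # K
temperature_bins = 10**np.linspace(3.0, 8.0, 101)      # K

# Weight each cell by its column EM contribution, n_e * n_i * ds
weights = n_e * n_i * grid_widths
# 2D histogram over (position, temperature); summing over the spatial axis
# leaves the column EM per temperature bin
H, _, _ = np.histogram2d(grid_centers, T_e,
                         bins=(grid_edges, temperature_bins),
                         weights=weights)
em_column = H.sum(axis=0)  # cm^-5, length len(temperature_bins) - 1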

wtbarnes commented 4 years ago

> Yes, single pixel, and the LOS depth is set equivalent to the length of a grid cell. This can be thought of as being oriented along the LOS, or simply that we've set the loop width equivalent to the grid cell width (done in e.g. Bradshaw et al 2012). Alternatively, we could take the loop width or cross-section as an input and calculate it that way.
>
> I would prefer to assume the loop fits within a pixel (spatially unresolved); otherwise we would need to somehow specify a pixel size. Not terribly hard to do, but it requires a bit more work.

Ok that sounds good to me and I'm completely fine with that. It is still a really useful convenience function. I think we just need to make this clear in the docstring.

codecov[bot] commented 4 years ago

Codecov Report

Merging #110 into master will decrease coverage by 0.22%. The diff coverage is 28.57%.


@@            Coverage Diff             @@
##           master     #110      +/-   ##
==========================================
- Coverage   43.64%   43.41%   -0.23%     
==========================================
  Files          10       10              
  Lines         456      463       +7     
==========================================
+ Hits          199      201       +2     
- Misses        257      262       +5     
Impacted Files Coverage Δ
pydrad/parse/parse.py 34.30% <28.57%> (-0.25%) ↓


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update ac43514...70ff716.

jwreep commented 4 years ago

It's been updated. One minor annoyance: the EM and bin arrays that are returned have unequal lengths because the final element of bins is interpreted as the right edge of the last bin. So to plot, you would need to use bins[:-1] or something. Are we okay with this?

wtbarnes commented 4 years ago

Yeah this is always the difficulty of dealing with histograms.

I kind of prefer including the rightmost edge as it means your bins are always fully specified and can all be of arbitrary width. It also means you can just do a quick plot of the values versus the bin center positions ((bins[1:] + bins[:-1])/2).
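For instance, with EMc and bins as returned in the usage example at the top of the thread (and the same pyplot alias), that quick plot would just be:

bin_centers = (bins[1:] + bins[:-1]) / 2
plot.step(bin_centers, EMc, where='mid')
plot.yscale('log')
plot.show()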

We could also just make bins a required input (rather than an optional keyword argument). That way, we wouldn't have to worry about returning it. Of course, this would mean there was no default value which may be slightly annoying.

An annoyance I've always had with matplotlib is that, while it is easy to compute and plot a histogram, it is a pain to plot a histogram from a set of bins and values. I've just been dragging this snippet of code around with me every time I make a histogram,

def plot_hist(ax, vals, bins, **kwargs):
    """
    Plot a histogram from bin values

    Parameters
    ----------
    ax : Matplotlib axis
    vals : value in each bin
    bins : Bin edges, including the rightmost edge
    kwargs : Plotting keyword arguments
    """
    ymin = ax.get_ylim()[0]
    # Step through all but the last bin, holding each value constant until the next edge
    ax.step(bins[:-1], vals, where='post', **kwargs)
    # Only label the first call so the legend does not pick up duplicate entries
    if 'label' in kwargs:
        del kwargs['label']
    # Draw the final bin explicitly, since the 'post' step above stops at bins[-2]
    ax.step(bins[-2:], [vals[-1], vals[-1]], where='pre', **kwargs)
    # Close off the histogram at the left and right edges
    ax.vlines(bins[0], ymin, vals[0], **kwargs)
    ax.vlines(bins[-1], ymin, vals[-1], **kwargs)

which makes sure to get the positioning of the bins correct and handles plotting the edges as well.
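As a usage sketch (again reusing EMc and bins from the earlier example, which have exactly the lengths plot_hist expects):

import matplotlib.pyplot as plt

fig, ax = plt.subplots()
plot_hist(ax, EMc, bins, color='C0')
ax.set_xscale('log')
ax.set_yscale('log')
ax.set_xlabel('T [K]')
ax.set_ylabel('column EM [cm^-5]')
plt.show()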

jwreep commented 4 years ago

I would prefer to have a default value for the binning. I noticed the same thing with Matplotlib when I was testing this.

Since the plotting can be a bit vexing, it might be apropos to introduce a peek_EM or some such function to make the quick look even easier. This could also be modified to optionally calculate a volume EM or a differential EM, which are just as easy to calculate if we use similar assumptions.
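A minimal sketch of what such a quick-look helper might look like (the name peek_emission_measure and its signature are hypothetical, not what would necessarily land in the PR):

import matplotlib.pyplot as plt

def peek_emission_measure(profile, **kwargs):
    # Assumes the profile exposes column_emission_measure() returning the EM
    # values and the bin edges, as in the usage example at the top of the thread
    em, bins = profile.column_emission_measure(**kwargs)
    bin_centers = (bins[1:] + bins[:-1]) / 2
    fig, ax = plt.subplots()
    ax.step(bin_centers, em, where='mid')
    ax.set_xscale('log')
    ax.set_yscale('log')
    ax.set_xlabel('T [K]')
    ax.set_ylabel('column EM [cm^-5]')
    plt.show()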