heliohackweek / mms_data_hunt

https://heliohackweek.github.io/mms_data_hunt/

Issue with get_fpi in memes/retrieve_mms.py #8

Open · ericthewizard opened this issue 4 years ago

ericthewizard commented 4 years ago

I just stumbled on your project and wanted to give you a heads-up about a possible issue, in case you're not aware:

The FPI (and HPCA) moments data in the CDF files are timestamped at the start of the accumulation interval, while the FGM data (and the data from most other instruments) are timestamped at the middle of the measurement interval. This means you'll have to shift the plasma (FPI/HPCA) timestamps to the center of the accumulation interval before doing any analysis/processing.
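If you ever need to do that shift yourself, it amounts to adding half the accumulation interval to each epoch. Here's a minimal sketch, assuming the raw epochs are unix seconds and using an illustrative 30 ms accumulation interval (roughly the burst-mode DES value; burst-mode DIS is longer), with made-up timestamps:

import numpy as np

# hypothetical DES epochs (unix seconds), tagged at the START of each
# accumulation interval -- these values are made up for illustration
epochs_start = np.array([1445000760.000, 1445000760.030, 1445000760.060])

accum_interval = 0.030  # seconds; assumed burst-mode DES accumulation interval

# shift each timestamp to the CENTER of its accumulation interval
epochs_centered = epochs_start + accum_interval / 2.0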

Fortunately, this is already handled by pySPEDAS with the center_measurement option to the FPI and HPCA load routines; you can load the FPI and FGM data into numpy arrays using pySPEDAS with the notplot option:

import pyspedas

# load burst-mode FGM data as dictionaries of numpy arrays (notplot=True)
fgm_np = pyspedas.mms.fgm(trange=['2015-10-16/13:06', '2015-10-16/13:07'], data_rate='brst', notplot=True)

# load burst-mode FPI data, with timestamps shifted to the center of the accumulation interval
fpi_np = pyspedas.mms.fpi(trange=['2015-10-16/13:06', '2015-10-16/13:07'], data_rate='brst', center_measurement=True, notplot=True)

fgm_np and fpi_np are dictionaries whose keys map to dictionaries containing the data, e.g.,

>>> fgm_np.keys()
dict_keys(['mms1_fgm_b_gse_brst_l2', 'mms1_fgm_b_gsm_brst_l2', 'mms1_fgm_b_dmpa_brst_l2', 'mms1_fgm_b_bcs_brst_l2', 'mms1_fgm_flag_brst_l2', 'mms1_fgm_r_gse_brst_l2', 'mms1_fgm_r_gsm_brst_l2', 'mms1_fgm_hirange_brst_l2', 'mms1_fgm_bdeltahalf_brst_l2', 'mms1_fgm_stemp_brst_l2', 'mms1_fgm_etemp_brst_l2', 'mms1_fgm_mode_brst_l2', 'mms1_fgm_rdeltahalf_brst_l2'])
>>> fgm_np['mms1_fgm_b_gse_brst_l2'].keys()
dict_keys(['x', 'y'])
>>> fpi_np.keys()
dict_keys(['mms1_des_errorflags_brst', 'mms1_des_compressionloss_brst', 'mms1_des_startdelphi_count_brst', 'mms1_des_startdelphi_angle_brst', 'mms1_des_sector_despinp_brst', 'mms1_des_pitchangdist_lowen_brst', 'mms1_des_pitchangdist_miden_brst', 'mms1_des_pitchangdist_highen_brst', 'mms1_des_energyspectr_px_brst', 'mms1_des_energyspectr_mx_brst', 'mms1_des_energyspectr_py_brst', 'mms1_des_energyspectr_my_brst', 'mms1_des_energyspectr_pz_brst', 'mms1_des_energyspectr_mz_brst', 'mms1_des_energyspectr_par_brst', 'mms1_des_energyspectr_anti_brst', 'mms1_des_energyspectr_perp_brst', 'mms1_des_energyspectr_omni_brst', 'mms1_des_numberdensity_brst', ....
>>> fpi_np['mms1_des_energyspectr_omni_brst'].keys()
dict_keys(['x', 'y', 'v'])

The x keys map to the unix timestamps (stored as a numpy array), the y keys map to the data values (also stored as a numpy array), and for spectra, the v keys map to the "y" axis values: the energy tables for energy spectra, and the angles for pitch-angle distributions. E.g.,

>>> fpi_np['mms1_des_energyspectr_omni_brst']['x']
array([1.44500072e+09, 1.44500072e+09, 1.44500072e+09, ...,
       1.44500086e+09, 1.44500086e+09, 1.44500086e+09])
>>> fpi_np['mms1_des_energyspectr_omni_brst']['v']
array([[1.241000e+01, 1.591000e+01, 2.040000e+01, ..., 1.677878e+04,
        2.151425e+04, 2.758621e+04],
       [1.096000e+01, 1.405000e+01, 1.802000e+01, ..., 1.481758e+04,
        1.899954e+04, 2.436178e+04],
       [1.241000e+01, 1.591000e+01, 2.040000e+01, ..., 1.677878e+04,
        2.151425e+04, 2.758621e+04],
       ...,
       [1.241000e+01, 1.591000e+01, 2.040000e+01, ..., 1.677878e+04,
        2.151425e+04, 2.758621e+04],
       [1.096000e+01, 1.405000e+01, 1.802000e+01, ..., 1.481758e+04,
        1.899954e+04, 2.436178e+04],
       [1.241000e+01, 1.591000e+01, 2.040000e+01, ..., 1.677878e+04,
        2.151425e+04, 2.758621e+04]], dtype=float32)
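As a quick illustration of how the x/y/v arrays fit together, here's a minimal quick-look spectrogram sketch using matplotlib (the variable name comes from the listing above; the log color scale, axis labels, and the choice to reuse the first row of the energy table are my assumptions):

import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

# quick-look spectrogram from the dictionaries above:
# 'x' = unix times, 'v' = (time-varying) energy table, 'y' = spectrogram values
spec = fpi_np['mms1_des_energyspectr_omni_brst']
t, energies, values = spec['x'], spec['v'], spec['y']

fig, ax = plt.subplots()
# use the first row of the energy table for the y axis (assumes it varies little over the interval)
mesh = ax.pcolormesh(t, energies[0, :], values.T, norm=LogNorm(), shading='auto')
ax.set_yscale('log')
ax.set_xlabel('Unix time (s)')
ax.set_ylabel('Energy (eV)')
fig.colorbar(mesh, label='DES omni energy spectrum')
plt.show()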

When you load the data with the center_measurement=True option, the timestamps should be shifted to the middle of the accumulation interval.
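Once the FPI timestamps are centered, putting the plasma and field data on a common time base is just an interpolation. A minimal sketch with numpy, reusing the variable names from the listing above (the column layout of the FGM variable is my assumption):

import numpy as np

# unix timestamps and values pulled straight from the dictionaries above
fgm_t = fgm_np['mms1_fgm_b_gse_brst_l2']['x']
fgm_b = fgm_np['mms1_fgm_b_gse_brst_l2']['y']  # columns: B components (plus |B|, I believe)

des_t = fpi_np['mms1_des_numberdensity_brst']['x']  # already centered via center_measurement=True
des_n = fpi_np['mms1_des_numberdensity_brst']['y']

# interpolate each B column onto the (centered) electron timestamps
b_on_des = np.column_stack([np.interp(des_t, fgm_t, fgm_b[:, i]) for i in range(fgm_b.shape[1])])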

Hope this helps!

I'm always available if you have any questions: egrimes (at) igpp.ucla.edu

edmondb commented 4 years ago

Hey! Thank you for pointing that out. We're actually a group of people participating in a Heliophysics HackWeek, with various skill levels, backgrounds, and knowledge of MMS. We'd love to have you collaborate with us; however, this HackWeek is ending today. Several of us want to continue, so your expertise would be very welcome.