Closed. awhoward closed this issue 8 months ago.
Here is a file from Jump listing all exposures from August 2023 with etalon light in the CAL fiber and Texp >= 60 sec. It contains ~2000 lines, which is a lot, but that is the scale we will need a solution for.
Here is a smaller file listing all dark exposures (20 min) from August 2023. This one has 155 lines.
Here's some advice from ChatGPT about avoiding reading the image extensions to keep memory usage low.
This Issue captures notes from BJ and Andrew about creating 'monthly' plots, e.g. a time series of some parameter (RV, dark noise, a temperature) over time.
First, we will need a list of appropriate files. We considered querying the Jump database and KOA. The basic query will select a list of files (ObsIDs) matching a date range, an observation type, and perhaps other parameters. If the querying code is Jump-specific, it should go in the KPF-CPS repository.
Second, in the KPF-DRP repository, we should make plotting methods (parallel to
modules/Utils/analyze_l0.py
). These will take a list of ObsIDs (or paths) as input and write a PNG to a specified directory. We will need to determine how to extract the information for each keyword. Reading the files with the KPF data model would be very memory-intensive; we might instead use the header-only reading method in Astropy's FITS I/O (check this).
We also talked about running this as a scheduled cron job that starts a Docker instance and runs a special recipe of the KPF DRP. BJ has experience setting this up.
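For reference, a crontab entry along these lines could drive the job; the image name, mount paths, and recipe/config filenames below are all hypothetical placeholders, and the exact DRP invocation would need to follow whatever BJ sets up:

```shell
# Sketch crontab entry: run the monthly-plot recipe nightly at 02:00 UT in a
# fresh container. Image name, mounts, and recipe/config paths are placeholders.
0 2 * * * docker run --rm -v /data/kpf:/data kpf-drp:latest \
    kpf --recipe recipes/monthly_plots.recipe --config configs/monthly_plots.cfg
```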
As a proof of concept, let's use Jump on the web to make a list of, e.g., etalon exposures in a one-month period. Then, in a Jupyter notebook, plot one of the RVs for each exposure over time.