Closed: awhoward closed this issue 9 months ago
I think there is a path issue with the Junk_file_list. Outside of Docker, I see the file at /data/kpf/reference/Junk_Observations_for_KPF.csv. But when I run the script inside Docker, I get this error:
```
root@shrek:/code/KPF-Pipeline/scripts# python kpf_processing_progress.py 20231207 20231208
File of junked observations not found: /data/kpf/reference/Junk_Observations_for_KPF.csv
Junked file not ignored.
DATECODE | LAST L0 MOD DATE | 2D PROCESSING | L1 PROCESSING | L2 PROCESSING
------------------------------------------------------------------------------
------------------------------------------------------------------------------
```
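For reference, the behavior in the output above suggests the script does a check roughly like the following before falling back to not ignoring junked files. This is a minimal sketch, not the actual KPF-Pipeline code: the function name is hypothetical, the path and messages are taken from the error output, and the CSV is assumed to list one observation ID per row in the first column.

```python
import csv
import os

# Default location of the junk-observation list (from the error message above).
JUNK_FILE = "/data/kpf/reference/Junk_Observations_for_KPF.csv"

def load_junk_observations(path=JUNK_FILE):
    """Return a set of junked observation IDs.

    If the file is missing (e.g. /data/kpf/reference is not mounted
    inside the Docker container), warn and return an empty set so the
    script can continue without ignoring junked observations.
    """
    if not os.path.isfile(path):
        print(f"File of junked observations not found: {path}")
        print("Junked file not ignored.")
        return set()
    with open(path, newline="") as f:
        # Assumption: the first column of each row holds the observation ID.
        return {row[0] for row in csv.reader(f) if row}
```

If that is roughly what the script does, the fix is likely a Docker volume mount (e.g. making /data/kpf/reference visible in the container) rather than a code change.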
I rewrote the script that checks on KPF processing status. It's now in Python and runs a lot faster. It also incorporates the two features requested in Issue https://github.com/Keck-DataReductionPipelines/KPF-Pipeline/issues/737.
Planned QC and Diagnostics checks will put keywords in the headers that can be checked by this script to avoid reprocessing files that have known problems.
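The skip logic for such keywords could look something like the sketch below. This is hypothetical: the keyword name `QCPASS` and the function are illustrative, not the actual KPF header scheme, and a real implementation would read the header with `astropy.io.fits` rather than take a plain mapping.

```python
def should_reprocess(header, qc_keyword="QCPASS"):
    """Return True if a file should be (re)processed.

    `header` is any mapping of FITS keyword -> value (e.g. an
    astropy.io.fits.Header). Hypothetical policy: a file whose QC
    keyword is present and False has a known, already-diagnosed
    problem, so the progress script skips it instead of repeatedly
    flagging it for reprocessing.
    """
    if qc_keyword not in header:
        return True                   # no QC result recorded yet: process it
    return bool(header[qc_keyword])   # failed QC (False): known problem, skip
```

With this policy, only files that have never been QC-checked, or that passed QC but are missing downstream products, would show up as needing processing.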
Here's the new docstring that explains how it works.