COMPASS-DOE / data-workflows

Sensor data workflows and processing scripts
MIT License

1min and 5min TEMPEST data #112

bpbond closed this issue 1 month ago

bpbond commented 7 months ago

These have different tables and variable names, making them a PITA. I am removing them from the 0.9 release; we can discuss how to handle them for 1.0

bpbond commented 4 months ago

@stephpenn1 I am inclined to leave these TEMPEST-flood-specific detailed data out of the standard L1 data. Is that OK?

stephpenn1 commented 4 months ago

Yeah, I think that's okay? We could always publish an L2 version as a supplement later

bpbond commented 4 months ago

I think so, and it makes our lives a lot simpler now. I'll leave this issue open so we don't lose track of it, but remove from #115

Thanks!

bpbond commented 2 months ago

Coming back to this this morning, after discussion with @stephpenn1 and testing the behavior of the existing system.

The "_1min" and "_5min" tables are turned on and off by @roylrich before and after the TEMPEST floods... but sometimes they continue all year long, and the Dropbox data are inconsistent. To add them to our pipeline:

All in all this seems fairly straightforward, but thoughts welcome @stephpenn1 @roylrich

bpbond commented 2 months ago

This (third bullet above) is a VERY easy change in `L0.qmd` (commit 8b85ec2108262467544a823bfc3af89aec3ce01b):

    # read_datalogger_file() adds a "Table" column giving the name of the datalogger table.
    # If this ends with "_1min" or "_5min" then strip that suffix; it refers to the temporary
    # tables enabled during the TEMPEST flood events. We want these processed as normal
    # data, just on a finer time resolution
    PATTERN <- "_[0-9]+min$"  # anchored so only a trailing suffix is removed
    if(any(grepl(PATTERN, dat$Table))) {
        message("\tRemoving _{num}min from table name: ", dat$Table[1])
        dat$Table <- gsub(PATTERN, "", dat$Table)
    }
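For reference, a quick sanity check of the suffix-stripping pattern above. The table names here are hypothetical, not the real datalogger table names; this is just a sketch of the expected behavior:

```r
PATTERN <- "_[0-9]+min$"  # anchored: only a trailing _{num}min suffix is stripped

# Hypothetical table names for illustration
tables <- c("SapflowA_1min", "WaterLevel_5min", "Met")

stripped <- gsub(PATTERN, "", tables)
print(stripped)  # "SapflowA" "WaterLevel" "Met"
```

Note that without the `$` anchor, a table whose name merely *contained* `_1min` mid-string would also be rewritten, which is why anchoring matters if we only want the flood-event suffix removed.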