GEUS-Glaciology-and-Climate / pypromice

Process AWS data from L0 (raw logger) through Lx (end user)
https://pypromice.readthedocs.io
GNU General Public License v2.0

Package restructuring #106

Closed · PennyHow closed this 1 year ago

PennyHow commented 1 year ago

The pypromice package has been restructured into the following modules:

  1. pypromice.process, which contains the AWS object (pypromice.process.AWS) and associated object handling (pypromice.process.<FUNCTION>). The processing levels come under this module as sub-modules (i.e. pypromice.process.L0toL1, pypromice.process.L1toL2, pypromice.process.L2toL3)
  2. pypromice.tx, which contains all tx functions and objects, e.g. pypromice.tx.L0tx
  3. pypromice.get, which contains all data retrieval functions (as previously)
  4. pypromice.postprocess, which contains all post-processing and BUFR formatting workflows (as previously)

Additionally, look-up tables (e.g. data_urls.csv, variables.csv) are now distributed as part of the pypromice package (specified in the MANIFEST.in file). In all functions, if the user does not provide a path to these files, the copy bundled with the package is used instead. For example, bin/getL3 can be run without the variables and metadata look-up table paths, in which case it automatically falls back to the files shipped with pypromice.
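The path-or-packaged fallback described above can be sketched as follows. The `resolve_lookup` helper, and the package/filename pairing used here, are illustrative assumptions, not pypromice's actual API:

```python
from importlib import resources
from pathlib import Path


def resolve_lookup(user_path, package, filename):
    """Return the look-up table to use: the user-supplied path if given,
    otherwise the copy distributed inside the package.

    This helper is hypothetical; pypromice's real resolution logic may differ.
    """
    if user_path is not None:
        return Path(user_path)
    # resources.files() locates data files shipped with an installed package
    return resources.files(package) / filename


# A user-supplied path always wins over the packaged copy:
table = resolve_lookup("./variables.csv", "pypromice.process", "variables.csv")
```

Shipping the CSVs inside the wheel (via MANIFEST.in plus package data) is what makes the `resources.files()` branch work after a plain `pip install`.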

To use the packaged look-up tables:

$ getL3 -c aws-l0/tx/config/TUN.toml -i aws-l0/tx/ -o aws-l3/

Or, to specify look-up tables in a given directory:

$ getL3 -c aws-l0/tx/config/TUN.toml -i aws-l0/tx/ -o aws-l3/ -v ./variables.csv -m ./meta.csv
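A minimal sketch of how a CLI like getL3 could make the -v/-m flags optional: the flag letters come from the examples above, but the long option names, help strings, and argparse wiring are assumptions, not bin/getL3's actual implementation.

```python
import argparse

# Hypothetical parser mirroring the getL3 flags shown above.
parser = argparse.ArgumentParser(prog="getL3")
parser.add_argument("-c", "--config", required=True, help="station config TOML")
parser.add_argument("-i", "--inpath", required=True, help="L0 input directory")
parser.add_argument("-o", "--outpath", required=True, help="L3 output directory")
parser.add_argument("-v", "--variables", default=None,
                    help="variables look-up table (default: packaged copy)")
parser.add_argument("-m", "--metadata", default=None,
                    help="metadata look-up table (default: packaged copy)")

# First example above: no -v/-m given, so both default to None,
# signalling "use the files packaged with pypromice".
args = parser.parse_args(
    ["-c", "aws-l0/tx/config/TUN.toml", "-i", "aws-l0/tx/", "-o", "aws-l3/"]
)
```

With `None` as the sentinel, downstream code only needs one check per table to decide between the user's file and the packaged one.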