terraref / extractors-hyperspectral

Scripts and code relevant to the SWIR and VNIR cameras.
BSD 3-Clause "New" or "Revised" License

Add the environmental logger finder #16

Closed FlyingWithJerome closed 7 years ago

FlyingWithJerome commented 7 years ago

This finder will take a capture timestamp and the path to the EnvironmentLogger directory.

It should be called in the following pattern:

python hyperspectral_flux_based_calibration.py "09/22/2016 13:53:44" /projects/arpae/terraref/sites/ua-mac/raw_data/EnvironmentLogger # This is for ROGER production

The two environmental logger records nearest in time will be printed to stdout.

More details are in the docstring of the script
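The nearest-two lookup described above can be sketched with a binary search over sorted record timestamps. This is an illustrative sketch, not the script's actual code; the function name and data are made up for the example.

```python
import bisect
from datetime import datetime

def nearest_two(target, timestamps):
    """Return the pair of record timestamps bracketing ``target``.

    ``timestamps`` must be sorted ascending and have at least two
    entries. If ``target`` falls before the first or after the last
    record, the pair at that end of the list is returned.
    """
    i = bisect.bisect_left(timestamps, target)
    if i <= 0:
        return timestamps[0], timestamps[1]
    if i >= len(timestamps):
        return timestamps[-2], timestamps[-1]
    return timestamps[i - 1], timestamps[i]

# Hourly records, as in the September EL files.
records = [datetime(2016, 9, 22, h) for h in (12, 13, 14, 15)]
target = datetime(2016, 9, 22, 13, 53, 44)
lo, hi = nearest_two(target, records)
```

A real finder would parse the timestamps out of the EL filenames or record headers first; the search itself is O(log n) per query.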

FlyingWithJerome commented 7 years ago

@dlebauer @czender

I checked the environmental logger files from April, and it seems we have a bit of a mess there: some files end with "enviromentlogger.json" and others end with "environmentlogger.json". I believe this is due to a typo, and I hope someone with write access to the /project folder can clean them up.

Another issue is that the earlier loggers (for example, April) were collected once every two minutes, while the later loggers (for example, September) were collected hourly.

czender commented 7 years ago

@FlyingWithJerome EL files are aggregations of individual sample times. It does not matter that the length of the aggregation has changed from 2 minutes to one hour. It's the same amount of data, right? As for the typo, yes, but please create a separate issue for that so @dlebauer or @max-zilla can rename the affected files. That has nothing to do with this pull request.

FlyingWithJerome commented 7 years ago

@czender Done. But please do not merge right now. I only meant to rename the file, but I accidentally committed and pushed my source-code changes as well. Right now this file is not executable because I'm updating the algorithm; it will be back soon.

FlyingWithJerome commented 7 years ago

@czender It can now search for the nearest two records among the EL files, extract the two nearest flx_spc_dwn arrays, take their weighted average, and write it into an output netCDF file.

EL runs 24/7 but HS only runs 16 hours a day, so we do not need to handle records that cross days, but I still need to handle the edge case where the two nearest records are the last record of file_1 and the first record of file_2. Other than that, the script works normally in most cases on both my home machine and ROGER.
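The time-weighted average of the two bracketing flx_spc_dwn arrays could look like the sketch below; the function name, timestamp units, and toy spectra are assumptions for illustration, not the script's actual implementation.

```python
import numpy as np

def time_weighted_average(t, t1, t2, spectrum1, spectrum2):
    """Linearly interpolate between two downwelling spectra.

    t, t1, t2 are timestamps in seconds with t1 <= t <= t2; the
    record closer in time to t receives the larger weight.
    """
    if t2 == t1:
        # Identical timestamps: nothing to interpolate.
        return np.asarray(spectrum1, dtype=float)
    w = (t2 - t) / (t2 - t1)  # weight for the earlier record
    return (w * np.asarray(spectrum1, dtype=float)
            + (1.0 - w) * np.asarray(spectrum2, dtype=float))

# Toy example: the target time is 3/4 of the way from t1 to t2,
# so the later spectrum gets weight 0.75.
avg = time_weighted_average(75.0, 0.0, 100.0, [4.0, 8.0], [8.0, 16.0])
```

The same function covers the cross-file edge case mentioned above, as long as the caller passes in the last record of one file and the first record of the next.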

The output netCDF file will only have one dimension and two variables:

dimension: wavelength
variables: weighted_average_downwelling_irradiance, wavelength

FlyingWithJerome commented 7 years ago

@czender I have added the test on the output reflectance graph in hyperspectral_test.py, as we discussed yesterday. The new command line for hyperspectral_test.py is

python hyperspectral_test.py <input_netCDF_file> <verbosity_level> <maximum_saturated_exposure>

Now the test script takes a new argument, "maximum_saturated_exposure," but you don't need to change workflow.sh right now because maximum_saturated_exposure has a default value of 0 and is only overridden when a value is supplied. This test is marked as an expected failure until we finish the new calibration workflow and confirm it works as intended.
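The "optional positional with a default" behavior described above can be expressed with argparse; this is a sketch based on the command line quoted earlier, and the exact argument handling in hyperspectral_test.py may differ.

```python
import argparse

def parse_args(argv):
    """Parse the test script's arguments. maximum_saturated_exposure
    is optional and defaults to 0, so existing callers such as
    workflow.sh keep working unchanged."""
    parser = argparse.ArgumentParser()
    parser.add_argument("input_netCDF_file")
    parser.add_argument("verbosity_level", type=int)
    parser.add_argument("maximum_saturated_exposure", nargs="?",
                        type=float, default=0.0)
    return parser.parse_args(argv)

old_style = parse_args(["test.nc", "1"])          # old two-argument call
new_style = parse_args(["test.nc", "1", "3.5"])   # explicit override
```

With nargs="?", the third positional is simply omitted by old callers, and argparse fills in the default.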

max-zilla commented 7 years ago

issue created for correcting the typo: https://github.com/terraref/computing-pipeline/issues/287