NewcombMaria opened this issue 6 years ago
@NewcombMaria Since Roger was decommissioned, we stopped the Globus transfer for most of the past month, and the latest raw SWIR data we have is from Dec 4. However, we don't currently have a pipeline for processing it into useful data, since calibration depends on the new radiometer.
For evaluation it can be run through the hyperspectral pipeline, which is well documented here: https://github.com/terraref/extractors-hyperspectral. If it can wait, I think this would be a good place for @remotesensinglab and his group to get started.
Thanks @dlebauer. We need at least a first-look data comparison as soon as possible between the data collected Dec 21, 2017 and the data reported in the Acceptance Test (Nov 10-12, 2016). It makes the most sense to use the same processing method that was used for the Nov 2016 data. We need to know right away if there's an obvious problem with the data we are collecting, in which case we'll pull the SWIR camera immediately for shipment to Headwall.
The raw and radiance data from Nov 2016 are reported in the acceptance test document: https://drive.google.com/drive/folders/0Bw-g8IAe5V-ONFpyMXJZcDlaaTg
How were these data processed? Can the same protocol be followed with the Dec 21, 2017 data in the near future? It's important to get a first-look approximate QC check because the SWIR instrument has a sub-optimal mirror and a sub-optimal internal cooling fan, and we don't know the impact on the data. If the data are poor quality, the unit needs to be sent to Headwall.
We have the data on the cache server. If we can get it transferred directly from the cache server to @solmazhajmohammadi or @remotesensinglab, would it be possible to get a quick data review or comparison to earlier data from fall 2016?
We will look at it ASAP. Thanks!
@NewcombMaria If we want to check the quality of the data ASAP, we can run an image quality index on every spectral band of the SWIR data from 2017-12-21 and compare the results with Nov 10-12, 2016. I found the data captured on Nov 10-12, 2016 in Globus. Once the SWIR data from Dec 21, 2017 are available, I can run the test and generate a comparison result.
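For illustration, the band-by-band comparison could be sketched like this. This is a minimal sketch using a variance-of-Laplacian sharpness proxy on synthetic cubes; the actual quality index used in the report may differ, and a real run would load the raw/.hdr ENVI pairs instead of the synthetic arrays.

```python
import numpy as np

def band_quality(band):
    # No-reference sharpness proxy for one spectral band: variance of a
    # discrete Laplacian. A stand-in metric, not necessarily the exact
    # quality index used for the 2016 acceptance test.
    lap = (-4.0 * band[1:-1, 1:-1]
           + band[:-2, 1:-1] + band[2:, 1:-1]
           + band[1:-1, :-2] + band[1:-1, 2:])
    return float(lap.var())

def per_band_scores(cube):
    # cube: (bands, rows, cols) array, e.g. loaded from a raw/.hdr pair.
    return np.array([band_quality(b) for b in cube])

# Synthetic demo in place of real SWIR cubes: a smooth gradient cube
# versus the same cube with added noise.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0.0, 1.0, 64), (8, 64, 1))  # (8, 64, 64)
noisy = smooth + 0.1 * rng.standard_normal(smooth.shape)
q_smooth = per_band_scores(smooth)
q_noisy = per_band_scores(noisy)
# Comparing the two score curves band by band is the idea behind the
# 2016-vs-2017 check; here the noisy cube scores higher in every band.
```

Plotting `q_smooth` and `q_noisy` against band index gives the kind of per-band quality curve discussed below.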
Thank you @Paheding and Wasit! I looked at the field scanner cache server and there are 7 data folders from Dec 21. This morning I'll download the folders and see if I can transfer them to either a Google Drive or Box folder so you can access them. The largest file in each folder is the raw file, which might take time to transfer through Google Drive. If there's a better transfer method from the cache server, let me know.
@NewcombMaria @Paheding I just started a manual transfer via Globus of the 2017-12-21 SWIR data from gantry to NCSA. It's underway and I will update once done.
Excellent! Thanks @max-zilla
Still moving - 120 GB moved so far. Looks about 90% done.
@max-zilla Sounds good, thank you.
The transfer has been completed: 216 GB in ~5 hours.
We were a bit confused by the massive transfer size here, so we took a closer look at the data. It looks like the last data stripe, 2017-12-21_12-59-48-385, went rogue and took a 204 GB image. There's still a good scan near the beginning of the runaway stripe, but I don't know how you'll isolate it.
@remotesensinglab @Paheding if there is interest in processing a >200 GB file, please sign up for a user account at xsede.org and I can give you access to the Bridges computer, which has a 1 TB RAM node that can handle it in our current pipeline.
However @smarshall-bmr, in general it will be easier if we can avoid files this large, which require special hardware like this.
@Paheding @remotesensinglab and @dlebauer there are 6 'good' data folders of normal size, and I suggest only working with those data. There's no need to process the massive data file in the folder with the latest timestamp for that day (Dec 21, 2017). It looks like the camera never received a stop command and continued taking images in place at the end of the run.
We scanned the targets and panels at two different exposures, 25 and 35. The exposure is recorded in the metadata files. Let me/us know if there are any questions. Many thanks for working on this QC task.
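For anyone scripting this, picking out the exposure setting for each dataset folder might look like the sketch below. The metadata key path (`sensor_fixed_metadata` → `exposure`) and file name `metadata.json` are assumptions for illustration; adjust them to the actual gantry metadata schema.

```python
import json
import os
import tempfile

def read_exposure(meta_path):
    # Pull the camera exposure out of a dataset's metadata file.
    # The key path here is an assumption, not a confirmed schema.
    with open(meta_path) as f:
        meta = json.load(f)
    return meta.get("sensor_fixed_metadata", {}).get("exposure")

# Demo with two hypothetical dataset folders at the two test exposures.
root = tempfile.mkdtemp()
for stamp, exp in [("2017-12-21_11-00-00", "25"),
                   ("2017-12-21_11-30-00", "35")]:
    d = os.path.join(root, stamp)
    os.makedirs(d)
    with open(os.path.join(d, "metadata.json"), "w") as f:
        json.dump({"sensor_fixed_metadata": {"exposure": exp}}, f)

exposures = {s: read_exposure(os.path.join(root, s, "metadata.json"))
             for s in sorted(os.listdir(root))}
```

The same walk over dataset folders could also check the raw file's size and skip anything anomalously large, like the runaway 204 GB stripe mentioned above.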
@NewcombMaria @dlebauer I found 5 folders that have been uploaded to Globus after Dec 4; see the images below. Shall I process the data from these 5 folders? If so, we may not need the Bridges computer.
@Paheding the SWIR data from 2017-12-21 has the 'test scan' results from when we set out the Spectralon panels and other targets. Within the 2017-12-21 folder there should be a total of 7 dataset folders with timestamps. The last folder for the day has a raw data file that is unusually large, and that dataset can be ignored because it's too large to reasonably process.
The other dates (Dec 4, 6, 8, 12, and 23) do not have the spectralon panels and don't need to be evaluated at this time for the quality control check.
@NewcombMaria I have conducted experiments to evaluate the quality of every spectral band. The evaluation score is shown in the following figure.
In this figure, the SWIR data are from 2016-11-14 and 2017-12-21. As seen in the figure, the overall image quality tends to follow a similar trend or signature; however, there are some distinct turning points when comparing the two datasets, which may be caused by various noise sources in the data. We are going to evaluate more images and analyze their characteristics.
In addition, we will evaluate SWIR quality using object-based spectral profiles, image resolution, etc., since a single factor typically does not determine the actual quality of the data.
Note that I was intending to use the data from Nov 10-12, 2016 as suggested by @NewcombMaria; however, I just found that the SWIR images are empty on those days, see here:
Thanks @Paheding. Nice work. We thought it would be good to compare dates when the Spectralon panel was in the images, which is why we suggested the Nov 2016 date used for the Acceptance Test report. @solmazhajmohammadi and/or @dlebauer, are the raw data (SWIR images of panels) stored someplace different? Can you direct Patrick to these data?
The quality index graph is interesting. The 2016-11-14 images were of sorghum (season 2) when plants were mature, a couple of weeks before harvest, which is a different target compared to the early-growth wheat that is in the field now. I don't know how scratches in the mirror would impact image quality across spectral bands. Could the region from band 200 and above be a concern, or is that area of decreased quality due to differences in targets?
@NewcombMaria We will do an in-depth analysis of the quality once we have the data from the suggested date.
@NewcombMaria BTW, for the quality index shown in the figure, a value above 10 usually indicates marginal or unacceptable image data.
@solmazhajmohammadi did you use data from Nov 10-12, 2016 for the original data check? Here we have 0-byte files.
It looks like the timestamps from your report aren't consistent with the timestamps in our raw_data folder. Did you run the scanner without writing the output to the pipeline?
from the acceptance test:
from raw_data/SWIR/2016-11-10/ in our raw_data folder:
We have analyzed the recent SWIR data associated with the minor scratches and found no obvious abnormality in the data due to the scratches, in terms of either spatial or spectral information. Here is a report of the analysis regarding this issue:
SWIR_Data_QC_Scratches.pdf
Thank you @Paheding. The summary report is excellent. We'll discuss as a group, but based on your data quality assessment it looks like we can continue collecting SWIR data this season with confidence in the quality of data. This is great news. Many thanks!
Can we add an extractor (or build it into the hyperspectral pipeline) that runs a QA/QC quality score ... 1) to find errors and 2) to quantify quality as metadata?
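A minimal sketch of the metadata payload such an extractor could attach, assuming per-band quality scores are already computed and using the ~10 "marginal" threshold mentioned earlier in the thread. The field names are illustrative (not an established TERRA-REF schema), and the hookup into Clowder / the hyperspectral pipeline is omitted.

```python
import json

def qc_metadata(scores, threshold=10.0):
    # Summarize per-band quality scores into a metadata record that an
    # extractor could attach to the dataset. Field names are illustrative.
    marginal = [i for i, s in enumerate(scores) if s > threshold]
    return {
        "quality_index_mean": sum(scores) / len(scores),
        "quality_index_max": max(scores),
        "marginal_bands": marginal,   # bands above the ~10 threshold
        "qc_pass": not marginal,      # True if no band is flagged
    }

record = qc_metadata([1.0, 3.0, 2.0])
print(json.dumps(record, indent=2))
```

Emitting this per dataset would cover both goals: flagged bands surface errors, and the summary numbers quantify quality as searchable metadata.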
Objective: compare current data coming from the SWIR to earlier data to determine potential impacts due to changes in the mirror.
There are scratches in the mirror of the SWIR hyperspectral instrument in the camera box. Based on information from Headwall, scratches in the SWIR mirror have less impact on data quality than scratches in the VNIR mirror. We are hoping that the scratches in the SWIR mirror are minor enough that they are not impacting the quality of the data. We ran a test scan yesterday (2017-12-21, winter solstice) at two different exposures (25 and 35) to compare the results to earlier data without scratches in the mirror. Stuart, John and I completed the test scan together, and questions can be directed to all of us (@smarshall-bmr and @jtheun).
For the test scan we placed the targets that we have available in the field of view. The best comparison data from an earlier date with sun exposure is probably from the November 2016 acceptance test reports; see the 3 documents in the Google Drive directory https://drive.google.com/drive/folders/0Bw-g8IAe5V-ONFpyMXJZcDlaaTg
@solmazhajmohammadi and/or @remotesensinglab and/or @dlebauer, do you know who can process the SWIR test scan results from 2017-12-21 and compare to earlier data?