EnSpec / hytools

Hyperspectral image processing library
GNU General Public License v3.0

Docs for BDRF correction #3

Open mjb-oz opened 3 years ago

mjb-oz commented 3 years ago

Hi all,

Congrats and thanks for building such a fantastic set of open source tools. It's awesome to see this stuff being made free to all without an ENVI license.

I see on https://hytools.readthedocs.io/en/latest/algorithms.html that the docs for the BRDF and other corrections are still in the works. I'd love to be able to test the BRDF code and see if it works for us. Do you have a script that works for you that I could use as the basis for my testing? I can see from https://github.com/EnSpec/hytools/blob/master/examples/hytools_basics_notebook.ipynb how to apply the coefficients, but it'd be great to see how to properly create the coefficients...

Sorry if this isn't the best place to ask, I didn't see any obvious contact info listed on GitHub or readthedocs.

Cheers,

adamchlus commented 3 years ago

Hey Mike,

Thanks for your interest in HyTools. In the scripts folder there is a script, image_correct.py, which calculates the correction coefficients for an image or group of images and optionally applies the corrections. Within that folder there is a configs folder containing a script, image_correct_json_generate.py, for generating the config file that is input to image_correct.py.

Sorry about the lack of info on the readthedocs site, we hope to have detailed instructions up there soon. In the meantime, this slide deck provides an overview of the HyTools functionality, including the command-line scripts:

HyTools slideshow

If you have any questions or come across any issues let me know.

-Adam

mjb-oz commented 3 years ago

Hi Adam,

Thanks for responding. No dramas about the docs slightly lacking - you guys have already done such great work writing and sharing this code. It's hard to do everything amazingly - there's just not enough time in the day!

Thanks for the slide deck. The BRDF correction slides simply reinforce why I'm so interested in your work. They look really good.

I've found those Python files, and had figured they were the key. The trick is setting up the JSON properly, I think. To that end, I've got a couple of questions just to check that I'm on the right track:

Assuming the following data files exist:

where HSI_flightline_00x_spectraldata is a typical ENVI file, and where HSI_flightline_00x_anc_data is a coaligned raster with 10 bands corresponding to the aviris_anc_names.

If so, then the config dict would look like:

config_dict['anc_files'] = {'HSI_flightline_001_spectraldata.img' : {
                            'path_length':['HSI_flightline_001_anc_data.img', 1],
                            'sensor_az':['HSI_flightline_001_anc_data.img', 2],
                            'sensor_zn':['HSI_flightline_001_anc_data.img', 3],
                            'solar_az':['HSI_flightline_001_anc_data.img', 4],
                            'solar_zn':['HSI_flightline_001_anc_data.img', 5],
                            'phase':['HSI_flightline_001_anc_data.img', 6],
                            'slope':['HSI_flightline_001_anc_data.img', 7],
                            'aspect':['HSI_flightline_001_anc_data.img', 8],
                            'cosine_i':['HSI_flightline_001_anc_data.img', 9],
                            'utc_time':['HSI_flightline_001_anc_data.img', 10]
                            },
                'HSI_flightline_002_spectraldata.img' : {
                            'path_length':['HSI_flightline_002_anc_data.img', 1],
                            'sensor_az':['HSI_flightline_002_anc_data.img', 2],
                            'sensor_zn':['HSI_flightline_002_anc_data.img', 3],
                            'solar_az':['HSI_flightline_002_anc_data.img', 4],
                            'solar_zn':['HSI_flightline_002_anc_data.img', 5],
                            'phase':['HSI_flightline_002_anc_data.img', 6],
                            'slope':['HSI_flightline_002_anc_data.img', 7],
                            'aspect':['HSI_flightline_002_anc_data.img', 8],
                            'cosine_i':['HSI_flightline_002_anc_data.img', 9],
                            'utc_time':['HSI_flightline_002_anc_data.img', 10]
                            }
                    }
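Since each flightline repeats the same band order, a mapping like the one above could also be generated in a loop. This is just a convenience sketch (the file names and the helper `build_anc_files` are hypothetical, and it keeps the 1-based band numbering from my example above):

```python
# Hypothetical helper: build the anc_files mapping for several flightlines
# so the band order only has to be written once. File names follow the
# example above and band numbering starts at 1, matching my listing.
ANC_BANDS = ['path_length', 'sensor_az', 'sensor_zn', 'solar_az',
             'solar_zn', 'phase', 'slope', 'aspect', 'cosine_i', 'utc_time']

def build_anc_files(line_ids):
    anc_files = {}
    for n in line_ids:
        spectral = 'HSI_flightline_%03d_spectraldata.img' % n
        anc = 'HSI_flightline_%03d_anc_data.img' % n
        anc_files[spectral] = {name: [anc, band]
                               for band, name in enumerate(ANC_BANDS, start=1)}
    return anc_files

config_dict = {}
config_dict['anc_files'] = build_anc_files([1, 2])
```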

Assuming this all looks good, then the questions I have are really about details (and they could possibly be answered with a .hdr file from your ancillary data raster):

Sorry for all the details - I'm sure you've written this to work with the ancillary data coming straight out of your hyperspectral sensor's processing workflow so it might be a level of detail that you take for granted.

Cheers,

mjb-oz commented 3 years ago

Quick follow up question:

I'm assuming sensor azimuth is the direction the sensor was pointing (ie along track direction). What does sensor zenith correspond to? Is this the nadir angle (ie the angle from the point on the ground to the sensor), or is it more the zenith of the sensor relative to vertically down? If it is the latter, is this simply pitch (ie along track angle), or is it the combination of pitch and roll into a single angle?

adamchlus commented 3 years ago

Hey Mike,

Indices in the config are 0-based, so the first ancillary band is index 0.

The example ancillary dataset structure is based off of NASA's AVIRIS data products, which is one of the main data products we work with. Here is more detailed info about each of the bands:

1) path length (sensor-to-ground distance in meters)
2) to-sensor azimuth (0 to 360 degrees clockwise from N)
3) to-sensor zenith (0 to 90 degrees from zenith)
4) to-sun azimuth (0 to 360 degrees clockwise from N)
5) to-sun zenith (0 to 90 degrees from zenith)
6) solar phase (degrees between to-sensor and to-sun vectors in principal plane)
7) slope (local surface slope as derived from DEM, in degrees)
8) aspect (local surface aspect, 0 to 360 degrees clockwise from N)
9) cosine i (apparent local illumination factor based on DEM slope and aspect and to-sun vector, -1 to 1)
10) UTC time (decimal hours for mid-line pixels)

https://avirisng.jpl.nasa.gov/dataportal/ANG_L1B_L2_Data_Product_Readme_v02.txt
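Putting that together with the 0-based indexing, a single anc_files entry for this band order might look like the sketch below (the file name is a placeholder):

```python
# Sketch: 0-based band indices for an AVIRIS-style ancillary stack,
# in the band order listed above. File name is a placeholder.
names = ['path_length', 'sensor_az', 'sensor_zn', 'solar_az', 'solar_zn',
         'phase', 'slope', 'aspect', 'cosine_i', 'utc_time']
anc_entry = {name: ['HSI_flightline_001_anc_data.img', i]
             for i, name in enumerate(names)}
```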

For the BRDF and topographic corrections only the following are needed:

Everything else is calculated internally.
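For instance, cosine i can be derived from slope, aspect, and the solar angles with the standard local-illumination formula cos(i) = cos(θs)·cos(slope) + sin(θs)·sin(slope)·cos(φs − aspect). A minimal sketch of that derivation (this is the textbook formula, not necessarily HyTools' internal code):

```python
import math

def cosine_i(solar_zn, solar_az, slope, aspect):
    """Apparent local illumination factor from DEM slope/aspect and the
    to-sun angles. All angles in degrees; returns a value in [-1, 1].
    Standard formula, not necessarily HyTools' internal implementation."""
    sz, sa = math.radians(solar_zn), math.radians(solar_az)
    sl, asp = math.radians(slope), math.radians(aspect)
    return (math.cos(sz) * math.cos(sl)
            + math.sin(sz) * math.sin(sl) * math.cos(sa - asp))
```

On flat ground (slope = 0) this reduces to cos(solar zenith), and a slope tilted directly toward the sun gives 1.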

Sensor azimuth is the direction each detector element is pointing, clockwise from north; in the cross-track direction it typically varies about +/-90 degrees from the flight direction. Sensor zenith is the per-pixel angle in degrees off nadir.
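With those conventions (azimuth clockwise from north, zenith measured from vertical), the to-sensor and to-sun directions can be turned into unit vectors, e.g. to sanity-check a phase angle. A sketch under those stated conventions (function names are illustrative; note this gives the full 3D angle between the vectors, whereas the AVIRIS phase band is defined in the principal plane):

```python
import math

def unit_vector(az_deg, zn_deg):
    """Unit vector (east, north, up) for an azimuth measured clockwise
    from north and a zenith angle measured from vertical, in degrees."""
    az, zn = math.radians(az_deg), math.radians(zn_deg)
    return (math.sin(zn) * math.sin(az),   # east component
            math.sin(zn) * math.cos(az),   # north component
            math.cos(zn))                  # up component

def vector_angle(sensor_az, sensor_zn, solar_az, solar_zn):
    """Angle in degrees between the to-sensor and to-sun unit vectors."""
    a = unit_vector(sensor_az, sensor_zn)
    b = unit_vector(solar_az, solar_zn)
    dot = sum(x * y for x, y in zip(a, b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
```

For example, a nadir-looking pixel with the sun at 30 degrees zenith gives a 30 degree separation regardless of azimuths.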