Closed: jpolchlo closed this issue 2 years ago.
Unmixing is the process of recovering the substance composition of sampled pixels. Unmixing algorithms depend on a model of mixing, which comes in two broad forms: linear and nonlinear.
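As a concrete illustration of the linear form, each pixel is treated as a weighted combination of endmember spectra plus noise. The sketch below uses synthetic data and plain least squares; all names are illustrative, and a practical unmixer would use a constrained solver (e.g. NNLS or FCLS) to enforce nonnegative, sum-to-one abundances.

```python
import numpy as np

# Linear mixing model: each pixel y is a combination of endmember
# spectra E (columns) weighted by abundances a, plus noise: y = E @ a + n.
# This sketch recovers abundances with unconstrained least squares;
# real unmixing would constrain a >= 0 and sum(a) == 1.
rng = np.random.default_rng(42)

bands, n_endmembers = 50, 3
E = rng.random((bands, n_endmembers))        # columns are synthetic endmember spectra

true_abundances = np.array([0.6, 0.3, 0.1])  # sums to one
pixel = E @ true_abundances                  # noiseless mixed pixel

# Unconstrained least-squares estimate of the abundances.
est, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(np.round(est, 3))
```

With noiseless data and endmembers of full column rank, the least-squares estimate recovers the abundances exactly; noise and coherent endmembers are what make the constrained formulations necessary.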
Survey papers:
For any unmixing operation, we will have a set of spectral measurements that characterize the substances of interest in the imagery. We can use a spectral library as the foundation, or we can infer endmembers from the imagery itself. The latter problem is complicated by the fact that it may be difficult to identify "pure pixels" that capture a single material.
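One simple screen for candidate pure pixels is the spectral angle between an image pixel and each reference spectrum: a pixel that is (a scaled copy of) a single material has a near-zero angle to that material's endmember. This is only a hedged sketch on synthetic data; names like `library` and `pure_pixel` are illustrative, not from any particular toolkit.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors.

    Invariant to scaling, so illumination differences do not matter.
    """
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

rng = np.random.default_rng(0)
library = rng.random((4, 60))            # 4 synthetic endmembers, 60 bands

# A pixel that is a scaled copy of a pure material, and a 50/50 mixture.
pure_pixel = 0.8 * library[2]
mixed_pixel = 0.5 * library[0] + 0.5 * library[1]

angles_pure = [spectral_angle(pure_pixel, e) for e in library]
angles_mixed = [spectral_angle(mixed_pixel, e) for e in library]

print(int(np.argmin(angles_pure)))       # best match is endmember 2
print(min(angles_pure), min(angles_mixed))
```

The mixed pixel has a nonzero best angle to every library member, which is exactly why mixed scenes may contain no usable pure pixels.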
We will likely stick to using reference endmembers provided in a spectral library, but laboratory-derived spectra have their own set of problems. From Quintano et al. (ref in above comment):
> The disadvantages for using these spectra are that image correction is not trivial and errors are always introduced. In addition, spectral libraries are created in laboratorial [sic] conditions over controlled situation such as air-dried, grinded samples, artificial irradiance and minimal effects due to atmosphere.
Dimension reduction
In contrast to data compression, dimension reduction is the process of identifying the input bands with the greatest power of discrimination for a given set of endmembers. This process is likely to be affected by the composition of the endmember set. If we consider a consistent sampling of a spectrum at known wavelengths as a vector, then a typical spectral library will exhibit small angles (cosines near one) between some pairs of its members. (This characteristic, called coherence or mutual coherence, ill-conditions the unmixing problem.) Dimension reduction might include an orthogonalization process for these basis vectors.
Some known methods:
* PCA
* Maximum noise fraction
  * Based on linear mixing model
* Optical real-time adaptive spectral identification system (ORASIS)
  * Somewhat heuristic
  * Streams full pixels
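The mutual coherence mentioned above is easy to compute directly: normalize each library spectrum and take the largest absolute cosine between any two distinct members. A minimal sketch, with a synthetic library containing two nearly parallel spectra:

```python
import numpy as np

def mutual_coherence(library):
    """library: (n_spectra, n_bands) array; returns the max off-diagonal
    absolute cosine between normalized spectra. Values near 1 mean nearly
    parallel endmembers and an ill-conditioned unmixing problem."""
    unit = library / np.linalg.norm(library, axis=1, keepdims=True)
    gram = unit @ unit.T                 # pairwise cosines
    np.fill_diagonal(gram, 0.0)          # ignore self-similarity
    return np.abs(gram).max()

# Two nearly identical spectra drive the coherence toward 1.
base = np.linspace(0.1, 0.9, 40)
library = np.stack([base, base + 0.01, np.cos(np.linspace(0.0, 3.0, 40))])
print(mutual_coherence(library))
```

An orthonormal set has coherence 0, which is the sense in which orthogonalizing the basis vectors improves conditioning.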
It never hurts to include Isomap in any dimensionality-reduction conversation.
I also remember seeing a lot of material in and around compressed sensing that might provide nice background for generating new ideas.
This epic deals with the elements of processing hyperspectral imagery. It mostly concerns unmixing, which is, roughly, the conversion from a raw signal consisting of bands of measured reflectances to planes of features. Presently, a feature will be interpreted as the abundance of a material, but it could be some other derived quantity. The materials will have characteristic measured spectra, commonly called endmembers in the literature.
This epic will also cover methods that deal with feature extraction based on narrow band ranges for identifying known materials, such as hydrocarbons. These methods will likely use continuum removal and a goodness-of-fit test, such as MICA.
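Continuum removal can be sketched compactly: divide a reflectance spectrum by its upper convex hull so that absorption features appear as dips below 1.0, which a goodness-of-fit test can then score. This is a minimal illustration (monotone-chain upper hull plus linear interpolation) on a synthetic spectrum, not the MICA implementation.

```python
import numpy as np

def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (the continuum)."""
    pts = list(zip(wavelengths, reflectance))
    hull = []
    for p in pts:  # build the upper hull left to right (monotone chain)
        while len(hull) >= 2:
            (ox, oy), (ax, ay) = hull[-2], hull[-1]
            # Pop while the turn toward p is counterclockwise or collinear.
            if (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wavelengths, hx, hy)
    return reflectance / continuum

wl = np.linspace(400.0, 2500.0, 200)
# Flat-ish synthetic spectrum with a Gaussian absorption feature near 2200 nm.
spectrum = 0.8 - 0.3 * np.exp(-((wl - 2200.0) ** 2) / (2 * 50.0 ** 2))
cr = continuum_removed(wl, spectrum)
print(cr[0], cr[-1], cr.min())
```

The removed spectrum equals 1.0 wherever the hull touches the data (including both endpoints), and the depth of the dip below 1.0 is the usual band-depth feature for material identification.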
Issues under this epic will summarize an academic paper or papers and either describe an actionable path resulting from the research or conclude that the line of inquiry is not productive for our interests.
Comments on this issue will summarize the various topics that should be investigated.