micasense / imageprocessing

MicaSense RedEdge and Altum image processing tutorials
https://www.micasense.com
MIT License

Calculating Burned Area Index results in one-color figure #149

Closed: John2397 closed this issue 3 years ago

John2397 commented 3 years ago

Hello @poynting

I'm following the MicaSense tutorial (Plant Classification) on how to plot NDVI.

Then I tried adding the following code to also calculate the Burned Area Index (BAI):

red_square = (0.1 - im_aligned[:,:,red_band]) * (0.1 - im_aligned[:,:,red_band])
nir_square = (0.06 - im_aligned[:,:,nir_band]) * (0.06 - im_aligned[:,:,nir_band])
burned_area_index = 1 / (red_square + nir_square)

fig, axis = plotutils.plot_overlay_withcolorbar(gamma_corr_rgb, 
                                    burned_area_index, 
                                    figsize = figsize, 
                                    title = 'BAI',
                                    vmin = min_display_ndvi,
                                    vmax = max_display_ndvi)

This is the output figure:

(Screenshot from 2021-05-05 19-48-21: the BAI overlay renders in a single color.)

I know this is not a bug or some installation issue but I would appreciate it if you could provide some guidance that would help me to correctly plot the BAI. For example, should the minimum and maximum values for BAI be different from NDVI? Or why are the values of burned_area_index all so high? I know that there is no fire on the given sample photos but does that justify the one-color output?

Thank you.

poynting commented 3 years ago

Can you share the data? Ideally that would include at least one or two full captures from the air and a panel capture if you have it. If you don't want to share in a public forum, please email to support at micasense and reference this ticket.

poynting commented 3 years ago

OH! I'm not super familiar with BAI, but the reason the chart is all yellow is that the display min and max are being set using the min_display_ndvi and max_display_ndvi variables from the NDVI display. BAI has a much different index range than NDVI.

Try this:

red_square = (0.1 - im_aligned[:,:,red_band]) * (0.1 - im_aligned[:,:,red_band])
nir_square = (0.06 - im_aligned[:,:,nir_band]) * (0.06 - im_aligned[:,:,nir_band])
burned_area_index = 1 / (red_square + nir_square)

bai_hist_min = np.percentile(burned_area_index,0)  #set the minimum value of the histogram x-axis
bai_hist_max = np.percentile(burned_area_index,99) #set the maximum value of the histogram x-axis
fig, axis = plt.subplots(1, 1, figsize=(10,4))
axis.hist(burned_area_index.ravel(), bins=512, range=(bai_hist_min, bai_hist_max))
plt.title("BAI Histogram")
plt.show()

min_display_bai = np.percentile(burned_area_index, 5)  #set the minimum value of the BAI colorization (purple)
max_display_bai = np.percentile(burned_area_index, 95) #set the maximum value of the BAI colorization (yellow)

fig, axis = plotutils.plot_overlay_withcolorbar(gamma_corr_rgb, 
                                    burned_area_index, 
                                    figsize = figsize, 
                                    title = 'BAI',
                                    vmin = min_display_bai,
                                    vmax = max_display_bai)

(Images: BAI histogram and the BAI overlay with percentile-based display limits.)

Since BAI is higher in value for pixels that are dark in red and NIR, we see that in this image it mostly highlights the shadows which are dark in both bands.

John2397 commented 3 years ago

Thank you for the response I appreciate it.

As far as I know, BAI requires reflectance values, which are ratios, and this is why I thought the range should be between 0 and 1. On the other hand, according to this answer, NDVI doesn't require the conversion from digital numbers to reflectance.

I'm a little confused to be honest. Are the values in im_aligned in reflectance? I can see in the beginning of the tutorial that the reflectance images are plotted but they are not saved somewhere.

So if the values in im_aligned are not calibrated to reflectance, how do we get a correct output for BAI with uncalibrated photos? Should I align the 5 reflectance photos ("flightReflectanceImage") instead of the raw ones? (I believe the tutorial aligns the raw tifs rather than the reflectance photos, right?)

Thank you again.

poynting commented 3 years ago

Yes, because panels are provided in the first cell, the resulting im_aligned are floating point reflectance (1.0=100%). If a panel isn't provided, DLS data will be used if it's present, which will also result in reflectance images. Without either, the code falls back to radiance, which would not be useful for BAI.
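Since reflectance is a unitless ratio, one quick way to sanity-check an aligned stack is to look at its value range (a sketch with a synthetic stand-in array; the real im_aligned comes from the tutorial's alignment step after panel or DLS calibration):

```python
import numpy as np

# Sketch: sanity-check that an aligned stack plausibly holds reflectance.
# Reflectance values should sit roughly in [0, 1] (specular glints can push
# slightly above 1.0); radiance is typically on a very different scale.
# `im_aligned` here is a synthetic stand-in, not real capture data.
rng = np.random.default_rng(0)
im_aligned = rng.uniform(0.0, 0.6, size=(8, 8, 5))

def looks_like_reflectance(stack, tol=1.5):
    """Heuristic: all values finite and within a plausible reflectance range."""
    return bool(np.isfinite(stack).all()
                and stack.min() >= 0.0
                and stack.max() <= tol)

print(looks_like_reflectance(im_aligned))  # True for this synthetic stack
```

This is only a heuristic, but it catches the common case of accidentally feeding radiance or raw DN data into a reflectance-based index.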

poynting commented 3 years ago

Two other things to address in the previous question:

While NDVI can in some cases be taken as a simple ratio of digital numbers, that only works in a consistent and repeatable way if the digital numbers in both bands represent the same radiance scale with the same irradiance in each band (which effectively means DN is directly proportional to reflectance). That's rarely the case, so we need to translate DN to reflectance in order to get repeatable data.

This can be seen through a quick thought experiment. Say we have a camera that returns DN directly proportional to radiance, but the sensor has a different sensitivity in each of the Red and NIR bands (as most sensors do). If each band is also adjusted by gain or exposure to use the whole usable DN range, we will find that an image from each band can have similar DN ranges, or the red DNs could even be higher than the NIR DNs (due to sensitivity/lighting/reflectance differences in the scene). Computing a normalized vegetation index from those DNs will be consistent only when the collection experiences the same irradiance, and then only consistent with itself. It won't be comparable to, say, a satellite NDVI or a different camera without some further calibration. Generally for VIs we always want to work in reflectance space.
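The thought experiment above can be put into numbers (a sketch; the gains and reflectances are made up for illustration, not from any real sensor):

```python
# Sketch of the thought experiment: the same scene reflectances, imaged by
# two bands whose DN gains differ (as real sensors' sensitivities do).
# All numbers here are illustrative.
red_refl, nir_refl = 0.05, 0.40          # true reflectances of healthy vegetation

gain_red, gain_nir = 4000.0, 1500.0      # hypothetical per-band DN-per-reflectance gains
red_dn = gain_red * red_refl             # 200.0
nir_dn = gain_nir * nir_refl             # 600.0

ndvi_from_dn = (nir_dn - red_dn) / (nir_dn + red_dn)
ndvi_from_refl = (nir_refl - red_refl) / (nir_refl + red_refl)

print(round(ndvi_from_dn, 3), round(ndvi_from_refl, 3))  # 0.5 0.778
```

The DN-based NDVI (0.5) and the reflectance-based NDVI (~0.78) disagree for the very same scene, which is why the translation to reflectance matters.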

As to BAI. The BAI is not a normalized index, so it doesn't maintain a range of -1 to 1 like other normalized indices. Instead, it is a geometric distance index that gets arbitrarily big as the values of the two reflectances approach the 'setpoints', which in BAI are 0.1 for red (10% reflectance) and 0.06 for nir (6% reflectance). In fact it gets so big (due to the denominator approaching zero) that if both values actually match the setpoints the code will likely return +Inf or NaN, and this could result in 'holes' in the output image. Here's a quick snippet that shows the response of BAI for varying values of NIR and red. I've arbitrarily capped the values at 100 just for display; in reality of course the 'peak' is infinitely high.

%matplotlib notebook

import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib.ticker import LinearLocator

figsize = (10, 8)  # defined earlier in the tutorial notebook

step = 0.01
xlim = 0.6   # reflectance range to sweep in each band
ylim = xlim
zlim = 100   # arbitrary display cap; the true peak is infinite

red_test = np.arange(0, xlim, step)
nir_test = np.arange(0, ylim, step)
R, N = np.meshgrid(red_test, nir_test)

red_test_square = (0.1 - R) * (0.1 - R)
nir_test_square = (0.06 - N) * (0.06 - N)
burned_area_index_test = 1 / (red_test_square + nir_test_square)

# Cap the surface at zlim purely for display
burned_area_index_test[burned_area_index_test > zlim] = zlim

fig, ax = plt.subplots(subplot_kw={"projection": "3d"},
                       figsize=figsize)

surf = ax.plot_surface(R, N, burned_area_index_test,
                       cmap=cm.coolwarm,
                       vmin=0,
                       vmax=100,
                       linewidth=0,
                       antialiased=False,
                       rcount=100,
                       ccount=100,
                       shade=True)

# Customize the axes.
ax.set_zlim(0, 100)
ax.set_xlim(0, xlim)
ax.set_ylim(0, ylim)
ax.set_xlabel('Red Reflectance')
ax.set_ylabel('NIR Reflectance')
ax.set_zlabel('BAI')
ax.zaxis.set_major_locator(LinearLocator(10))

# Add a color bar which maps values to colors.
fig.colorbar(surf, shrink=0.5, aspect=5)

plt.show()

(Image: 3D surface of BAI over red and NIR reflectance, capped at 100 for display.)
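To avoid the +Inf/NaN 'holes' mentioned above when a pixel lands exactly on the setpoints, one option is to mask non-finite values before display (a sketch, not part of the tutorial code):

```python
import numpy as np

# Sketch: guard against division by zero when a pixel hits the BAI
# setpoints exactly (red = 0.1, NIR = 0.06). np.errstate silences the
# divide-by-zero warning; masking the resulting inf values lets
# downstream percentile/display code ignore them.
red = np.array([0.10, 0.30, 0.05])
nir = np.array([0.06, 0.50, 0.02])

with np.errstate(divide="ignore"):
    bai = 1.0 / ((0.1 - red) ** 2 + (0.06 - nir) ** 2)

bai_masked = np.ma.masked_invalid(bai)  # first pixel (exact setpoints) is masked
print(bai_masked.mask)  # [ True False False]
```

np.percentile and the plotting helpers then operate only on the finite pixels, so a single saturated pixel can't blow out the color scale.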

The neat thing about geometric distance indices is

  1. They can easily be extended to other bands
  2. They can be used to classify all sorts of things where you know the thing you're looking for is close to the setpoints in each band. For example, change the setpoints to Red = 0.05 and NIR = 0.40, and the index will highlight mostly the healthy vegetation canopies, as shown in the "badge of trees" in the Tasseled Cap plot at the bottom of the analysis notebook.
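Both points above can be sketched as one generalized helper (geometric_distance_index is a hypothetical function for illustration, not part of the micasense library):

```python
import numpy as np

# Sketch of a generalized geometric-distance index: the inverse squared
# distance from each pixel's reflectance vector to a set of per-band
# setpoints. With setpoints (red=0.1, nir=0.06) this reduces to BAI;
# with (red=0.05, nir=0.40) it highlights healthy canopy instead.
def geometric_distance_index(bands, setpoints):
    """bands: dict name -> reflectance array; setpoints: dict name -> target value."""
    dist_sq = sum((setpoints[name] - bands[name]) ** 2 for name in setpoints)
    with np.errstate(divide="ignore"):
        return 1.0 / dist_sq

# Two example pixels: a canopy-like one and a dark one.
red = np.array([[0.06, 0.30]])
nir = np.array([[0.38, 0.10]])
bands = {"red": red, "nir": nir}

bai = geometric_distance_index(bands, {"red": 0.1, "nir": 0.06})
veg = geometric_distance_index(bands, {"red": 0.05, "nir": 0.40})
# The vegetation-tuned setpoints make the index peak on the canopy-like pixel,
# while the BAI setpoints favor the pixel dark in both bands.
```

The same helper extends to any number of bands: just add more entries to the setpoints dict.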

John2397 commented 3 years ago

Thank you a lot for the time and effort you put into helping me. Both of your answers cleared up much of the confusion I had. I really appreciate it!

So reflectance values are always required when comparing data taken at different times/days or from different sensors, in order to remove the effect of the lighting and sensor conditions.

As for the BAI, it seems like for some reason I thought the range was 0 to 1. I wasn't aware of the true range indeed (and the fact that it is not a normalized index).

I hope I can get some accurate results with BAI, although people tend to prefer NBR when it comes to detecting burned areas, as it provides better results. Too bad MicaSense doesn't provide the SWIR band needed for the NBR index! Maybe considering both BAI and thermal bands could provide more accurate results.

Anyway, thank you again!

poynting commented 3 years ago

Sure, BAI is a particular one that I wasn't familiar with, so it was a good learning experience. I'd love to see how this works out, and if you have data that you can share, shoot me an email via support at micasense.

I also agree SWIR would be great. Unfortunately the detectors haven't found a mass-market application, so the most 'affordable' SWIR cameras tend to be more expensive than a whole Altum!