In the code, outbands is silently selected as the minimum of the value the user provides and the actual number of band tags in the data. This isn’t made clear to the user. It also encourages passing an arbitrarily high number (e.g. 10) as a shortcut to select all bands, which is ambiguous: if a config file says 10, are there really 10 bands in the data, or is it just a catch-all? It’s hard to tell from looking at the config alone.
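A minimal sketch of the clamping behaviour described above (variable names are assumptions, not the actual code):

```python
# Hypothetical illustration of the current selection logic.
n_band_tags = 4        # actual number of band tags in the data
outbands_config = 10   # deliberately high value from a config file

# The config value is silently clamped to the number of available bands:
outbands = min(outbands_config, n_band_tags)
print(outbands)  # 4 -- the user never learns their request was reduced
```

Because the clamp is silent, a config value of 10 and a config value of 4 produce identical results on this data, which is exactly the ambiguity described above.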
The outbands parameter is used as the stop for index slicing, e.g. bands[0:outbands]. Is this intentional? It means that if you want the band at position [4] you can’t have it without also getting bands 0, 1, 2 and 3. I understand that band [0] is the prediction value, so it makes sense to always include it, but what about the others?
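To make the prefix-only limitation concrete, here is a sketch (the band names are invented for illustration):

```python
# Hypothetical band layout for a regression output.
bands = ["prediction", "variance", "q5", "q50", "q95"]

# outbands acts as a slice stop, so selection is always a prefix:
outbands = 3
selected = bands[0:outbands]
print(selected)  # ['prediction', 'variance', 'q5']

# There is no way to request only bands [0] and [4]; to reach
# position [4] you must also take everything before it:
outbands = 5
print(bands[0:outbands])  # ['prediction', 'variance', 'q5', 'q50', 'q95']
```

An explicit list of band indices (or names) in the config would remove this restriction, at the cost of a slightly more verbose config format.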
The example in the cookbook gives a range of 1:5 for outbands, which corresponds to the regression outputs (mean, variance, etc.), but what happens if you’re using a classifier with more than 6 classes?
The method for selecting outbands needs to be revisited and made more explicit, or at the very least clearly documented.
How is the outbands parameter applied to classifiers?