This arose from a discussion with a valued colleague, who raised the prospect of using a different approach to the "band edge detection" part of the algorithm.

Instead of having the user specify `bg.limits`, could we implement a function that finds the linear fit that (a) maximises the number of fitted points, while (b) maximising some goodness-of-fit measure (the latter remains to be defined)?

I think such an approach, if implemented naively, would incur a large computational penalty.

The major upside is the elimination of two (possibly four) of the six input parameters.
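To make the trade-off concrete, here is a minimal sketch of the naive version of this idea: brute-force over all contiguous windows, fit a line to each, and keep the widest window whose fit is still "good enough". All names are hypothetical, and `max_rmse` merely stands in for the goodness-of-fit measure that remains to be defined. The nested loops (O(n²) windows, each refit from scratch) illustrate exactly where the computational penalty would come from.

```python
import numpy as np

def longest_linear_run(x, y, min_points=5, max_rmse=0.05):
    """Hypothetical sketch: find the widest contiguous window [i, j)
    whose straight-line fit keeps the RMSE below max_rmse.
    Brute-force O(n^2) windows, each refit independently."""
    n = len(x)
    best = None  # (width, i, j, slope, intercept)
    for i in range(n - min_points + 1):
        for j in range(i + min_points, n + 1):
            slope, intercept = np.polyfit(x[i:j], y[i:j], 1)
            resid = y[i:j] - (slope * x[i:j] + intercept)
            rmse = np.sqrt(np.mean(resid ** 2))
            if rmse <= max_rmse:
                width = j - i
                if best is None or width > best[0]:
                    best = (width, i, j, slope, intercept)
    return best

# Toy example: a linear segment followed by an oscillating tail.
x = np.linspace(0, 10, 100)
y = np.where(x < 6, 2.0 * x + 1.0, 10.0 * np.sin(5 * x))
result = longest_linear_run(x, y, max_rmse=0.1)
```

A smarter implementation could reuse running sums (incremental least squares) to update each fit in O(1) per point, but even then the window search itself is the expensive part compared with a user-supplied `bg.limits`.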