bean217 opened 8 months ago
Do you have a reference for the correctness of taking Silverman's rule in each dimension as you currently do?
In one dimension, the kernel bandwidth is a single number. In two dimensions it is three numbers: the two diagonal entries of the covariance matrix plus the off-diagonal entry. If you compute the 1D Silverman estimate along each dimension separately, you do not account for correlations between the variables.
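For reference, the usual multivariate generalization (the rule of thumb in Silverman's 1986 book) keeps the correlations by scaling the full sample covariance matrix rather than treating each dimension independently. A minimal sketch of that idea (function name and the decision to return a full bandwidth matrix are my assumptions, not this library's API):

```python
import numpy as np

def silverman_bandwidth_matrix(data):
    """Multivariate Silverman rule of thumb.

    data: array of shape (n, d), one sample per row.
    Returns a full d x d bandwidth matrix H = scale**2 * Sigma, where Sigma
    is the sample covariance, so cross-dimension correlations are retained.
    """
    n, d = data.shape
    # Scalar factor from the multivariate rule of thumb:
    # (4 / ((d + 2) * n)) ** (1 / (d + 4))
    scale = (4.0 / ((d + 2) * n)) ** (1.0 / (d + 4))
    sigma = np.cov(data, rowvar=False)        # d x d sample covariance
    # Squared because H plays the role of h**2 in the Gaussian kernel.
    return scale ** 2 * np.atleast_2d(sigma)
```

Taking only the diagonal of `H` recovers the per-dimension estimates; the off-diagonal entries are exactly what the independent 1D approach drops.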
Have you thought about this? Have you looked at the literature?
I fear that generalizing Silverman's rule to higher dimensions is not as easy as your current code sketch makes it out to be.
I apologize for my extremely naive implementation. I had made some pretty terrible assumptions, so I'm going to actually read up on this some more rather than continuing to submit something incorrect.
No worries, have a look at it! If you manage to figure it out, it could improve the library.
silvermans_rule
The algorithm is essentially the same, but with some handling for multidimensional data added. I also wrote tests that check the shape of the output.
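Since the diff isn't shown here, a minimal sketch of the kind of shape test described might look like the following. The `silvermans_rule` signature and its per-column behaviour are assumptions for illustration, not the actual implementation in the PR:

```python
import numpy as np

def silvermans_rule(data):
    # Hypothetical per-dimension sketch: apply the rule-of-thumb scaling
    # to each column's standard deviation of an (n, d) array.
    n, d = data.shape
    sigma = np.std(data, axis=0, ddof=1)
    return sigma * (4.0 / ((d + 2) * n)) ** (1.0 / (d + 4))

def test_bandwidth_shape():
    # One bandwidth per dimension: output shape should be (d,).
    data = np.random.default_rng(42).normal(size=(200, 3))
    assert silvermans_rule(data).shape == (3,)
```

Note this per-dimension form is exactly the diagonal-only variant discussed above; a test like this checks shape, not whether correlations are handled.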