-
We now support permutation importance; maybe it's reasonable to support a feature selection method based on it (e.g., remove features whose permutation importance is < 0).
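A minimal sketch of that selection rule, using scikit-learn's `permutation_importance` as a stand-in (the data, model, and threshold here are illustrative, not a proposed API):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Toy data and model; any fitted estimator with a score method works here.
X, y = make_classification(n_samples=500, n_features=10, n_informative=3, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Keep only features whose mean permutation importance on held-out data is > 0.
result = permutation_importance(model, X_val, y_val, n_repeats=10, random_state=0)
keep = result.importances_mean > 0   # boolean mask over features
X_selected = X[:, keep]
print(keep.sum(), "of", X.shape[1], "features kept")
```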
-
Has anyone compared RF feature importance to PPS?
My one concern with PPS is that it is a univariate calculation. Even in the matrix, the numbers correspond to a single feature's predictive power o…
-
### Is the feature request related to a problem? Please describe.
For many RSS feed contacts the avatar is the default avatar. This makes it difficult to visually scan for certain content. Being ab…
-
## Motivation
I believe the current DQN Losses don't apply importance sampling weights. This is almost always applied when using a PER Buffer.
## Solution
The PER buffer outputs "_weight" in in…
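A sketch of how those weights are usually applied: multiply each transition's TD loss by its importance-sampling weight before averaging. The function and variable names are illustrative, not any library's actual API:

```python
import numpy as np

def weighted_td_loss(q_pred, q_target, weights):
    """Mean of IS-weighted squared TD errors; weights correct PER's sampling bias."""
    td_error = q_target - q_pred
    per_sample_loss = td_error ** 2
    return np.mean(weights * per_sample_loss)

# Toy batch; in practice `weights` would be the "_weight" values from the PER buffer.
q_pred = np.array([1.0, 2.0, 3.0])
q_target = np.array([1.5, 2.0, 2.5])
weights = np.array([0.5, 1.0, 1.0])   # w_i ∝ (N * P(i))^-beta, typically normalized
print(weighted_td_loss(q_pred, q_target, weights))  # → 0.125
```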
-
## Description
Importance weighted moment matching (IWMM) is an implicitly adaptive importance sampling method that improves proposal distributions in importance sampling by performing affine trans…
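A rough sketch of the moment-matching step (assumptions: self-normalized weights, Gaussian-style matching of the first two moments via a Cholesky-based affine map; not the paper's full algorithm):

```python
import numpy as np

def iwmm_transform(draws, weights):
    """Affine-transform proposal draws so their mean/covariance match the
    importance-weighted mean/covariance."""
    w = weights / weights.sum()
    mu_w = w @ draws                              # weighted mean
    centered = draws - mu_w
    cov_w = centered.T @ (centered * w[:, None])  # weighted covariance
    mu = draws.mean(axis=0)
    cov = np.cov(draws, rowvar=False)
    # Map x -> mu_w + L_w @ L^-1 @ (x - mu), where cov = L L^T and cov_w = L_w L_w^T.
    L = np.linalg.cholesky(cov)
    L_w = np.linalg.cholesky(cov_w)
    A = L_w @ np.linalg.inv(L)
    return mu_w + (draws - mu) @ A.T

rng = np.random.default_rng(0)
draws = rng.normal(size=(2000, 2))
logw = draws @ np.array([0.5, -0.2])  # toy unnormalized log importance weights
new_draws = iwmm_transform(draws, np.exp(logw))
```

After the transform, the plain sample mean of `new_draws` equals the weighted mean of the original draws, which is the sense in which the proposal "adapts" toward the target.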
-
**Is your feature request related to a problem? Please describe.**
I currently maintain [LOFO Importance](https://github.com/aerdem4/lofo-importance) package which is a model agnostic, validation sc…
-
When sorting features in Bing tiles, the tests fail on property comparison.
For example, with bing tile 4-8-5:
Here is an MVT property:
> mvtProperties: {label-importance=6, has-icon=true, …
-
**Is your feature request related to a problem? Please describe.**
Sample weights are used in machine learning to describe the importance of each sample to the model.
For example, regression based …
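A small illustration of the effect (scikit-learn's `sample_weight` argument; the data and weights are made up for the example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Three points on y = x plus one outlier at (3, 10).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 2.0, 10.0])

plain = LinearRegression().fit(X, y)
# Down-weight the outlier: its influence on the fit shrinks accordingly.
down = LinearRegression().fit(X, y, sample_weight=np.array([1.0, 1.0, 1.0, 0.01]))
print(plain.coef_[0], down.coef_[0])  # slope drops back toward 1 when the outlier is down-weighted
```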
-
Hi all,
For the Permutation feature importance procedure, the default number of iterations, `n_iter`, is 5.
See: https://eli5.readthedocs.io/en/latest/autodocs/sklearn.html#eli5.sklearn.permutation_im…
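For comparison, scikit-learn exposes the same knob as `n_repeats`; averaging over more shuffles generally tightens the importance estimate. A quick sketch (toy data, illustrative values):

```python
from sklearn.datasets import make_regression
from sklearn.inspection import permutation_importance
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=300, n_features=5, random_state=0)
model = Ridge().fit(X, y)

# n_repeats plays the role of eli5's n_iter: how many shuffles are averaged per feature.
few = permutation_importance(model, X, y, n_repeats=5, random_state=0)
many = permutation_importance(model, X, y, n_repeats=50, random_state=0)
print(few.importances_std.mean(), many.importances_std.mean())
```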
-
I recently [came across](https://explained.ai/rf-importance/index.html#intro) the idea of grouping features that are of particular interest during permutation importance, that is, shuffling more than …
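The idea can be sketched directly: apply one shared row permutation to every column in the group, so within-group correlations are preserved while the group's link to the target is broken. The helper name and data here are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def group_permutation_importance(model, X, y, group, n_iter=5, seed=0):
    """Mean score drop from shuffling all columns in `group` together."""
    rng = np.random.default_rng(seed)
    base = model.score(X, y)
    drops = []
    for _ in range(n_iter):
        Xp = X.copy()
        perm = rng.permutation(len(X))
        Xp[:, group] = X[perm][:, group]  # same row permutation for every column in the group
        drops.append(base - model.score(Xp, y))
    return float(np.mean(drops))

X, y = make_classification(n_samples=400, n_features=6, n_informative=3, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)
print(group_permutation_importance(model, X, y, group=[0, 1]))
```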