First experiments show that neighbor-based models have a hard time coming up with a decent model for 6 features. It is also hard to get an intuition for how decisions are made. For background on nearest-neighbor classification, the scikit-learn documentation is useful:
https://scikit-learn.org/stable/modules/neighbors.html#nearest-neighbors-classification
It also seems especially hard to fight overfitting. For a general intuition of overfitting, this example is useful:
https://scikit-learn.org/stable/auto_examples/model_selection/plot_underfitting_overfitting.html
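As a quick sanity check on our own setup, a sketch like the one below (synthetic 6-feature data as a stand-in for ours) shows the usual pattern: a very small n_neighbors memorizes the training set, a very large one underfits.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for our 6-feature data.
X, y = make_classification(n_samples=500, n_features=6, n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Compare train vs. test accuracy for a few neighborhood sizes.
for k in (1, 5, 15, 50):
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    print(f"k={k:2d}  train={clf.score(X_train, y_train):.2f}  test={clf.score(X_test, y_test):.2f}")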
For a first experiment we restrict ourselves to 2 features and use code like this
https://scikit-learn.org/stable/auto_examples/neighbors/plot_classification.html#sphx-glr-auto-examples-neighbors-plot-classification-py
to display the decision boundaries.
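A condensed sketch in the spirit of that example; iris (first two features) and n_neighbors=15 are only placeholders for our own two features and parameter choice.

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.inspection import DecisionBoundaryDisplay
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # restrict to the first two features

clf = KNeighborsClassifier(n_neighbors=15).fit(X, y)

# Plot the decision regions, with the training points on top.
disp = DecisionBoundaryDisplay.from_estimator(clf, X, response_method="predict", alpha=0.5)
disp.ax_.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.show()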
To come up with a good number of neighbors (and possibly other parameters) we could use a hyperparameter search like the ones provided by https://scikit-learn.org/stable/modules/grid_search.html
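A minimal sketch of such a search with GridSearchCV; the synthetic data and the parameter grid (including the extra weights parameter) are just assumptions to illustrate the idea.

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in data; replace with the real 6-feature data.
X, y = make_classification(n_samples=500, n_features=6, n_informative=4, random_state=0)

# Cross-validated grid search over the number of neighbors and the weighting scheme.
param_grid = {"n_neighbors": [1, 3, 5, 11, 21, 51], "weights": ["uniform", "distance"]}
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)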