Closed · the-programmerr closed this 1 year ago
Hi! Theoretically, using a KNN classifier in a federated/distributed setting would be possible. However, I would like to note that it would involve a few critical (and different) steps.

Firstly, there is not much "training" in KNN. One could say that saving the training dataset (features and labels) counts as training; but with that notion, we would be sending all the data (instead of weights) to the server to build a global model, and the training would take just one step. That would violate the federated-learning assumption that the data stays local, since we would be bringing it to the server. It would then just be regular centralized learning. So, let's drop this approach and look at an alternative.

The alternative is to send the query data to all (or a fraction of) the clients for inference, so there is no single global model. As a response, each client would return, e.g., its k (or more) nearest distances together with the corresponding labels. Then, globally, we would vote on the class to predict. In this scenario, the query data would have to be encrypted.

So, I think that finding/implementing a dedicated library for that purpose would be the more appropriate step, since there are no traditional steps like local fitting and evaluation here. I hope that answers your question.
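To make the alternative concrete, here is a minimal sketch of the idea (not part of any existing framework; the function names `client_candidates` and `server_predict` are hypothetical, and encryption of the query is omitted): each client computes only its k nearest (distance, label) pairs on its private data, and the server merges these candidates and votes.

```python
import numpy as np
from collections import Counter

def client_candidates(local_X, local_y, query, k):
    """Runs on a client: return the k nearest (distance, label) pairs
    for the query, without revealing the raw local dataset."""
    dists = np.linalg.norm(local_X - query, axis=1)
    idx = np.argsort(dists)[:k]
    return list(zip(dists[idx], local_y[idx]))

def server_predict(all_candidates, k):
    """Runs on the server: merge every client's candidates, keep the
    globally k nearest, and majority-vote on their labels."""
    merged = sorted((c for cands in all_candidates for c in cands),
                    key=lambda pair: pair[0])
    top_k = merged[:k]
    return Counter(label for _, label in top_k).most_common(1)[0][0]

# Toy data: two clients, each holding one cluster/class locally.
client1 = (np.array([[0.0, 0.0], [0.1, 0.2], [0.3, 0.1]]),
           np.array([0, 0, 0]))
client2 = (np.array([[3.0, 3.0], [2.9, 3.2], [3.1, 2.8]]),
           np.array([1, 1, 1]))

query = np.array([2.8, 3.1])  # lies in client2's cluster
candidates = [client_candidates(X, y, query, k=3)
              for X, y in (client1, client2)]
print(server_predict(candidates, k=3))  # -> 1
```

Note that only distances and labels leave each client, which is the privacy argument in the comment above; whether that leakage is acceptable (and how to encrypt the query itself) is exactly where a dedicated library would come in.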
@the-programmerr, since there has been no further comment on this for three weeks and @adam-narozniak has answered the question, I am closing the ticket. @the-programmerr, feel free to reopen it if any other questions arise.
Hi @the-programmerr, were you able to find any solution for this? I couldn't find anything anywhere, so I assume no one has done it before. Let me know if you find something. Thanks!
What is your question?
Hello, I have one question: can I use this framework with a "KNeighborsClassifier" model, since it is a non-parametric model?