Open akusok opened 9 months ago
In federated methods, we never see the data (X, Y) directly. Instead, we train on (H'H, H'Y).
All code for each user lives in a separate function (or a separate code file). Only that code can read that user's data (X, Y). Users share only the processed data (H'H, H'Y).
Code example:

```python
def user_1_function(W, B):
    x1, y1 = load_data()    # raw data only exists inside this function
    H1 = ...                # hidden-layer output computed from x1, W, B
    HH = H1.T.dot(H1)
    HY = H1.T.dot(y1)
    return (HH, HY)

W, B = generate_weights_and_biases()
user1_data = user_1_function(W, B)
```
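A hypothetical sketch (not from the thread) of what the server side could look like: because H'H and H'Y are sums over rows (samples), the server can simply add up each user's shared matrices and solve for the ELM output weights without ever seeing raw (X, Y). The `tanh` activation, the ridge term, and the row-wise data split are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def user_contribution(X, Y, W, bias):
    """One user's share: compute the hidden layer H locally, return (H'H, H'Y)."""
    H = np.tanh(X @ W + bias)          # hidden-layer activations
    return H.T @ H, H.T @ Y

# Toy data split by rows across three users (the server never sees X, Y)
n_features, n_hidden, n_outputs = 4, 10, 2
W = rng.standard_normal((n_features, n_hidden))
bias = rng.standard_normal(n_hidden)

X_full = rng.standard_normal((30, n_features))
Y_full = rng.standard_normal((30, n_outputs))
users = [(X_full[i::3], Y_full[i::3]) for i in range(3)]

# Server aggregates the shared matrices by simple summation
HH = np.zeros((n_hidden, n_hidden))
HY = np.zeros((n_hidden, n_outputs))
for X_u, Y_u in users:
    HH_u, HY_u = user_contribution(X_u, Y_u, W, bias)
    HH += HH_u
    HY += HY_u

# Solve (H'H) B = H'Y for the output weights; tiny ridge term for stability
B = np.linalg.solve(HH + 1e-6 * np.eye(n_hidden), HY)

# Sanity check: identical to training centrally on the pooled data
H_full = np.tanh(X_full @ W + bias)
B_central = np.linalg.solve(H_full.T @ H_full + 1e-6 * np.eye(n_hidden),
                            H_full.T @ Y_full)
assert np.allclose(B, B_central)
```

The key design point is that summation of per-user (H'H, H'Y) is exact, not an approximation, as long as users hold disjoint rows of the data.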
Hi, Anton. Sorry, I am confused now. Let's say the server calls `user_1_function(W, B)`, which returns (H'H, H'Y). Can the server then call it with an aggregate of the H'H, H'Y of C1, C2, C3? I couldn't figure out how the FL server manages the process.
You should have another function for that, like "compute_accuracy(HH, HY)".
Or you can make a user an object that stores W, bias, HH_own, HY_own, [HH of other users], [HY of other users]. Then give that object methods like "get_data(W, bias) -> HH_own, HY_own", "add_other_users_data(HH, HY)", and "compute_accuracy()". You can create whatever data structures and objects you need, don't limit yourself to what we discussed.
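A minimal sketch of that user-object idea, using the method names suggested above. Everything else is assumed for illustration: the `tanh` activation, the ridge term, and `compute_accuracy` reporting mean squared error on the user's own data.

```python
import numpy as np

class User:
    """Holds raw (X, Y) privately; shares only (H'H, H'Y)."""

    def __init__(self, X, Y):
        self._X, self._Y = X, Y        # raw data, never leaves this object
        self.HH = None
        self.HY = None

    def get_data(self, W, bias):
        """Compute own (H'H, H'Y) from the shared projection and return it."""
        self._W, self._bias = W, bias
        H = np.tanh(self._X @ W + bias)
        self.HH = H.T @ H
        self.HY = H.T @ self._Y
        return self.HH, self.HY

    def add_other_users_data(self, HH, HY):
        """Fold another user's shared matrices into our aggregate."""
        self.HH += HH
        self.HY += HY

    def compute_accuracy(self):
        """Solve for output weights from the aggregated matrices and score
        the model on this user's own data (MSE; lower is better)."""
        n = self.HH.shape[0]
        B = np.linalg.solve(self.HH + 1e-6 * np.eye(n), self.HY)
        H = np.tanh(self._X @ self._W + self._bias)
        return float(np.mean((H @ B - self._Y) ** 2))

# Usage: two users exchange shared matrices, then user 1 evaluates locally
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 8))
bias = rng.standard_normal(8)
u1 = User(rng.standard_normal((20, 4)), rng.standard_normal((20, 1)))
u2 = User(rng.standard_normal((20, 4)), rng.standard_normal((20, 1)))
HH2, HY2 = u2.get_data(W, bias)
u1.get_data(W, bias)
u1.add_other_users_data(HH2, HY2)
err = u1.compute_accuracy()
```

Each call to `get_data` is the only place raw data is touched, which matches the constraint that only the user's own code reads (X, Y).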
The paper goes approximately like this:
Introduction
Experimental setup for federated learning in the paper
Present our implementation of federated ELM
New stuff in the paper