LabeliaLabs / distributed-learning-contributivity

Simulate collaborative ML scenarios, experiment multi-partner learning approaches and measure respective contributions of different datasets to model performance.
https://www.labelia.org
Apache License 2.0

Dynamic change of mplc parameters during training #242

Closed arthurPignet closed 4 years ago

arthurPignet commented 4 years ago

Some contributivity measurement methods need to change some hyperparameters during training (for instance the aggregation weights, or the list of partners that take part in a given training epoch).

For now, this is done artificially by stopping the training and re-running a new mpl for one epoch, initialized with the model from the previous run, as sketched below.
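
Roughly, the current pattern looks like this (purely illustrative, with a placeholder model and hypothetical names rather than the real mplc classes):

```python
# Illustrative sketch of the stop-and-restart workaround (hypothetical names,
# not the real mplc API): a fresh one-epoch run is created for every epoch,
# warm-started from the model produced by the previous run.

def run_one_epoch(model, aggregation_weights):
    """Placeholder for a single multi-partner training epoch."""
    # ... federated-averaging-style update using `aggregation_weights` ...
    return model

model = None  # model carried over from the previous one-epoch run
for epoch in range(5):
    # Hyperparameters that should change mid-training
    aggregation_weights = [0.5, 0.5] if epoch < 3 else [0.8, 0.2]
    model = run_one_epoch(model, aggregation_weights)
```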

One solution could be to add an agent (a new class) that is passed as a parameter to the mpl. At each batch/epoch, the mpl could query the agent, which would return the new parameters.
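
A minimal sketch of that idea, assuming a hypothetical `HyperparameterAgent` class and an `on_epoch_begin` hook (all names are illustrative, not the actual mplc API):

```python
# Sketch of the proposed "agent" object: passed to the multi-partner learning
# loop and queried at each epoch (or batch) for updated hyperparameters.

class HyperparameterAgent:
    """Returns the hyperparameters to use for a given epoch."""

    def __init__(self, partners, weight_schedule):
        self.partners = partners
        self.weight_schedule = weight_schedule  # e.g. {epoch: [w1, w2, ...]}

    def on_epoch_begin(self, epoch):
        # Aggregation weights for this epoch (fall back to uniform weights)
        default = [1 / len(self.partners)] * len(self.partners)
        weights = self.weight_schedule.get(epoch, default)
        # Partners taking part in this epoch (here: all of them)
        return {"aggregation_weights": weights, "partners": self.partners}


def train(agent, nb_epochs):
    """Illustrative training loop that queries the agent at each epoch."""
    for epoch in range(nb_epochs):
        params = agent.on_epoch_begin(epoch)
        print(f"epoch {epoch}: weights={params['aggregation_weights']}, "
              f"partners={params['partners']}")
        # ... run one multi-partner epoch with these parameters ...


train(HyperparameterAgent(["A", "B"], {1: [0.8, 0.2]}), nb_epochs=3)
```

This would let each contributivity method provide its own agent (e.g. one that drops a partner for certain epochs, or one that re-weights the aggregation) without restarting the training loop.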