kdmalc / personalization-privacy-risk

Privacy analysis for ML and classical filtering personalization parameters

Validate whether the final global model from earlier runs serves as a good init for later trials/users/block2 #8

Closed kdmalc closed 1 year ago

kdmalc commented 1 year ago

For NoFL: Using previous decs results in a better init and potentially better long-term convergence. Need to plot with a ylim of 100 or something
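A minimal sketch of the warm-start comparison, on a toy least-squares decoder cost (the data shapes, the cost function, and the variable names are all hypothetical placeholders, not taken from the repo):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))                    # toy neural features
W_true = rng.standard_normal((10, 2))
Y = X @ W_true + 0.1 * rng.standard_normal((200, 2))  # toy cursor velocities

def cost(d_flat):
    # Flattened decoder -> mean squared tracking error (toy stand-in cost)
    D = d_flat.reshape(10, 2)
    return np.mean((X @ D - Y) ** 2)

cold_init = rng.standard_normal(20)

# "Previous run": optimize once, then reuse its final dec as the warm start
prev = minimize(cost, rng.standard_normal(20), method="L-BFGS-B")
warm_init = prev.x

# When plotting the two resulting cost curves, something like
# plt.ylim(0, 100) makes the ending costs comparable at a glance.
print(cost(warm_init), cost(cold_init))
```

The warm start should begin at a much lower cost than the cold start, which is the "better init" effect described above.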

kdmalc commented 1 year ago

For 1 Scipy Step: it appears to start better and then follows the exact same trajectory. Again, need to plot with a ylim to confirm the ending cost

kdmalc commented 1 year ago

For 1 Scipy Step: Can't tell any difference when using the prev global dec vs using the prev corresponding client dec. Not sure why this would be the case

kdmalc commented 1 year ago

For 10 Scipy Steps: The init is much better, but they appear to converge to the same final loss. I guess this is what we would expect if it's actually converging to the min? Not sure how the sampling / randomness fits into this
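A sketch of why a better init can still end at the same final loss: for a convex cost, any init that is run to convergence reaches the same minimum, so only the starting point of the curve differs. This toy quadratic (hypothetical, not the repo's cost) makes that concrete:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)

def cost(d):
    # Convex least-squares cost: unique minimum regardless of init
    return np.sum((A @ d - b) ** 2)

# "Good" init near the optimum vs. a distant random init
near_init = np.linalg.lstsq(A, b, rcond=None)[0] + 0.01 * rng.standard_normal(5)
far_init = 10.0 * rng.standard_normal(5)

res_near = minimize(cost, near_init, method="L-BFGS-B")
res_far = minimize(cost, far_init, method="L-BFGS-B")
print(res_near.fun, res_far.fun)  # essentially identical final losses
```

If the actual cost is non-convex, matching final losses would instead suggest both inits fall in the same basin; the sampling/randomness question would then be about whether stochastic minibatching can kick them into different basins.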

kdmalc commented 1 year ago

Again, for 10 Scipy Steps: using the prev global dec vs the prev client decs is indistinguishable

kdmalc commented 1 year ago

Granted, the final client decs are what generated the final global dec
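That observation may be the whole explanation: if the global dec is a FedAvg-style mean of the final client decs (an assumption here, not confirmed from the repo), and the clients have converged near each other, then the global dec is nearly identical to each client dec, so warm-starting from either should be indistinguishable. A toy illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
base = rng.standard_normal((8, 2))  # hypothetical shared solution

# Final client decs that have converged close to one another
client_decs = [base + 0.01 * rng.standard_normal((8, 2)) for _ in range(5)]

# FedAvg-style aggregation: global dec = mean of client decs (assumed)
global_dec = np.mean(client_decs, axis=0)

# Relative gap between the global dec and any one client dec
rel_gap = np.linalg.norm(global_dec - client_decs[0]) / np.linalg.norm(client_decs[0])
print(rel_gap)  # tiny: the two warm starts are essentially the same point
```

Conversely, if client decs had stayed far apart, the two warm starts would differ, so this is a quick sanity check worth running on the real decs.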

kdmalc commented 1 year ago

As found 2/28 and 3/1, the resulting cost curve depends more strongly on the magnitude of the input init dec than on its actual values. Thus, while the personalized models may (or may not) be better than the global or random inits, it is difficult to tell, since the optimization method appears to be minimizing the dec norm instead of the error...

Also note that the reason the cost curves are exactly the same is that the input init decs have essentially the same magnitude.

We could change the penalty terms to try to correct for this, but that would alter the experiment. We could likewise run the code on different conditions and see what happens.
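The magnitude-dominance effect above can be reproduced with a toy penalized cost of the form J(D) = error(D) + lam * ||D||_F^2 (the form of the penalty and the value of lam are assumptions for illustration, not the repo's actual terms). When lam is large, two inits with the same norm give nearly identical initial cost regardless of their values, while rescaling an init changes the cost dramatically:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 8))
Y = rng.standard_normal((100, 2))
lam = 100.0  # hypothetical, deliberately large decoder-norm penalty

def J(D):
    # Tracking error plus Frobenius-norm penalty on the dec
    return np.mean((X @ D - Y) ** 2) + lam * np.sum(D ** 2)

D1 = rng.standard_normal((8, 2))
D2 = rng.standard_normal((8, 2))
D2 *= np.linalg.norm(D1) / np.linalg.norm(D2)  # same magnitude, different values

same_mag_gap = abs(J(D1) - J(D2))  # small: penalty terms are identical
scale_gap = abs(J(D1) - J(2 * D1))  # large: penalty scales with ||D||^2
print(same_mag_gap, scale_gap)
```

If the real cost behaves like this, curves for same-norm inits will overlap almost exactly, which matches the observation above; sweeping lam down would reveal at what point the error term starts to dominate again.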

kdmalc commented 1 year ago

This should have already been finished and presented in the 599 report.