Open nbgl opened 7 years ago
Say I have a predictor p trained on data points x1, …, xn. This predictor has predictive variance v at a test point t.
Say I also have a predictor p′ trained on x1, …, xn, xn+1. It has predictive variance v′ at t.
Is it necessarily true that v′ ≤ v? My conjecture is yes, but I can't seem to prove it.
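If "predictor" means exact GP regression with fixed kernel hyperparameters and fixed noise, here is a minimal numerical sanity check of the conjecture. The RBF kernel, the noise level, and the synthetic data are my assumptions for illustration, not anything from the issue:

```python
# Sanity check: with fixed hyperparameters, does adding a training point
# ever increase the GP posterior variance at a test point t?
# (Kernel, noise level, and data below are illustrative assumptions.)
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 l^2))
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def posterior_var(x_train, t, noise=0.1):
    # Standard GP posterior variance at test point t:
    #   k(t, t) - k(t, X) [K(X, X) + sigma^2 I]^{-1} k(X, t)
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_t = rbf(np.array([t]), x_train)          # shape (1, n)
    prior = rbf(np.array([t]), np.array([t]))  # shape (1, 1)
    return (prior - k_t @ np.linalg.solve(K, k_t.T))[0, 0]

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=10)   # x1, ..., xn
x_new = rng.uniform(-3, 3)        # x_{n+1}
t = 0.5                           # test point

v = posterior_var(x, t)                          # variance using x1..xn
v_prime = posterior_var(np.append(x, x_new), t)  # variance using x1..x_{n+1}
print(v, v_prime, v_prime <= v + 1e-12)
```

In this setting the check passes for any choice of points, which matches the conjecture; it obviously doesn't constitute a proof, and it says nothing about the case where hyperparameters are re-fit on the enlarged data set.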
I'm still curious about this. I wonder whether there's a general result here.
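In case it helps: for exact GP regression with fixed hyperparameters, a rank-one (Schur complement) update of the posterior seems to give the result directly. Writing $v_n(t)$ for the posterior variance at $t$ after conditioning on $x_1,\dots,x_n$, $c_n(t, x_{n+1})$ for the posterior covariance between $t$ and $x_{n+1}$, and $\sigma^2$ for the observation noise, conditioning on one more noisy observation gives

$$
v_{n+1}(t) \;=\; v_n(t) \;-\; \frac{c_n(t, x_{n+1})^2}{v_n(x_{n+1}) + \sigma^2},
$$

and since the subtracted term is nonnegative, $v_{n+1}(t) \le v_n(t)$. I haven't checked whether anything analogous holds for predictors other than GPs, or when the hyperparameters are re-estimated after adding $x_{n+1}$.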