bias-variance decomposition: what exactly is variance
hi,
In the derivation of the bias-variance decomposition, I understand that the bias term measures the systematic error in the prediction at a point x0, while the variance term quantifies how much the prediction f_S(x0) varies as the training set S varies. So, as I understand it, variance refers to how the trained model itself changes with the training set. What I don't understand is why, in the first 2-3 pages, variance is referred to as the "variance of L_D(f_S)". How are these two quantities related? In other words, how do we extend the notion of variation in a single point's prediction to the variance of the expected risk over the distribution D?
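To make the two quantities I mean concrete, here is a small simulation sketch (my own toy setup, not from the notes): a linear model fit on many independently drawn training sets S, measuring both the variance of the pointwise prediction f_S(x0) and the variance of the test-set risk L_D(f_S), where a large held-out sample stands in for the distribution D.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # Hypothetical ground-truth regression function (my assumption for the toy setup)
    return 2.0 * x + 1.0

def sample_set(n):
    # Draw n i.i.d. (x, y) pairs with Gaussian label noise
    x = rng.uniform(-1.0, 1.0, n)
    y = true_f(x) + rng.normal(0.0, 0.5, n)
    return x, y

x0 = 0.3                            # the query point
x_test, y_test = sample_set(10_000) # large sample standing in for D

preds_x0, risks = [], []
for _ in range(500):                # 500 independent training sets S
    x_tr, y_tr = sample_set(20)
    w, b = np.polyfit(x_tr, y_tr, 1)                        # fit f_S by least squares
    preds_x0.append(w * x0 + b)                             # f_S(x0)
    risks.append(np.mean((w * x_test + b - y_test) ** 2))   # estimate of L_D(f_S)

print("Var_S[f_S(x0)]  =", np.var(preds_x0))
print("Var_S[L_D(f_S)] =", np.var(risks))
```

Both quantities vary with S, but the first is the pointwise variance from the decomposition and the second is the variance of the whole expected risk, which is what I'm asking how to connect.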
Thanks