
Bias-Variance decomposition

I think I got the right figure, just as the exercise PDF shows. However, what I don't understand is: isn't this just the loss-vs-parameter figure? Why does this loss figure demonstrate the bias-variance decomposition? I'm not clear about the relationship between them.

Thanks.

Depends on which figure you're talking about.

Let's assume we're talking about the bias-variance figure (below). On the x-axis is the hyperparameter which controls the degree of the polynomial, i.e. model complexity, and on the y-axis the train/test error (loss).

When the model complexity is too low or too high, the test error is high.

  • Model complexity too low: high bias (underfitting), i.e. high test error AND high train error
  • Model complexity too high: high variance (overfitting), i.e. high test error BUT low train error

We want to be somewhere in the middle, where the test error is lowest, i.e. where the model complexity is just right for the data.
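The picture above can be reproduced numerically. This is a minimal sketch (not the exercise's code): it fits polynomials of increasing degree to a toy noisy-sine dataset and records train/test mean squared error. The data-generating function, noise level, and degree range are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy sine curve (an assumed example, not the exercise's dataset)
x_train = rng.uniform(0, 1, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, 30)
x_test = rng.uniform(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.3, 200)

degrees = list(range(1, 13))
train_err, test_err = [], []
for d in degrees:
    coeffs = np.polyfit(x_train, y_train, d)  # least-squares polynomial fit
    train_err.append(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_err.append(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))

# Low degree: both errors high (underfitting, high bias).
# High degree: train error keeps dropping while test error rises
# (overfitting, high variance). Plot test_err vs. degree to see the U-shape.
best_degree = degrees[int(np.argmin(test_err))]
print("degree with lowest test error:", best_degree)
```

Plotting `train_err` and `test_err` against `degrees` gives the figure discussed above.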

[Figure: train/test error vs. model complexity]

To see what's happening under the hood (in function space), this video is good: https://www.youtube.com/watch?v=EuBBz3bI-aA

Thank you for your reply. I understand the relationship between bias-variance and model complexity. What I mean is that a figure with one line showing the bias, one showing the variance, and one showing the test error would be more convincing, like the last figure in the lecture notes. But I have no idea how to draw such a picture :)

What I mean is that a figure with one line showing the bias, one showing the variance, and one showing the test error would be more convincing, like the last figure in the lecture notes.

The bonus questions in exercise 2 (problem set 4) will help you with that.

