
CV ridge regression

Hello,

While doing cross-validation for ridge regression, I tried to plot the minimal loss for each degree (for the train and test sets). I obtained the two following plots, one with a null lambda (no penalty) and the other with a very high lambda (lambda = 100). However, I didn't get the curves I expected: is it normal that in both cases the loss on the test set gets very large for high degrees? Moreover, both plots follow the same trend, although one is penalized and the other is not. Do you have any ideas on how I could solve my problem? Thank you very much.

[Attachments: 1.jpg, 2.jpg — train/test loss vs. polynomial degree, for lambda = 0 and lambda = 100]

I think that's indeed a bit unexpected, but can still occur.

If you want a further sanity check, you can fix the polynomial degree at 14 and then plot the loss as a function of multiple lambda values (preferably on a logarithmic scale) and see whether you get a meaningful trend.
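A minimal sketch of that sanity check in NumPy, assuming a 1-D input, RMSE as the loss, k-fold cross-validation, and the closed-form ridge solution with a 2*N*lambda penalty scaling (adjust to your course's convention). The synthetic dataset, function names, and parameter values are illustrative assumptions, not your actual code:

```python
import numpy as np

def build_poly(x, degree):
    """Polynomial feature matrix [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

def ridge_regression(tx, y, lambda_):
    """Closed-form ridge solution: w = (X^T X + 2*N*lambda*I)^{-1} X^T y."""
    n, d = tx.shape
    a = tx.T @ tx + 2 * n * lambda_ * np.eye(d)
    return np.linalg.solve(a, tx.T @ y)

def rmse(tx, y, w):
    """Root-mean-square error of the linear model w on (tx, y)."""
    e = y - tx @ w
    return np.sqrt(np.mean(e ** 2))

def cv_ridge(x, y, degree, lambdas, k=4, seed=1):
    """Mean train/test RMSE over k folds, for each lambda in lambdas."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    tr_err, te_err = [], []
    for lam in lambdas:
        tr, te = [], []
        for f in range(k):
            te_idx = folds[f]
            tr_idx = np.concatenate([folds[j] for j in range(k) if j != f])
            tx_tr = build_poly(x[tr_idx], degree)
            tx_te = build_poly(x[te_idx], degree)
            w = ridge_regression(tx_tr, y[tr_idx], lam)
            tr.append(rmse(tx_tr, y[tr_idx], w))
            te.append(rmse(tx_te, y[te_idx], w))
        tr_err.append(np.mean(tr))
        te_err.append(np.mean(te))
    return np.array(tr_err), np.array(te_err)

# Synthetic noisy data (assumption; substitute your own dataset)
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 60)
y = np.cos(3 * x) + 0.3 * rng.standard_normal(len(x))

# Fixed degree 14, lambdas swept on a log scale as suggested above
lambdas = np.logspace(-6, 2, 20)
tr_err, te_err = cv_ridge(x, y, degree=14, lambdas=lambdas)
print("best lambda:", lambdas[te_err.argmin()])
```

Plotting `tr_err` and `te_err` against `lambdas` with `plt.semilogx` should show training error rising monotonically with lambda, and test error dipping at some intermediate lambda before rising again; if it does not, the bug is likely in how the penalty enters the normal equations or in the train/test split.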

Hello!
I have the same problem. Did you find any mistakes, or understand why it looks like this?
