
"It is good to let your model overfit"

Hello,

I was quite surprised when I read that this sentence (Q19) is marked as correct. So I looked it up online, and I found that when dealing with neural nets, we more often underfit than overfit. Is that why the answer is correct?

Thanks for your help

(this is from the 2017 exam)

The full sentence marked as correct is:
"In practice, it could be good to let your model first overfit your task, and then apply drop-out or other regularization techniques."

This is something practitioners often recommend: by first checking whether your model can perfectly fit the training data, you verify that 1) the model is actually expressive/powerful enough and 2) the optimization algorithm works as it should for that model.
Once this is verified, the only remaining task is to reduce the overfitting, which in practice is typically easy to do with dropout or other regularization.

The workflow in the other direction, starting from a badly fitting model, is just less practical (we can't tell why the model fits badly or what went wrong in training).
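A minimal sketch of this workflow, assuming PyTorch and a small synthetic dataset (both are my assumptions, not part of the exam question): first train a model without regularization to confirm it can (nearly) memorize the training set, then re-train the same architecture with dropout as the regularization step.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical tiny classification task, for illustration only.
X = torch.randn(128, 20)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).long()

def make_model(p_drop: float) -> nn.Module:
    # Same architecture in both steps; only the dropout rate changes.
    return nn.Sequential(
        nn.Linear(20, 64), nn.ReLU(), nn.Dropout(p_drop),
        nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p_drop),
        nn.Linear(64, 2),
    )

def train(model: nn.Module, epochs: int = 500) -> float:
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    # Evaluate training accuracy with dropout disabled.
    model.eval()
    with torch.no_grad():
        return (model(X).argmax(dim=1) == y).float().mean().item()

# Step 1: no regularization. Training accuracy near 1.0 confirms the model is
# expressive enough and the optimizer works (i.e. it can overfit the data).
print("train acc, no dropout:", train(make_model(p_drop=0.0)))

# Step 2: same architecture, now trained with dropout to reduce the overfitting.
print("train acc, dropout=0.5:", train(make_model(p_drop=0.5)))
```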

For this question, I was also surprised that two answers were correct, even though the wording suggested only one correct answer: "Which of the following statements is correct?"
Given that these MCQs carry negative points, is it possible to ensure that the questions can't mislead us?

This year, every MC question has exactly one correct answer.

