Normalise logistic loss by number of datapoints

Hello.

Is it allowed to normalise the loss (divide by N) returned by the functions implementing logistic regression? I ask because the functions will be tested automatically for correctness.
We would like to do this to make SGD actually work: without the normalisation, a stochastic gradient approximates the direction of the full gradient but not its magnitude, so a different gamma and lambda would be needed for each N (the quick check below illustrates this).
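A quick numeric check of the magnitude mismatch, assuming the summed-loss convention from the labs (the data and names here are just stand-ins): the full gradient's norm grows roughly linearly with N, while a single-sample stochastic gradient's norm does not.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
w = rng.normal(size=2)

for N in (100, 10000):
    tx = rng.normal(size=(N, 2))
    y = (rng.random(N) < 0.5).astype(float)
    # Gradient of the loss summed over all N samples: norm grows ~linearly in N.
    full_grad = tx.T.dot(sigmoid(tx.dot(w)) - y)
    # Gradient from a single sample: its norm does not depend on N,
    # yet SGD uses it in place of the full gradient.
    stoch_grad = tx[0] * (sigmoid(tx[0].dot(w)) - y[0])
    print(f"N={N}: |full grad|={np.linalg.norm(full_grad):.1f}, "
          f"|single-sample grad|={np.linalg.norm(stoch_grad):.2f}")
```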

Thank you in advance.

For the mandatory functions (in implementations.py), use the loss and gradient as in the labs (not divided by the number of samples). These will be automatically graded on a small dataset like the one you saw in the labs. The test script will pass gamma and lambda as input parameters to the functions (as per the signatures given in the project description).
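For concreteness, here is a minimal sketch of the lab-style (unnormalised) loss and gradient, assuming binary labels y in {0, 1} and numpy arrays; the function names are illustrative, not the official signatures from the project description:

```python
import numpy as np

def sigmoid(t):
    """Logistic function, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-t))

def logistic_loss(y, tx, w):
    """Negative log-likelihood, summed over samples (no division by N)."""
    xw = tx.dot(w)
    return np.sum(np.log(1.0 + np.exp(xw)) - y * xw)

def logistic_gradient(y, tx, w):
    """Gradient of the summed negative log-likelihood."""
    return tx.T.dot(sigmoid(tx.dot(w)) - y)
```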

You can use a different function for the Higgs challenge dataset; for that one, the loss and gradient can be divided by the number of samples. This can be the better choice in that setting, for the reasons you mention.
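A corresponding sketch of the normalised variant, under the same assumptions (illustrative names, labels in {0, 1}); dividing by N makes the gradient magnitude, and hence a workable gamma, roughly independent of the dataset size:

```python
import numpy as np

def sigmoid(t):
    """Logistic function, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-t))

def logistic_loss_mean(y, tx, w):
    """Negative log-likelihood averaged over the N samples."""
    xw = tx.dot(w)
    return np.mean(np.log(1.0 + np.exp(xw)) - y * xw)

def logistic_gradient_mean(y, tx, w):
    """Gradient of the averaged loss; its magnitude no longer scales with N."""
    return tx.T.dot(sigmoid(tx.dot(w)) - y) / len(y)
```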
