
Lecture 6b KNN upper bound

Hi,

In the 6b_knn notes, page 9, we have a formula that gives an upper bound on the expectation of the loss on the training set. However, isn't KNN supposed to have a training loss of 0?
Also, I think that N^(-1/(d+1)) is not growing exponentially with d. Should we keep this formula?

Best regards,

Ali

Top comment

Hi Ali,

This is a very important point: the bound is not on the expectation of the loss on the training set, but on the expectation of the true loss of the KNN classifier, where the expectation is taken with respect to the random training set used to build the KNN classifier.
In other words, this bound tells you how well such a classifier performs, on average over the training sets you might use to build it.
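To make the distinction concrete, here is a minimal Python sketch (a made-up synthetic task, not from the notes): the training loss of 1-NN is 0 by construction, yet the true loss, averaged over random training sets, is not.

```python
# Illustrative only: labels are a noiseless threshold on the first coordinate,
# and the constants (training size, test size, number of repeats) are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def sample(n, d=2):
    """Draw n points uniformly in [-1, 1]^d; label = sign of first coordinate."""
    X = rng.uniform(-1, 1, size=(n, d))
    y = (X[:, 0] > 0).astype(int)
    return X, y

def knn1_predict(X_train, y_train, X_test):
    """1-NN prediction by brute-force nearest-neighbour search."""
    dists = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[dists.argmin(axis=1)]

true_losses = []
for _ in range(100):            # average over 100 random training sets
    X_tr, y_tr = sample(20)
    # Training loss: each training point is its own nearest neighbour -> always 0.
    assert (knn1_predict(X_tr, y_tr, X_tr) == y_tr).all()
    X_te, y_te = sample(2000)   # a large test set approximates the true loss
    true_losses.append((knn1_predict(X_tr, y_tr, X_te) != y_te).mean())

print("training loss: 0.0, estimated E[true loss]:", np.mean(true_losses))
```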

For your second point: if you double d, you will have to (roughly) square your number of samples N to keep the term N^(-1/(d+1)) constant. Equivalently, the number of samples needed to reach a fixed error grows exponentially with d. This is why we say that it is growing exponentially with d.
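As a quick sanity check of the scaling, assuming the bound's term is simply N^(-1/(d+1)) (any constant factor is dropped here for simplicity): requiring N^(-1/(d+1)) <= eps gives N >= eps^(-(d+1)), which is exponential in d.

```python
# How many samples N are needed so that N^(-1/(d+1)) <= eps?
# Solving gives N >= eps^(-(d+1)), i.e. exponential growth in d.
eps = 0.1
for d in (1, 2, 5, 10, 20):
    N_needed = eps ** -(d + 1)
    print(f"d={d:2d}: need N >= {N_needed:.3g} for N^(-1/(d+1)) <= {eps}")
```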

Best,
Nicolas

Thank you for your answer. It is much clearer now.

Ali

