Problem 18 k-nearest neighbor classifier exam 2018

Does anyone have ideas for how to do question 3 (value of k that minimizes the leave-one-out cross-validation error) and question 4 (sketch the 1-nearest-neighbor decision boundary for the dataset) in problem 18? I'm totally stuck. I would be very grateful if someone could share an approach.


For q3, you need to compute the leave-one-out prediction of the k-NN classifier for k = 1, 3, 5, ... (the problem says to take k odd) at every training point. E.g. for k = 3: for each training point, look at its 3 nearest neighbors among the *other* training points; if the majority label of those neighbors differs from the point's own label, that point is misclassified and adds 1 to the error count. The best k is the one with the fewest misclassified points.
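As a sanity check, the procedure above can be sketched in a few lines of Python. The dataset below is a made-up toy example (not the exam data); the logic is plain leave-one-out k-NN with Euclidean distance and majority vote.

```python
from collections import Counter

def knn_loocv_error(points, labels, k):
    """Leave-one-out CV error rate for a k-NN classifier (Euclidean distance)."""
    errors = 0
    n = len(points)
    for i in range(n):
        # Squared distances from the held-out point i to every other point
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(points[i], points[j])), labels[j])
            for j in range(n) if j != i
        )
        # Majority vote among the k nearest neighbors
        vote = Counter(lbl for _, lbl in dists[:k]).most_common(1)[0][0]
        if vote != labels[i]:
            errors += 1
    return errors / n

# Hypothetical toy dataset: two clusters, with one "B" outlier inside the "A" cluster
points = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5), (0.5, 0.5)]
labels = ["A", "A", "A", "B", "B", "B", "B"]

best_k = min((1, 3, 5), key=lambda k: knn_loocv_error(points, labels, k))
```

Here k = 1 does badly because the outlier misclassifies itself and its neighbors, while k = 3 smooths it out; on the actual exam data you just tabulate the error for each odd k the same way.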

For q4, take a look at:
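One useful fact for sketching the 1-NN boundary: the decision boundary is made of pieces of perpendicular bisectors between training points of different classes (the Voronoi edges that separate differently labeled cells). A minimal sketch, using a hypothetical two-point example where the boundary is just the bisector x = 1:

```python
def nn1_predict(x, y, points, labels):
    """Classify (x, y) by the label of its single nearest training point."""
    i = min(range(len(points)),
            key=lambda j: (x - points[j][0]) ** 2 + (y - points[j][1]) ** 2)
    return labels[i]

# Hypothetical example: with one "A" point and one "B" point, the 1-NN
# boundary is the perpendicular bisector of the segment joining them
# (here the vertical line x = 1).
points = [(0, 0), (2, 0)]
labels = ["A", "B"]
```

Evaluating `nn1_predict` on a grid of query points and marking where the label flips gives you the boundary to sketch by hand.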

