I can't seem to wrap my head around why in one case the answer is "linear regression can be made to work perfectly"
but in the other case it is "none of the above".
thank you for the clarification :)
Hi, I had the same problem, but I finally drew these conclusions:
I would agree if the first statement said "classification problem" instead of "linear regression problem" :/
Yes, I agree; that's why I didn't understand it at first either. But since y is defined this way, it can only be classification, I think. Besides, logistic regression contains the word "regression" yet is a classification algorithm, so... that's the best explanation I could find to convince myself haha
Well, that's actually why I thought linear regression wouldn't work: it returns continuous predictions for a binary output. Sometimes I don't really understand what the MCQs are asking. I feel there should have been an option like "linear regression isn't suited for these outputs" or something similar.
I went through this part of the course again (the classification lecture of week 5), and it's explained that we can indeed do classification with linear regression, but also that it is very much dependent on the data, even when the classes are well separated. So there are indeed cases where it works very well and cases where it doesn't, but the statement is still correct: it doesn't say "it always works", it says "we can make it work". I guess this is it.
Well, MCQs tend to be quite harsh in expecting very detailed knowledge. In this case we can run a linear regression and then quantize its output to turn it into a classifier, but that isn't what the MCQ explicitly states. I feel this could have been a true/false question asking "is the LPM a regression problem?", and we would have to answer false, it's a classification problem. I understand the reasoning when working backwards, of course, but on the exam I wouldn't have expected this to be their meaning. Good luck tomorrow everyone :)
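For anyone still puzzled, here is a minimal sketch of the "regress, then quantize" trick discussed above, on a toy 1-D example of my own (not from the lecture): fit y = a*x + b by ordinary least squares on 0/1 labels, then threshold the continuous prediction at 0.5. It also shows the data-dependence mentioned above: adding one extreme (but correctly labelled) point tilts the least-squares line enough to misclassify a point near the boundary.

```python
# Sketch of classification via linear regression (toy example, my own data).

def fit_linear(xs, ys):
    """Ordinary least squares fit y = a*x + b for a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx  # slope, intercept

def classify(xs, a, b, threshold=0.5):
    """Quantize the continuous regression output into 0/1 labels."""
    return [1 if a * x + b >= threshold else 0 for x in xs]

# Two well-separated classes: linear regression can be made to work perfectly.
xs = [0.0, 0.5, 1.0, 4.0, 4.5, 5.0]
ys = [0, 0, 0, 1, 1, 1]
a, b = fit_linear(xs, ys)
print(classify(xs, a, b))   # [0, 0, 0, 1, 1, 1] -- all correct

# Same classes plus one far-away (correctly labelled) class-1 point:
# the least-squares line flattens, and x = 4.0 now falls below the
# 0.5 threshold -- same separable classes, different data, wrong answer.
xs2, ys2 = xs + [100.0], ys + [1]
a2, b2 = fit_linear(xs2, ys2)
print(classify(xs, a2, b2))  # [0, 0, 0, 0, 1, 1] -- x = 4.0 misclassified
```

So the quiz statement and the lecture are consistent: on some datasets the thresholded fit is perfect, and a harmless-looking change to the data breaks it.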
good luck :)