Bayes vs training!
Hello!
I was wondering: can the training error be equal to the Bayes classifier's error rate without over-fitting?
If you know the prior distribution, yes. Otherwise, I don't think any method can be proven to achieve the same error probability, except perhaps in idealized circumstances. For example, with maximum likelihood, if the estimator is consistent, then with an infinite number of samples you converge to the true parameters, and hence to the Bayes classifier.
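A small simulation can illustrate this point. The setup below is an assumed toy example (not from the thread): two equally likely classes with unit-variance Gaussian class-conditionals at ±1, so the Bayes rule thresholds at 0 and its error is Φ(−1) ≈ 0.159. A maximum-likelihood plug-in rule with only two parameters (the class means) reaches roughly the Bayes error on the training set without over-fitting:

```python
import numpy as np
from math import erf

# Assumed toy setup: P(y=0) = P(y=1) = 1/2,
# x | y ~ N(2y - 1, 1), i.e. N(-1, 1) and N(+1, 1).
rng = np.random.default_rng(0)
n = 100_000
y = rng.integers(0, 2, n)          # labels in {0, 1}
x = rng.normal(2 * y - 1, 1.0)     # class-conditional Gaussian samples

# Bayes classifier: predict 1 iff x > 0; its error is Phi(-1).
bayes_err = 0.5 * (1 + erf(-1 / np.sqrt(2)))   # ~0.1587

# ML plug-in rule: estimate the two class means from the training
# data and classify each point by the nearest estimated mean,
# i.e. threshold at their midpoint.
mu0, mu1 = x[y == 0].mean(), x[y == 1].mean()
threshold = (mu0 + mu1) / 2
train_err = np.mean((x > threshold).astype(int) != y)

print(f"Bayes error:    {bayes_err:.4f}")
print(f"Training error: {train_err:.4f}")
```

With a large sample the two numbers agree to a few decimal places: the model has only two parameters, so matching the Bayes error here reflects consistency of the estimator, not memorization of the training set.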