
Extended features for every ML method

We saw that it is possible to use an augmented feature vector to fit the model better.
I was wondering whether it makes sense to use this augmentation with every ML method.
In particular, can we apply it with GD and SGD, and is it a common thing to do? Since computing the gradient is already expensive, adding more features seems intuitively wrong to me. In short, can we use an augmented feature vector with GD/SGD?

Hi,
Yes, you can use feature augmentation in the form of polynomial basis functions with GD/SGD. Feature processing is usually agnostic to the optimization algorithm.
As you noted, there is a trade-off between model complexity and run-time. However, you can try to find the sweet spot where the model is expressive enough while training/inference remains reasonably fast. You can also look into feature selection algorithms if you want to reduce the number of features. Also, note that in SGD the per-iteration gradient is cheaper to compute than in full-batch GD.
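To make this concrete, here is a minimal sketch of the idea: augment scalar inputs with a polynomial basis, then fit a linear model with plain per-sample SGD. The function names (`poly_features`, `sgd_fit`) and the toy data are illustrative, not from any specific library or assignment; the point is only that the augmentation step is independent of the optimizer.

```python
import numpy as np

def poly_features(x, degree):
    """Map each scalar x_n to the augmented vector [1, x_n, x_n^2, ..., x_n^degree]."""
    return np.vstack([x ** d for d in range(degree + 1)]).T

def sgd_fit(X, y, lr=0.01, epochs=200, seed=0):
    """Minimize mean squared error with one-sample-at-a-time SGD updates."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for n in rng.permutation(len(y)):
            err = X[n] @ w - y[n]
            # Gradient of (1/2) * err^2 with respect to w is err * X[n].
            w -= lr * err * X[n]
    return w

# Toy data: y is quadratic in x, so a degree-2 augmentation suffices.
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 + 2.0 * x + 3.0 * x ** 2

X = poly_features(x, degree=2)   # augmentation happens before optimization
w = sgd_fit(X, y)
```

Note that each SGD update costs O(D) where D = degree + 1, so a higher degree does make every step more expensive; that is exactly the complexity/run-time trade-off mentioned above.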
Best,
Semih

