Cross-validation for neural networks

Good morning,
I had a more general question on neural networks.
I was wondering how to perform cross-validation for a neural network, and how to cast what we've learned about cross-validation into that perspective. More specifically, do the numbers of hidden layers and hidden nodes need to be cross-validated? Is this specific to the type of neural network I want to use?
I am having some trouble identifying which hyperparameters to tune...
Thank you very much for your help!

Hi,
Some hyperparameters for deep neural networks:
1) Number of layers (depth of the network)
2) Number of hidden neurons (width of the network)
3) Activation function used
4) Connectivity between layers (for example fully connected, convolutional, or graph convolution)
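
As a minimal sketch, these hyperparameters can be written down as a search space in Python (the names and candidate values here are arbitrary examples, not recommendations):

# Illustrative search space over the hyperparameters listed above.
# The keys and candidate values are made up for the example.
search_space = {
    "n_layers": [1, 2, 3],            # 1) depth of the network
    "n_hidden": [32, 64, 128],        # 2) width of the hidden layers
    "activation": ["relu", "tanh"],   # 3) activation function
    "layer_type": ["dense", "conv"],  # 4) connectivity between layers
}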

Since training a neural network takes a fair amount of time, exhaustively cross-validating one is practical only when the problem is really small. Fortunately, not all hyperparameters matter equally: choosing the activation function, for example, is usually easy (a ReLU works well in almost all settings). In practice, you can cross-validate your network coordinate-wise: try different numbers of layers while fixing the number of hidden neurons, then find the optimal number of neurons while fixing the depth at a sensible value, and ignore hyperparameters such as the activation function. A much fancier version of this idea is Bayesian optimization (https://en.wikipedia.org/wiki/Bayesian_optimization).
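
To make the coordinate-wise strategy concrete, here is a minimal sketch using scikit-learn's MLPClassifier with 5-fold cross-validation on the built-in digits dataset; the fixed width of 64 and the candidate depths and widths are arbitrary illustrative choices:

from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Step 1: fix the width and cross-validate the depth.
fixed_width = 64
best_depth, best_score = None, -float("inf")
for depth in [1, 2, 3]:
    model = MLPClassifier(hidden_layer_sizes=(fixed_width,) * depth,
                          max_iter=500, random_state=0)
    score = cross_val_score(model, X, y, cv=5).mean()
    if score > best_score:
        best_depth, best_score = depth, score

# Step 2: fix the chosen depth and cross-validate the width.
best_width, best_score = None, -float("inf")
for width in [32, 64, 128]:
    model = MLPClassifier(hidden_layer_sizes=(width,) * best_depth,
                          max_iter=500, random_state=0)
    score = cross_val_score(model, X, y, cv=5).mean()
    if score > best_score:
        best_width, best_score = width, score

print(f"Selected depth={best_depth}, width={best_width} "
      f"(CV accuracy {best_score:.3f})")

Libraries such as Optuna and scikit-optimize automate this kind of search and implement the Bayesian-optimization approach linked above.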
