Question 11 exam 2020
Hello,
For question 11 it is stated that by flipping the signs of all the weights leading into and out of a hidden neuron, the input-output mapping represented by the network is unchanged. Is this because tanh(x) is odd, so if w1 leads into the hidden unit and w2 leads out of it, we have: -w2 * tanh(-w1 * x) = -w2 * (-tanh(w1 * x)) = w2 * tanh(w1 * x)?
Also, it is stated that interchanging the weights of two hidden neurons leaves the network unchanged, but this is true for all networks, right?
Thank you for the clarifications.
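As a quick sanity check of my reasoning, here is a small numerical sketch (my own, not from the exam) for a one-hidden-layer tanh network: it flips the signs of the weights into and out of one hidden unit, and also swaps two hidden units together with their weights, and verifies the output is unchanged in both cases.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer tanh network: y = w2 . tanh(W1 @ x)
W1 = rng.standard_normal((4, 3))   # input -> hidden weights
w2 = rng.standard_normal(4)        # hidden -> output weights
x = rng.standard_normal(3)

y = w2 @ np.tanh(W1 @ x)

# Sign-flip symmetry: negate all weights into and out of hidden unit 0.
# Since tanh is odd, -w2[0] * tanh(-W1[0] @ x) = w2[0] * tanh(W1[0] @ x).
W1_flip, w2_flip = W1.copy(), w2.copy()
W1_flip[0, :] *= -1
w2_flip[0] *= -1
y_flip = w2_flip @ np.tanh(W1_flip @ x)

# Interchange symmetry: swap hidden units 0 and 1 along with their weights.
perm = [1, 0, 2, 3]
y_perm = w2[perm] @ np.tanh(W1[perm] @ x)

print(np.allclose(y, y_flip))  # True
print(np.allclose(y, y_perm))  # True
```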