2020 Q10

Why can't L > L' and M > K be a solution if L = L', M = K is one? Can't you just arbitrarily add rows of zeros to the weights to increase the dimension and get the same result, and also add an extra layer whose weights are zero everywhere except on the diagonal, where they are 1? Wouldn't that lead to an MLP with the exact same behaviour as before, or am I missing something?
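The width-padding part of the argument can be checked numerically. Here is a minimal NumPy sketch (the network shape, the names, and the use of ReLU are illustrative assumptions, not taken from the exercise): a one-hidden-layer MLP of width K is embedded into one of width M > K by padding the weight matrices with zeros, and the outputs agree.

```python
import numpy as np

# Illustrative sketch of the zero-padding argument (dimensions and ReLU
# are assumptions, not from the original exercise).
rng = np.random.default_rng(0)
d_in, K, M, d_out = 3, 4, 6, 2  # M > K

W1 = rng.normal(size=(K, d_in)); b1 = rng.normal(size=K)
W2 = rng.normal(size=(d_out, K)); b2 = rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    # one hidden ReLU layer, linear output
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Pad the hidden layer from K to M units: the extra rows/columns are zero,
# so the new units output constant zero and contribute nothing downstream.
W1p = np.vstack([W1, np.zeros((M - K, d_in))])
b1p = np.concatenate([b1, np.zeros(M - K)])
W2p = np.hstack([W2, np.zeros((d_out, M - K))])

x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```

The extra units never receive or send a nonzero signal, which is exactly why the top comment argues they should not count as nodes at all.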

Top comment

About M > K: I guess that using extra nodes with weight = 0 does not really count. You probably need at least one incoming edge and one outgoing edge to count it as a node, i.e. there should be a path from the input to the output which includes this node.

About L > L': you want to apply the activation function the same number of times.
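The point about activation counts can be made concrete. A small sketch (the use of ReLU and the example values are my assumptions): even if the extra layer's weight matrix is the identity, the activation is applied one additional time, so any negative output of the original network gets changed.

```python
import numpy as np

# Inserting an identity-weight layer still adds one activation application.
# With ReLU, negative components of the original output are clipped, so the
# deeper network is no longer equivalent. (Values are illustrative.)
y = np.array([1.5, -0.7])                  # output of the original depth-L' net
y_extra = np.maximum(np.eye(2) @ y, 0.0)   # identity layer + one more ReLU
assert not np.allclose(y, y_extra)         # the -0.7 component is clipped to 0.0
```

This is why matching the depth L = L' matters even though the identity-layer trick looks harmless at first glance.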

Thanks, that makes sense!

However, I still think that with L = L' and M >= K it is possible. Nothing in the formulation of the question forbids nodes with no edges.

