### Final Exam - Problem 17

Hello! I don't quite understand the solution to this problem. From the diagram of the neural network, it looks like at the hidden-layer nodes the bias is added after the weight has been applied to the input, so I would write:
$$x_1^{(1)} = \sigma (w_1 x + b_1)$$ and $$x_2^{(1)} = \sigma (w_2 x + b_2)$$
Then, at the last step, when choosing values for $$w_1$$ and $$w_2$$, I would pick them very large so that the sigmoid transition is steep, and I would choose $$b_1 = -1/3$$ and $$b_2 = -2/3$$, so that $$b_1$$ and $$b_2$$ are independent of $$w_1$$ and $$w_2$$. Am I wrong in doing so?
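For reference, a quick numeric check (my own sketch, assuming NumPy; the value $$w_1 = 200$$ is purely illustrative) of where the sigmoid transition actually lands if the bias is held constant at $$-1/3$$ while the weight is made large:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w1 = 200.0        # large weight for a steep transition (illustrative value)
b1 = -1.0 / 3.0   # constant bias, chosen independently of w1

# sigmoid(w1*x + b1) crosses 0.5 where w1*x + b1 = 0, i.e. at x = -b1/w1.
x_transition = -b1 / w1
print(x_transition)   # 1/(3*200) = 0.0016666..., near 0, not at 1/3
```

So with a constant bias the transition point shrinks toward 0 as $$w_1$$ grows, which is why the bias has to scale with the weight.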

Hi,

what I understood from lecture08.b is that the transition of

$$\sigma(w(x-b))$$

happens at $$x = b$$. So to have a transition at

$$\frac{1}{3}$$

we need

$$b_1 = -\frac{1}{3}w_1$$
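The derivation above can be sanity-checked numerically (a sketch assuming NumPy; $$w_1 = 200$$ is an illustrative steepness):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w1 = 200.0
b1 = -(1.0 / 3.0) * w1   # b1 = -(1/3) * w1, as derived above

x = np.linspace(0.0, 1.0, 1001)
y = sigmoid(w1 * x + b1)

# Locate where the output crosses 0.5: it should be right at x = 1/3.
transition = x[np.argmin(np.abs(y - 0.5))]
print(transition)   # ≈ 0.333
```

With the bias scaled by the weight, the transition stays pinned at $$1/3$$ no matter how steep we make the sigmoid.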

@sepideh said:

> what I understood from lecture08.b is that the transition of
>
> $$\sigma(w(x-b))$$
>
> happens at $$x = b$$. So to have a transition at
>
> $$\frac{1}{3}$$
>
> we need
>
> $$b_1 = -\frac{1}{3}w_1$$

How do we get $$b_3$$ in this Problem 17?
