In file `3_net.py`, everything seems to run fine, but the weights `w_h` are not getting updated: they keep the same random values they were initialized with, across all training iterations.
In the first iteration (only a subset of the `w_h` and `w_o` values is printed):

Iteration: 1 Cost: 2.29556935362
w_h: [ 0.026779 -0.00800639 -0.01096162 -0.00354766 0.00137482] w_o: [-0.01249968 -0.00815679 0.01787239 -0.00080373 -0.00373115]

and in the 25th iteration:

Iteration: 25 Cost: 2.24883255277
w_h: [ 0.026779 -0.00800639 -0.01096162 -0.00354766 0.00137482] w_o: [-0.0131051 -0.00619329 0.01693202 -0.00053586 -0.00538656]

Note that `w_o` changes between iterations while `w_h` stays frozen at its initial values.
This does not appear to be a vanishing-gradient problem with the sigmoid activation either, because the same behavior occurred when tanh and ReLU activation units were used instead.
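Without seeing the rest of `3_net.py`, a likely cause of this symptom is that the gradient/update for `w_h` never gets wired into the training step (e.g. only `w_o`'s update is applied), rather than anything about the activation function. Below is a minimal NumPy sketch (toy data, invented hyperparameters, not the actual code from `3_net.py`) of a two-layer loop in which both weight matrices receive updates, so `w_h` demonstrably moves away from its initial values:

```python
import numpy as np

# Minimal two-layer network; the names w_h / w_o follow the issue, but the
# data, sizes, and learning rate here are made up for illustration.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 10))          # toy inputs
y = rng.integers(0, 3, size=64)            # toy labels, 3 classes
Y = np.eye(3)[y]                           # one-hot targets

w_h = rng.standard_normal((10, 5)) * 0.01  # input -> hidden
w_o = rng.standard_normal((5, 3)) * 0.01   # hidden -> output
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w_h_before = w_h.copy()
for _ in range(25):
    # forward pass
    h = sigmoid(X @ w_h)
    logits = h @ w_o
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)      # softmax probabilities

    # backward pass: BOTH layers must receive a gradient
    d_logits = (p - Y) / len(X)            # softmax + cross-entropy grad
    grad_w_o = h.T @ d_logits
    d_h = (d_logits @ w_o.T) * h * (1.0 - h)   # sigmoid derivative
    grad_w_h = X.T @ d_h

    # applying only the w_o update (and omitting the w_h line below)
    # reproduces the reported symptom: w_o moves, w_h stays frozen
    w_o -= lr * grad_w_o
    w_h -= lr * grad_w_h

assert not np.allclose(w_h, w_h_before)    # w_h has been updated
```

If the actual script builds its update rule from a list of parameters (as the Theano tutorials this file appears to come from do), it is worth checking that `w_h` is included in that parameter list and that the resulting updates are actually passed to the compiled training function.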