diff --git a/Exercise4/exercise4.ipynb b/Exercise4/exercise4.ipynb index d8ebee0c..a325f6de 100755 --- a/Exercise4/exercise4.ipynb +++ b/Exercise4/exercise4.ipynb @@ -663,7 +663,7 @@ "$$ \\delta^{(2)} = \\left( \\Theta^{(2)} \\right)^T \\delta^{(3)} * g'\\left(z^{(2)} \\right)$$\n", "Note that the symbol $*$ performs element-wise multiplication in `numpy`.\n", "\n", - "1. Accumulate the gradient from this example using the following formula. Note that you should skip or remove $\\delta_0^{(2)}$. In `numpy`, removing $\\delta_0^{(2)}$ corresponds to `delta_2 = delta_2[1:]`.\n", + "1. Accumulate the gradient from this example using the following formula. Note that you should skip or remove the first column of $\\Theta^{(2)}$. In `numpy`, removing the first column corresponds to `Theta2 = Theta2[:,1:]`.\n", "\n", "1. Obtain the (unregularized) gradient for the neural network cost function by multiplying the accumulated gradients by $\\frac{1}{m}$:\n", "$$ \\frac{\\partial}{\\partial \\Theta_{ij}^{(l)}} J(\\Theta) = D_{ij}^{(l)} = \\frac{1}{m} \\Delta_{ij}^{(l)}$$\n",
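
The corrected step above can be sketched in `numpy`. This is a minimal illustration, not the exercise's actual code: the layer sizes are made up, `sigmoid_gradient` stands in for $g'$, and `Theta2` is assumed to include a bias column as in the notebook. It shows why slicing off the first column of `Theta2` (rather than dropping $\delta_0^{(2)}$ afterward) yields a `delta_2` with one entry per non-bias hidden unit:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # Derivative of the sigmoid: g'(z) = g(z) * (1 - g(z))
    s = sigmoid(z)
    return s * (1.0 - s)

# Hypothetical sizes: 4 hidden units, 2 output units (for one training example).
rng = np.random.default_rng(0)
Theta2 = rng.standard_normal((2, 5))   # (2, 5): first column is the bias weights
delta_3 = rng.standard_normal(2)       # output-layer error, shape (2,)
z_2 = rng.standard_normal(4)           # hidden-layer pre-activations, shape (4,)

# Drop the bias column of Theta2 before back-propagating, so the result
# lines up element-wise with g'(z_2) over the 4 non-bias hidden units.
delta_2 = Theta2[:, 1:].T @ delta_3 * sigmoid_gradient(z_2)
assert delta_2.shape == (4,)
```

The equivalent alternative mentioned in the original text, computing `Theta2.T @ delta_3` with the full matrix and then discarding the first entry, gives the same values for the remaining components; slicing `Theta2` first simply avoids computing the unused bias term.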