Lecture 9 - Supervised Learning in ANN - (Part 2)
Dr. Qadri Hamarsheh
where the error gradient for neuron k in the output layer is

$$\delta_k(p) = y_k(p)\,[1 - y_k(p)]\,e_k(p), \qquad e_k(p) = y_{d,k}(p) - y_k(p)$$

o Calculate the error gradient for the neurons in the hidden layer:

$$\delta_j(p) = y_j(p)\,[1 - y_j(p)]\,\sum_{k=1}^{l} \delta_k(p)\,w_{jk}(p)$$

and the corresponding weight corrections:

$$\Delta w_{ij}(p) = \alpha \cdot x_i(p) \cdot \delta_j(p)$$
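These two formulas translate directly into code. Below is a minimal Python sketch, assuming the sigmoid activation used throughout the lecture; the helper names output_gradient and hidden_gradient are illustrative, not from the lecture:

```python
import math

def sigmoid(x):
    """Sigmoid activation function used throughout the lecture."""
    return 1.0 / (1.0 + math.exp(-x))

def output_gradient(y_k, e_k):
    """Error gradient of an output neuron: delta_k = y_k (1 - y_k) e_k."""
    return y_k * (1.0 - y_k) * e_k

def hidden_gradient(y_j, deltas, weights):
    """Error gradient of a hidden neuron:
    delta_j = y_j (1 - y_j) * sum_k delta_k * w_jk,
    where `deltas` and `weights` pair each downstream neuron k's
    gradient with the weight w_jk that feeds it."""
    return y_j * (1.0 - y_j) * sum(d * w for d, w in zip(deltas, weights))
```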
The initial weights and threshold levels are set randomly as follows:
w13 = 0.5, w14 = 0.9, w23 = 0.4, w24 = 1.0, w35 = -1.2, w45 = 1.1,
𝜽3 = 0.8, 𝜽4 = -0.1 and 𝜽5 = 0.3.
We consider a training set where inputs x1 and x2 are equal to 1 and
desired output yd,5 is 0.
The actual outputs of neurons 3 and 4 in the hidden layer are calculated as:

$$y_3 = \mathrm{sigmoid}(x_1 w_{13} + x_2 w_{23} - \theta_3) = \frac{1}{1 + e^{-(1 \cdot 0.5 + 1 \cdot 0.4 - 0.8)}} = 0.5250$$

$$y_4 = \mathrm{sigmoid}(x_1 w_{14} + x_2 w_{24} - \theta_4) = \frac{1}{1 + e^{-(1 \cdot 0.9 + 1 \cdot 1.0 + 0.1)}} = 0.8808$$

Now the actual output of neuron 5 in the output layer is determined as:

$$y_5 = \mathrm{sigmoid}(y_3 w_{35} + y_4 w_{45} - \theta_5) = \frac{1}{1 + e^{-(-0.6300 + 0.9689 - 0.3)}} = 0.5097$$

Thus, the following error is obtained:

$$e = y_{d,5} - y_5 = 0 - 0.5097 = -0.5097$$
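To make the arithmetic easy to check, here is a minimal Python sketch of this forward pass, using the lecture's initial weights; variable names such as theta3 are my own:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training example: inputs and desired output
x1, x2, yd5 = 1.0, 1.0, 0.0

# Initial weights and threshold levels from the lecture
w13, w14, w23, w24, w35, w45 = 0.5, 0.9, 0.4, 1.0, -1.2, 1.1
theta3, theta4, theta5 = 0.8, -0.1, 0.3

# Forward pass: hidden layer, then output layer
y3 = sigmoid(x1 * w13 + x2 * w23 - theta3)   # ≈ 0.5250
y4 = sigmoid(x1 * w14 + x2 * w24 - theta4)   # ≈ 0.8808
y5 = sigmoid(y3 * w35 + y4 * w45 - theta5)   # ≈ 0.5097
e = yd5 - y5                                 # ≈ -0.5097
print(y3, y4, y5, e)
```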
The next step is weight training. To update the weights and threshold
levels in our network, we propagate the error, e, from the output layer
backward to the input layer.
First, we calculate the error gradient for neuron 5 in the output layer:

$$\delta_5 = y_5\,(1 - y_5)\,e = 0.5097 \cdot (1 - 0.5097) \cdot (-0.5097) = -0.1274$$
Then we determine the weight corrections assuming that the learning rate parameter, 𝜶, is equal to 0.1:

$$\Delta w_{35} = \alpha \cdot y_3 \cdot \delta_5 = 0.1 \cdot 0.5250 \cdot (-0.1274) = -0.0067$$
$$\Delta w_{45} = \alpha \cdot y_4 \cdot \delta_5 = 0.1 \cdot 0.8808 \cdot (-0.1274) = -0.0112$$
$$\Delta \theta_5 = \alpha \cdot (-1) \cdot \delta_5 = 0.1 \cdot (-1) \cdot (-0.1274) = 0.0127$$
Next we calculate the error gradients for neurons 3 and 4 in the hidden layer:

$$\delta_3 = y_3\,(1 - y_3) \cdot \delta_5 \cdot w_{35} = 0.5250 \cdot (1 - 0.5250) \cdot (-0.1274) \cdot (-1.2) = 0.0381$$
$$\delta_4 = y_4\,(1 - y_4) \cdot \delta_5 \cdot w_{45} = 0.8808 \cdot (1 - 0.8808) \cdot (-0.1274) \cdot 1.1 = -0.0147$$

We then determine the weight corrections:

$$\Delta w_{13} = \alpha \cdot x_1 \cdot \delta_3 = 0.1 \cdot 1 \cdot 0.0381 = 0.0038$$
$$\Delta w_{23} = \alpha \cdot x_2 \cdot \delta_3 = 0.0038, \qquad \Delta \theta_3 = \alpha \cdot (-1) \cdot \delta_3 = -0.0038$$
$$\Delta w_{14} = \alpha \cdot x_1 \cdot \delta_4 = 0.1 \cdot 1 \cdot (-0.0147) = -0.0015$$
$$\Delta w_{24} = \alpha \cdot x_2 \cdot \delta_4 = -0.0015, \qquad \Delta \theta_4 = \alpha \cdot (-1) \cdot \delta_4 = 0.0015$$

At last, we update all weights and threshold levels:
w13 = 0.5 + 0.0038 = 0.5038, w14 = 0.9 − 0.0015 = 0.8985, w23 = 0.4 + 0.0038 = 0.4038, w24 = 1.0 − 0.0015 = 0.9985, w35 = −1.2 − 0.0067 = −1.2067, w45 = 1.1 − 0.0112 = 1.0888, 𝜽3 = 0.8 − 0.0038 = 0.7962, 𝜽4 = −0.1 + 0.0015 = −0.0985, 𝜽5 = 0.3 + 0.0127 = 0.3127.
The training process is repeated until the sum of squared errors is less
than 0.001.
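The whole backward pass can likewise be checked with a short Python sketch. It uses the lecture's rounded forward-pass values; variable names are my own:

```python
# Backward pass for the same training example; alpha is the learning rate
alpha = 0.1
x1, x2 = 1.0, 1.0
y3, y4, y5, e = 0.5250, 0.8808, 0.5097, -0.5097
w35, w45 = -1.2, 1.1

# Error gradient for output neuron 5
delta5 = y5 * (1 - y5) * e                   # ≈ -0.1274

# Corrections for the hidden-to-output weights and threshold
dw35 = alpha * y3 * delta5                   # ≈ -0.0067
dw45 = alpha * y4 * delta5                   # ≈ -0.0112
dtheta5 = alpha * (-1) * delta5              # ≈  0.0127

# Error gradients for hidden neurons 3 and 4
delta3 = y3 * (1 - y3) * delta5 * w35        # ≈  0.0381
delta4 = y4 * (1 - y4) * delta5 * w45        # ≈ -0.0147

# Corrections for the input-to-hidden weights and thresholds
dw13 = alpha * x1 * delta3                   # ≈  0.0038
dw23 = alpha * x2 * delta3                   # ≈  0.0038
dtheta3 = alpha * (-1) * delta3              # ≈ -0.0038
dw14 = alpha * x1 * delta4                   # ≈ -0.0015
dw24 = alpha * x2 * delta4                   # ≈ -0.0015
dtheta4 = alpha * (-1) * delta4              # ≈  0.0015
```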
Decision boundaries
Accelerated learning: learning in a multilayer network can be speeded up by adding a momentum term to the weight correction:

$$\Delta w_{jk}(p) = \beta \, \Delta w_{jk}(p-1) + \alpha \, y_j(p) \, \delta_k(p)$$

where 𝜷 is the momentum constant, 𝟎 ≤ 𝜷 < 𝟏; typically 𝜷 = 𝟎.𝟗𝟓. This equation is called the generalized delta rule.
Learning with momentum for the Exclusive-OR operation: 126 epochs.
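As a sketch, the generalized delta rule is a one-line update; the helper name momentum_correction and its default parameter values are illustrative, not from the lecture:

```python
def momentum_correction(prev_delta_w, y_j, delta_k, alpha=0.1, beta=0.95):
    """Generalized delta rule:
    delta_w(p) = beta * delta_w(p-1) + alpha * y_j(p) * delta_k(p).
    With beta = 0 this reduces to the plain delta rule."""
    return beta * prev_delta_w + alpha * y_j * delta_k

# Example: the first iteration has no previous correction (prev_delta_w = 0)
dw = momentum_correction(prev_delta_w=0.0, y_j=0.5250, delta_k=-0.1274)
```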
2) Biological neurons do not work backward to adjust their synaptic weights, so the backpropagation algorithm cannot emulate brain-like learning.