Artificial Neural Networks
[Figure: biological neuron, showing the soma (cell body) with its nucleus, the dendrites that receive incoming signals, the axon that carries the outgoing signal, and the synapses that connect it to other neurons.]
Neural Network
Information is stored and processed in a neural network simultaneously throughout the whole network, rather than at specific locations.
[Figure: the neuron as a simple computing element. Input signals (raw data or the outputs of other neurons) arrive through weighted links w1, w2, ..., wn; the neuron computes its activation level and emits a single output Y, which becomes the final solution or the input of other neurons.]

[Figure: a simple neural network with an input layer, a middle layer, and an output layer; input signals flow through the weighted connections and emerge as output signals.]
[Figure: diagram of the neuron: inputs x1, x2, ..., xn with weights w1, w2, ..., wn feed a single neuron producing the output Y.]

The neuron computes the weighted sum of its inputs,

$X = \sum_{i=1}^{n} x_i w_i$

and passes it through a hard-limit activation with threshold $\theta$:

$Y = +1$ if $X \geq \theta$; $Y = -1$ if $X < \theta$
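A minimal sketch of this computation in Python; the function name compute_output and the example values are illustrative, not from the source:

```python
# Weighted sum followed by a hard threshold, as in the neuron model above.
def compute_output(inputs, weights, threshold):
    """Return +1 if the weighted sum reaches the threshold, else -1."""
    X = sum(x * w for x, w in zip(inputs, weights))
    return +1 if X >= threshold else -1

# Example: two inputs, both weights 0.5, threshold 0.7.
print(compute_output([1, 1], [0.5, 0.5], 0.7))   # X = 1.0 >= 0.7 -> +1
print(compute_output([1, 0], [0.5, 0.5], 0.7))   # X = 0.5 <  0.7 -> -1
```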
[Figure: graphs of the step, sign, sigmoid and linear activation functions, each plotting the output Y (between -1 and +1) against the combined input X.]

$Y^{step} = 1$ if $X \geq 0$; $Y^{step} = 0$ if $X < 0$

$Y^{sign} = +1$ if $X \geq 0$; $Y^{sign} = -1$ if $X < 0$

$Y^{sigmoid} = \dfrac{1}{1 + e^{-X}}$

$Y^{linear} = X$
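As a sketch, the four functions map the combined input X to the output Y in Python (names are illustrative):

```python
import math

def step(X):
    return 1 if X >= 0 else 0      # binary output

def sign(X):
    return +1 if X >= 0 else -1    # bipolar output

def sigmoid(X):
    return 1.0 / (1.0 + math.exp(-X))   # smooth output in (0, 1)

def linear(X):
    return X                       # identity
```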
The Perceptron

[Figure: single-layer two-input perceptron: inputs x1 and x2, weighted by w1 and w2, feed a linear combiner followed by a hard limiter with threshold θ; the hard limiter produces the output Y.]

The perceptron separates its input space with the decision boundary

$\sum_{i=1}^{n} x_i w_i - \theta = 0$
[Figure: linear separability in the perceptrons. (a) Two-input perceptron: the line $x_1 w_1 + x_2 w_2 - \theta = 0$ separates Class A1 from Class A2 in the $(x_1, x_2)$ plane. (b) Three-input perceptron: the plane $x_1 w_1 + x_2 w_2 + x_3 w_3 - \theta = 0$ separates the classes in $(x_1, x_2, x_3)$ space.]
Logical OR (x1 OR x2 = Y):
0 OR 0 = 0
0 OR 1 = 1
1 OR 0 = 1
1 OR 1 = 1

Logical AND (x1 AND x2 = Y):
0 AND 0 = 0
0 AND 1 = 0
1 AND 0 = 0
1 AND 1 = 1
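Both operations are linearly separable, so a single perceptron with suitable weights computes each. A sketch; the OR values here are one working choice (an assumption), while the AND values match the final weights the training example below arrives at:

```python
# A perceptron with fixed weights implementing AND and OR.
def perceptron(inputs, weights, theta):
    X = sum(x * w for x, w in zip(inputs, weights))
    return 1 if X >= theta else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        y_and = perceptron([x1, x2], [0.1, 0.1], theta=0.2)
        y_or = perceptron([x1, x2], [0.1, 0.1], theta=0.1)
        print(x1, x2, y_and, y_or)
```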
The error is the difference between the desired and the actual output:

$e(p) = Y_d(p) - Y(p)$, where $p = 1, 2, 3, \ldots$

The actual output at iteration $p$ is

$Y(p) = \mathrm{step}\left[\sum_{i=1}^{n} x_i(p)\, w_i(p) - \theta\right]$

and the weights are corrected by the delta rule, where $\alpha$ is the learning rate:

$\Delta w_i(p) = \alpha \cdot x_i(p) \cdot e(p), \qquad w_i(p+1) = w_i(p) + \Delta w_i(p)$

Step 4: Iteration
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
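The rule above translates directly into code. A minimal sketch, assuming names like train_perceptron, with alpha as the learning rate and theta as the threshold; convergence is taken to mean an epoch with no classification errors:

```python
# Sketch of the perceptron training algorithm above.
def step(X):
    return 1 if X >= 0 else 0

def train_perceptron(samples, weights, theta, alpha, max_epochs=100):
    """samples: list of (inputs, desired_output) pairs."""
    for epoch in range(max_epochs):
        errors = 0
        for inputs, y_d in samples:
            # Step 2: compute the actual output Y(p)
            X = sum(x * w for x, w in zip(inputs, weights))
            y = step(X - theta)
            # Step 3: delta rule, w_i(p+1) = w_i(p) + alpha * x_i(p) * e(p)
            e = y_d - y
            weights = [w + alpha * x * e for w, x in zip(weights, inputs)]
            errors += abs(e)
        # Step 4: iterate until an epoch passes with no errors
        if errors == 0:
            return weights, epoch + 1
    return weights, max_epochs
```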
Example of perceptron learning: the logical operation AND (threshold θ = 0.2, learning rate α = 0.1).

Epoch | Inputs    | Desired   | Initial weights | Actual   | Error | Final weights
      | x1   x2   | output Yd | w1      w2      | output Y | e     | w1      w2
------+-----------+-----------+-----------------+----------+-------+---------------
  1   | 0    0    | 0         | 0.3    -0.1     | 0        |  0    | 0.3    -0.1
      | 0    1    | 0         | 0.3    -0.1     | 0        |  0    | 0.3    -0.1
      | 1    0    | 0         | 0.3    -0.1     | 1        | -1    | 0.2    -0.1
      | 1    1    | 1         | 0.2    -0.1     | 0        |  1    | 0.3     0.0
  2   | 0    0    | 0         | 0.3     0.0     | 0        |  0    | 0.3     0.0
      | 0    1    | 0         | 0.3     0.0     | 0        |  0    | 0.3     0.0
      | 1    0    | 0         | 0.3     0.0     | 1        | -1    | 0.2     0.0
      | 1    1    | 1         | 0.2     0.0     | 1        |  0    | 0.2     0.0
  3   | 0    0    | 0         | 0.2     0.0     | 0        |  0    | 0.2     0.0
      | 0    1    | 0         | 0.2     0.0     | 0        |  0    | 0.2     0.0
      | 1    0    | 0         | 0.2     0.0     | 1        | -1    | 0.1     0.0
      | 1    1    | 1         | 0.1     0.0     | 0        |  1    | 0.2     0.1
  4   | 0    0    | 0         | 0.2     0.1     | 0        |  0    | 0.2     0.1
      | 0    1    | 0         | 0.2     0.1     | 0        |  0    | 0.2     0.1
      | 1    0    | 0         | 0.2     0.1     | 1        | -1    | 0.1     0.1
      | 1    1    | 1         | 0.1     0.1     | 1        |  0    | 0.1     0.1
  5   | 0    0    | 0         | 0.1     0.1     | 0        |  0    | 0.1     0.1
      | 0    1    | 0         | 0.1     0.1     | 0        |  0    | 0.1     0.1
      | 1    0    | 0         | 0.1     0.1     | 0        |  0    | 0.1     0.1
      | 1    1    | 1         | 0.1     0.1     | 1        |  0    | 0.1     0.1

The final weights from one pattern p (each line) are used as the initial weights for the next pattern.
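Running the train_perceptron sketch above from the table's starting point reproduces this trace. Exact fractions are used so that boundary cases, where X equals the threshold exactly, come out the same as in the table (floating-point rounding can flip them):

```python
from fractions import Fraction as F

# Initial weights w1 = 0.3, w2 = -0.1; threshold 0.2; learning rate 0.1.
and_samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, epochs = train_perceptron(and_samples, [F(3, 10), F(-1, 10)],
                                   theta=F(2, 10), alpha=F(1, 10))
print([float(w) for w in weights], epochs)   # [0.1, 0.1] 5
```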
[Figure: two-dimensional plots of the basic logical operations in the (x1, x2) plane. (a) AND (x1 ∩ x2) and (b) OR (x1 ∪ x2) are linearly separable by a single line; (c) Exclusive-OR (x1 ⊕ x2) is not, which is why a single-layer perceptron cannot represent it.]
[Figure: multilayer perceptron with two hidden layers: input signals x1, x2, ..., xi, ..., xn enter the input layer, pass through the first and second hidden layers, and leave the output layer as y1, y2, ..., yk, ..., yl.]

[Figure: three-layer back-propagation network with n input neurons (index i), m hidden neurons (index j) and l output neurons (index k), connected by weights wij and wjk. Input signals propagate forward from the input layer to the output layer; error signals propagate backward from the output layer toward the input layer.]
Step 2: Activation
Activate the back-propagation neural network by applying inputs x1(p), x2(p), ..., xn(p) and desired outputs yd,1(p), yd,2(p), ..., yd,n(p).

(a) Calculate the actual outputs of the neurons in the hidden layer:

$y_j(p) = \mathrm{sigmoid}\left[\sum_{i=1}^{n} x_i(p)\, w_{ij}(p) - \theta_j\right]$

(b) Calculate the actual outputs of the neurons in the output layer:

$y_k(p) = \mathrm{sigmoid}\left[\sum_{j=1}^{m} x_{jk}(p)\, w_{jk}(p) - \theta_k\right]$

Step 3: Weight training
Calculate the error gradient for the neurons in the output layer,

$\delta_k(p) = y_k(p)\,[1 - y_k(p)]\, e_k(p)$, where $e_k(p) = y_{d,k}(p) - y_k(p)$

and for the neurons in the hidden layer,

$\delta_j(p) = y_j(p)\,[1 - y_j(p)] \sum_{k=1}^{l} \delta_k(p)\, w_{jk}(p)$

then correct the weights: $\Delta w_{jk}(p) = \alpha\, y_j(p)\, \delta_k(p)$ and $\Delta w_{ij}(p) = \alpha\, x_i(p)\, \delta_j(p)$.

Step 4: Iteration
Increase iteration p by one, go back to Step 2 and repeat the process until the selected error criterion is satisfied.
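A minimal sketch of this procedure for the 2-input, 2-hidden-neuron, 1-output architecture used in the XOR example below. The thresholds are trained like weights attached to a fixed input of -1; the function name train_xor, the learning-rate value and the random initialization are illustrative assumptions, not from the source:

```python
import math
import random

def sigmoid(X):
    return 1.0 / (1.0 + math.exp(-X))

def train_xor(samples, alpha=0.5, target_sse=0.001, max_epochs=100000):
    rnd = random.Random(1)                       # fixed seed for repeatability
    w_ih = [[rnd.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # w[i][j]
    w_ho = [rnd.uniform(-1, 1) for _ in range(2)]                      # w[j]
    th_h = [rnd.uniform(-1, 1) for _ in range(2)]                      # theta_j
    th_o = rnd.uniform(-1, 1)                                          # theta_k
    sse = 0.0
    for epoch in range(max_epochs):
        sse = 0.0
        for x, y_d in samples:
            # Step 2: forward pass through hidden and output layers
            y_h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(2)) - th_h[j])
                   for j in range(2)]
            y_o = sigmoid(sum(y_h[j] * w_ho[j] for j in range(2)) - th_o)
            # Step 3: error gradients, then weight/threshold corrections
            e = y_d - y_o
            sse += e * e
            d_o = y_o * (1.0 - y_o) * e
            d_h = [y_h[j] * (1.0 - y_h[j]) * d_o * w_ho[j] for j in range(2)]
            for j in range(2):
                w_ho[j] += alpha * y_h[j] * d_o
                th_h[j] += alpha * (-1.0) * d_h[j]
                for i in range(2):
                    w_ih[i][j] += alpha * x[i] * d_h[j]
            th_o += alpha * (-1.0) * d_o
        # Step 4: stop once the sum-squared error criterion is satisfied
        if sse < target_sse:
            break
    return epoch + 1, sse

# Training may take thousands of epochs and can occasionally stall in a
# local minimum; a different seed usually resolves that.
xor_samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
print(train_xor(xor_samples))
```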
As an example, we may consider the three-layer back-propagation network. Suppose that the network is required to perform the logical operation Exclusive-OR. Recall that a single-layer perceptron could not do this operation. Now we will apply the three-layer net.
[Figure: three-layer network for the Exclusive-OR operation. Inputs x1 and x2 (input layer, neurons 1 and 2) connect to hidden neurons 3 and 4 through weights w13, w23, w14 and w24; the hidden neurons connect to output neuron 5 through weights w35 and w45, producing the output y5. Neurons 3, 4 and 5 each have a threshold input θ.]
With initial weights w13 = 0.5, w14 = 0.9, w23 = 0.4, w24 = 1.0, w35 = -1.2, w45 = 1.1 and thresholds θ3 = 0.8, θ4 = -0.1, θ5 = 0.3, consider the training pattern x1 = x2 = 1 with desired output yd,5 = 0:

$y_3 = \mathrm{sigmoid}(x_1 w_{13} + x_2 w_{23} - \theta_3) = 1/\left[1 + e^{-(1 \cdot 0.5 + 1 \cdot 0.4 - 1 \cdot 0.8)}\right] = 0.5250$

$y_4 = \mathrm{sigmoid}(x_1 w_{14} + x_2 w_{24} - \theta_4) = 1/\left[1 + e^{-(1 \cdot 0.9 + 1 \cdot 1.0 + 1 \cdot 0.1)}\right] = 0.8808$

$y_5 = \mathrm{sigmoid}(y_3 w_{35} + y_4 w_{45} - \theta_5) = 1/\left[1 + e^{-(-0.5250 \cdot 1.2 + 0.8808 \cdot 1.1 - 1 \cdot 0.3)}\right] = 0.5097$

$e = y_{d,5} - y_5 = 0 - 0.5097 = -0.5097$
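A short check of this arithmetic, using the weight and threshold values as reconstructed above:

```python
import math

def sigmoid(X):
    return 1.0 / (1.0 + math.exp(-X))

x1, x2, yd5 = 1, 1, 0
y3 = sigmoid(x1 * 0.5 + x2 * 0.4 - 0.8)        # sigmoid(0.1) = 0.5250
y4 = sigmoid(x1 * 0.9 + x2 * 1.0 - (-0.1))     # sigmoid(2.0) = 0.8808
y5 = sigmoid(y3 * (-1.2) + y4 * 1.1 - 0.3)     # 0.5097
print(round(y3, 4), round(y4, 4), round(y5, 4), round(yd5 - y5, 4))
```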
[Figure: learning curve for the Exclusive-OR operation: sum-squared error on a logarithmic scale (from 10^0 down to 10^-4) versus training epoch (0 to about 200).]
Final results of the three-layer network learning the Exclusive-OR:

Inputs    | Desired   | Actual    | Error   | Sum of
x1   x2   | output yd | output y5 | e       | squared errors
1    1    | 0         | 0.0155    | -0.0155 | 0.0010
0    1    | 1         | 0.9849    |  0.0151 |
1    0    | 1         | 0.9849    |  0.0151 |
0    0    | 0         | 0.0175    | -0.0175 |
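A quick check of the reported sum of squared errors from the error column above:

```python
# Sum of squared errors over the four training patterns.
errors = [-0.0155, 0.0151, 0.0151, -0.0175]
print(round(sum(e * e for e in errors), 4))   # approximately 0.0010
```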
[Figure: the network solving the Exclusive-OR operation: inputs x1 and x2 feed hidden neurons 3 and 4, which feed output neuron 5 producing y5.]
Decision boundaries

[Figure: decision boundaries in the (x1, x2) plane. (a) The line x1 + x2 - 1.5 = 0; (b) the line x1 + x2 - 0.5 = 0; (c) the complete decision boundaries for the Exclusive-OR operation, formed by combining the two lines.]
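A small sketch showing how the two boundaries combine to compute Exclusive-OR. The hidden units implement the lines from panels (a) and (b); the output weights of -1 and +1 with threshold 0.5 are one consistent choice, assumed here rather than read from the figure:

```python
def step(X):
    return 1 if X >= 0 else 0

def xor_net(x1, x2):
    h3 = step(x1 + x2 - 1.5)   # fires only for (1, 1): boundary (a)
    h4 = step(x1 + x2 - 0.5)   # fires for any nonzero input: boundary (b)
    # Output weights -1 and +1 with threshold 0.5 are an assumed choice.
    return step(-1.0 * h3 + 1.0 * h4 - 0.5)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))   # 0, 1, 1, 0
```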