Week 8 - ANN
https://www.youtube.com/watch?v=b2ctEsGEpe0
https://www.youtube.com/watch?v=qPix_X-9t7E
• An Artificial Neural Network is a network of
interconnected artificial neurons, where each
neuron represents a computing unit.
• These neurons interact with each other and are
connected in various ways.
• Each node receives input, performs some
operations on it, and transmits the result onward.
• Input
It is the information coming from the outside world, or from another cell, into the artificial neural network.
• Weights
They represent the numerical values of the connections between cells, showing the importance of the information received by a cell
and its effect on that cell.
• Summation Function
It provides the calculation of the net input of that cell by multiplying the inputs to the cell with the weights and summing
them.
• Activation Function
It processes the net input to the cell and determines the output that the cell will produce in response to this input.
• Output
These are the output values determined by the activation function. The produced output can be sent to the outside
world, to another cell, or back to the cell itself as an input.
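The five components above (input, weights, summation function, activation function, output) can be put together as a single artificial neuron. A minimal Python sketch; the sigmoid activation and all numeric values are illustrative choices, not values from the slides:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: the summation function computes the net
    input, and a sigmoid activation produces the output."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias  # summation function
    return 1.0 / (1.0 + math.exp(-net))                       # activation function

# Three inputs, three connection weights, one bias (illustrative values)
out = neuron([1.0, 0.5, -1.0], [0.4, 0.6, 0.2], bias=0.1)
print(out)  # about 0.646 = sigmoid(0.6)
```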
Perceptron, The Keystone
• Perceptron is an algorithm
that mimics the biological
neuron.
• Its components: Inputs,
weights of inputs, sum of
weights function, activation
function, activation threshold
and output.
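The perceptron's components (inputs, weights, sum of weights, activation threshold, binary output) can be sketched directly; the AND-gate weights and threshold below are hand-picked for illustration:

```python
def perceptron(inputs, weights, threshold):
    """Perceptron: weighted sum of inputs compared to an activation
    threshold; the output is binary (0 or 1)."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Hand-picked weights/threshold that make this perceptron act as a logical AND
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", perceptron([a, b], [1.0, 1.0], threshold=1.5))
```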
THE PERCEPTRON : FORWARD PROPAGATION
ACTIVATION FUNCTION
y = Activation( Σ(w·x) + b )
The activation function is used here to control the y value, that is, to decide whether the neuron will be active or not.
Step Function: Produces a binary classification output (0 or 1) based on a threshold value.
Sigmoid Function: One of the most widely used activation functions; it produces output in the range [0, 1].
Tanh Function: A nonlinear function that produces output in the range [-1, 1].
ReLU Function: The Rectified Linear Unit (ReLU) is a nonlinear function. It takes the value 0 for
negative inputs and the value x for positive inputs.
Softplus Function: A smooth approximation of ReLU, f(x) = ln(1 + e^x).
ELU Function: The Exponential Linear Unit; like ReLU for positive inputs, but returns α(e^x − 1) for negative inputs.
PReLU Function: Parametric ReLU; like ReLU, but with a learnable slope for negative inputs.
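The activation functions listed above can be written down directly. A sketch in plain Python; the PReLU slope `a` would normally be learned during training, and is a fixed illustrative value here:

```python
import math

def step(x):     return 1.0 if x >= 0 else 0.0           # binary output, 0 or 1
def sigmoid(x):  return 1.0 / (1.0 + math.exp(-x))       # output in (0, 1)
def tanh_(x):    return math.tanh(x)                     # output in (-1, 1)
def relu(x):     return max(0.0, x)                      # 0 for x < 0, x for x >= 0
def softplus(x): return math.log(1.0 + math.exp(x))      # smooth approximation of ReLU
def elu(x, a=1.0):   return x if x >= 0 else a * (math.exp(x) - 1.0)
def prelu(x, a=0.1): return x if x >= 0 else a * x       # slope a is normally learned

print(sigmoid(0.0), relu(-2.0), prelu(-2.0))  # 0.5 0.0 -0.2
```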
ACTIVATION FUNCTION
Activation functions are functions that decide what a node's output should be, given the
node's inputs.
We often refer to a layer's outputs as "activations", since it is the activation function that decides the
actual output.
The step function, for example, returns 0 if the linear combination is less than 0 and returns 1 if the linear
combination is greater than or equal to 0.
Activation Functions
▪ In order for the y neuron to become active, the y-input value must reach a certain
threshold value through an activation function.
▪ So, y = f(y-input). The most widely used activation function is the S-shaped logistic
sigmoid function:
f(x) = 1 / (1 + e^(-x))
Another activation function, defined for all x values, is the hyperbolic tangent:
f(x) = (e^x − e^(-x)) / (e^x + e^(-x))
The hard transition (step) activation function:
f(x) = 1 if x ≥ 0, 0 if x < 0
Sigmoid Activation Function
f(x) = 1 / (1 + e^(-x))
[Plot: sigmoid curve for x in [-5, 5], rising from 0 toward 1.]
Hard Transition Activation Function (Step Function)
f(x) = 1 if x ≥ 0, 0 if x < 0
[Plot: step function for x in [-5, 5], jumping from 0 to 1 at the threshold.]
Hyperbolic Tangent Activation Function
f(x) = (e^x − e^(-x)) / (e^x + e^(-x))
[Plot: tanh curve for x in [-5, 5], rising from -1 toward 1.]
Activation Functions
WEIGHTS (W)
When input data arrives at a neuron, it is multiplied by a weight value assigned to that particular
input.
Weight usage:
These weights start out as random values. As the neural network learns more about what kind
of input data leads to, for example, a student being admitted to the university, it adjusts the weights
to correct the categorization errors caused by the previous weights. This process is how the neural
network is trained.
Let's relate the weight to m (slope) in the original linear equation.
Artificial Neural Network
𝑦 = 𝑥1 𝑤1 + 𝑥2 𝑤2 +𝑥3 𝑤3
ANN Model Structures
• Feed Forward
Cells are arranged in regular layers from input to output. Information coming into the
network passes through the input layer, then through the hidden layers, then through
the output layer, and finally goes out to the outside world.
• Feedback
The output of a cell is not only given as input to the layer that
follows it; it can also be given as an input to any cell in a
previous layer or in its own layer.
Feed Forward & Feedback ANNs
FEED FORWARD ANN
The weights w used in feed-forward artificial neural networks are corrected and
renewed at each step:
w1_new = w1_old + Δw1
Er = (1/2) e² = (1/2) (g − y)²
Δw = −η · ∂Er/∂w
where η is the learning coefficient, which takes a value in the range 0–1.
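A minimal sketch of this update rule for a single linear neuron y = w·x, where Er = ½(g − y)² gives ∂Er/∂w = (y − g)·x; the learning coefficient η and all numeric values are illustrative:

```python
def update(w, x, g, eta=0.1):
    """One application of w_new = w_old + Δw with Δw = -η · ∂Er/∂w."""
    y = w * x                   # neuron output for input x
    dEr_dw = (y - g) * x        # gradient of Er = ½(g - y)² w.r.t. w
    return w - eta * dEr_dw     # corrected, renewed weight

# Repeated corrections drive the output toward the target g (illustrative values)
w = 0.5
for _ in range(50):
    w = update(w, x=1.0, g=2.0)
print(round(w, 2))  # close to 2.0, where the error is zero
```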
Feed Forward ANN
Hidden Layer → Output Layer
Cross-section of the hidden and output layers in a neural network: hidden-layer
nodes j (net input S_j^hidden, output O_j^hidden) are connected by weights
w_ij^hidden to output-layer nodes i (net input S_i^output, output O_i^output).
O_i^output = f(S_i^output)
S_i^output = Σ_{j=1..n_hidden} w_ij^hidden · O_j^hidden
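The hidden-to-output computation can be sketched as follows; the sigmoid activation, the weight matrix, and the hidden-layer outputs are made-up illustrative values:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(O_hidden, W):
    """W[i][j] is the weight from hidden node j to output node i:
    S_i = Σ_j W[i][j] · O_hidden[j], then O_i = f(S_i)."""
    outputs = []
    for row in W:
        S_i = sum(w_ij * o_j for w_ij, o_j in zip(row, O_hidden))  # net input
        outputs.append(sigmoid(S_i))                               # activation
    return outputs

O_hidden = [0.6, 0.4, 0.9]                   # hidden-layer outputs (made up)
W = [[0.1, 0.5, -0.3], [0.8, -0.2, 0.4]]     # hidden-to-output weights (made up)
print(layer_forward(O_hidden, W))
```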
Feed Forward
Example
Feed Forward - Example
Same way:
Feed Forward
Calculating Error
Previously calculated
Feed Forward
Is calculated and:
Total Error
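Assuming the squared-error convention used earlier in these slides (Er = ½(g − y)² per output), the total error is the sum over all output neurons; the target and output values below are made up:

```python
# Target and actual outputs of two output neurons (made-up values)
targets = [0.01, 0.99]
outputs = [0.75, 0.77]

# Total error: sum of ½(g - y)² over all output neurons
total_error = sum(0.5 * (g - y) ** 2 for g, y in zip(targets, outputs))
print(total_error)  # about 0.298
```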
Backpropagation Algorithm - Error in Hidden Layers
The ideas of the algorithm can be summarized as
follows:
Identification of samples