ANN Introduction
– Synapse: the thin gap between the axon of one neuron and a dendrite of another.
• Signal exchange
• Synaptic strength/efficiency
Artificial neural networks are the human idealisation of real networks of biological neurons.
Source: https://towardsdatascience.com/introduction-to-neural-networks-advantages-and-applications-96851bd1a207
Advantages of ANN
Able to derive meaning from complicated or imprecise data, e.g. to extract patterns and detect trends that are too complex to be noticed by humans or by other computer techniques.
Since the inputs and outputs are external to the network, the model's parameters are the weights, the biases, and the activation function, and these DEFINE the model.
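The point above can be sketched as a single artificial neuron, where the weights, a bias, and a sigmoid activation function fully define the mapping from inputs to output (all numeric values below are hypothetical, chosen only for illustration):

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of the inputs plus a bias,
    passed through a sigmoid activation function."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-weighted_sum))

# Hypothetical values: two inputs, two weights, and a bias.
output = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(round(output, 4))
```

Changing any of the weights, the bias, or the activation function gives a different model for the same inputs, which is exactly why these parameters define the model.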
Processing Units in ANN
Types of units:
• Input units, which receive data from outside the neural network
• Output units, which send data out of the neural network
• Hidden units, whose input and output signals remain within the neural network.
A simple example of ANN application
A simple ANN architecture for the example
Network Architecture
The ANN has an input layer, one or more hidden layers, and an output layer. This architecture is called a Multi-Layer Perceptron (MLP).
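A minimal sketch of an MLP forward pass, assuming a hypothetical 2-3-1 layout (two inputs, three hidden units, one output unit) with made-up weights, not values from the slides:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights):
    """Compute one layer: each row of weights feeds one unit."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weights]

# Hypothetical 2-3-1 network: 2 inputs, 3 hidden units, 1 output unit.
hidden_weights = [[0.2, 0.8], [0.5, -0.4], [0.9, 0.1]]
output_weights = [[0.3, -0.6, 0.7]]

inputs = [1.0, 0.0]
hidden = layer(inputs, hidden_weights)   # hidden layer activations
output = layer(hidden, output_weights)   # final prediction
print(output)
```

Each layer applies the same rule (weighted sum, then activation); stacking layers is all that distinguishes the MLP from a single neuron.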
Activation function
Similarly, the hidden layer leads to the final prediction at
the output layer:
O3 = 1 / (1 + exp(-F1)), where F1 = W7*H1 + W8*H2
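This output-layer computation can be checked numerically; the hidden activations H1, H2 and the weights W7, W8 below are hypothetical values, not taken from the slides:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# Hypothetical hidden-layer activations and output-layer weights.
H1, H2 = 0.6, 0.8
W7, W8 = 0.3, 0.9

F1 = W7 * H1 + W8 * H2   # weighted sum feeding the output unit
O3 = sigmoid(F1)         # final prediction, as in the formula above
print(round(O3, 4))
```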
Weights
A good model with high accuracy gives predictions that are very close to the actual values: the values in column X should be very close to those in column W. The error in prediction is the difference between column W and column X.
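Computing that error column directly, with hypothetical numbers standing in for the slide's table:

```python
# Column W holds the actual values, column X the model's predictions
# (the numbers here are made up, not taken from the slides).
actual    = [1, 0, 1, 1]
predicted = [0.93, 0.08, 0.89, 0.97]

# Error in prediction: column W minus column X, row by row.
errors = [w - x for w, x in zip(actual, predicted)]
print(errors)
```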
Optimisation
The key to obtaining a good model with accurate predictions is to find the optimal values of the weights W that minimise the prediction error. This is achieved with the backpropagation algorithm, and it is what makes the ANN a learning algorithm: by learning from its errors, the model improves.
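As a simplified illustration of this idea, the sketch below trains a single sigmoid neuron by gradient descent on the squared prediction error (a one-unit stand-in for full backpropagation; the data, learning rate, and initial values are all hypothetical):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# Hypothetical training data for a single sigmoid neuron:
# input x=0 should give output 0, input x=1 should give output 1.
data = [(0.0, 0.0), (1.0, 1.0)]
w, b = 0.5, 0.0   # initial weight and bias
lr = 1.0          # learning rate

for epoch in range(5000):
    for x, target in data:
        o = sigmoid(w * x + b)        # forward pass
        error = o - target            # prediction error
        delta = error * o * (1 - o)   # chain rule through the sigmoid
        w -= lr * delta * x           # adjust weight to reduce the error
        b -= lr * delta               # adjust bias to reduce the error

print(sigmoid(b), sigmoid(w + b))
```

After training, the outputs move close to the targets 0 and 1: the weights were adjusted, step by step, in the direction that shrinks the error, which is the essence of what backpropagation does across all layers of an ANN.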