Presentation Group 4
SOBIA MANZOOR(528)
SOBIA AKMAL(547)
NEURAL NETWORK
• The human brain consists of about 100 billion neurons and roughly 100 trillion connections between them.
• A typical neuron looks like this:
SINGLE LAYER NETWORK.
• A single-layer neural network, also known as a perceptron, is the simplest type of artificial
neural network. It consists of just one layer of neurons (nodes) where each neuron takes
some input values, processes them, and produces an output. Here's a simplified breakdown:
• Inputs: The network receives several inputs, like numbers representing data.
• Weights: Each input is multiplied by a weight, which adjusts the importance of that input.
• Summation: All the weighted inputs are added together.
• Activation Function: The sum is passed through an activation function, which decides the
final output of the neuron (like yes/no or true/false).
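To tie these four steps together, the short Python sketch below runs one neuron on made-up inputs and weights; the numbers and the zero threshold are illustrative assumptions, not values from the slides.

```python
# Minimal single-neuron sketch following the four steps above.
# All numbers below are illustrative assumptions, not values from the slides.

inputs  = [1.0, 0.5, -2.0]   # Inputs: numbers representing the data
weights = [0.4, 0.6, 0.1]    # Weights: one per input, adjusting its importance

# Summation: multiply each input by its weight and add the results together.
weighted_sum = sum(w * x for w, x in zip(weights, inputs))   # 0.4 + 0.3 - 0.2 = 0.5

# Activation: a simple step function that decides the final yes/no output.
output = 1 if weighted_sum >= 0 else 0

print(weighted_sum, output)   # 0.5 1
```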
PERCEPTRON ALGORITHM
• Input Layer: multiple input nodes (features); each node represents a feature (e.g., x1, x2, x3).
• Weight Connections: each input node is connected to the output node; each connection has a weight (w1, w2, w3).
• Output Node (Neuron): receives the weighted sum of the inputs.
• Output: binary classification (0 or 1, yes or no).
• Summing Function: The summing function, also known as the net input function, calculates the weighted sum of the inputs. It takes the input values, multiplies them by the corresponding weights, and adds them together. The formula for the summing function is
• net_input = w1*x1 + w2*x2 + … + wn*xn
• where:
• net_input is the weighted sum of the inputs,
• w1, w2, …, wn are the weights,
• x1, x2, …, xn are the input values.
CONT….
• Activation Function: The activation function, also known as the transfer function, takes the output of the summing function and maps it to an output value between 0 and 1. The most common activation function used in the perceptron is the step (threshold) function, which outputs 1 when the net input is at or above the threshold and 0 otherwise.
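A rough sketch of the whole perceptron forward pass, combining the summing function above with a step activation; the function names, the zero threshold, and the example values are illustrative choices, not details from the slides.

```python
import numpy as np

def net_input(weights, inputs):
    """Summing function: net_input = w1*x1 + w2*x2 + ... + wn*xn."""
    return np.dot(weights, inputs)

def step_activation(value, threshold=0.0):
    """Step (threshold) activation: maps the weighted sum to 0 or 1."""
    return 1 if value >= threshold else 0

def perceptron_predict(weights, inputs):
    """Perceptron output: weighted sum followed by the activation function."""
    return step_activation(net_input(weights, inputs))

# Example with three features x1, x2, x3 and weights w1, w2, w3 (illustrative values).
w = np.array([0.5, -0.4, 0.2])
x = np.array([1.0, 2.0, 3.0])
print(perceptron_predict(w, x))   # net_input = 0.3 >= 0, so it prints 1
```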
MULTI LAYER NETWORK.
CONT…
• Input Layer: The input layer consists of input neurons, each representing a feature or
attribute of the input data. These neurons pass the input data to the hidden
layers for processing.
• Hidden Layers: Multilayer neural networks have one or more hidden layers positioned
between the input and output layers. Each hidden layer contains multiple artificial
neurons (also called nodes or units). Each of these neurons computes a weighted sum of its inputs and applies an activation
function, processing the information from the previous layer. The activation functions in
hidden layers are typically nonlinear, such as the sigmoid (logistic) function, ReLU
(Rectified Linear Unit), or others.
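To make the layer-by-layer flow concrete, here is a rough forward-pass sketch for a network with a single hidden layer; the layer sizes, random weights, and the ReLU/sigmoid pairing are illustrative assumptions.

```python
import numpy as np

def relu(z):
    """ReLU activation for the hidden layer."""
    return np.maximum(0.0, z)

def sigmoid(z):
    """Sigmoid (logistic) activation, used here at the output."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_hidden, b_hidden, W_out, b_out):
    """Forward pass: input layer -> hidden layer (ReLU) -> output layer (sigmoid)."""
    hidden = relu(W_hidden @ x + b_hidden)   # hidden layer: weighted sums + nonlinearity
    return sigmoid(W_out @ hidden + b_out)   # output layer: final prediction

# Illustrative sizes: 3 input features, 4 hidden neurons, 1 output neuron.
rng = np.random.default_rng(0)
x        = np.array([0.2, -1.0, 0.5])
W_hidden = rng.normal(size=(4, 3))
b_hidden = np.zeros(4)
W_out    = rng.normal(size=(1, 4))
b_out    = np.zeros(1)

print(forward(x, W_hidden, b_hidden, W_out, b_out))   # one value between 0 and 1
```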
CONT….
• Output Layer: The output layer produces the final predictions or results of the network.
The number of neurons in the output layer depends on the specific task. For example, in a
binary classification task, there may be two output neurons representing two classes. In a
regression task, there could be a single output neuron for a continuous prediction.
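As a small illustration of how the output layer is sized for different tasks; the scores and the softmax choice below are assumptions for the example, not specifics from the slides.

```python
import numpy as np

def softmax(z):
    """Turn raw class scores into probabilities that sum to 1."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Binary classification: two output neurons, one score per class (made-up scores).
class_scores = np.array([1.2, -0.3])
print(softmax(class_scores))          # e.g. [0.82 0.18] - probability per class

# Regression: a single output neuron producing one continuous prediction.
regression_output = np.array([42.7])  # made-up continuous value
print(regression_output)
```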
THANK YOU…