PRESENTATION # 4

SHEZA ASIF (510)
BUSHRA JABBAR (517)
MARYAM M YAR (523)
SADIA TABASSUM (542)
IFRAH KHAN (524)
SOBIA MANZOOR (528)
SOBIA AKMAL (547)
NEURAL NETWORK

• An artificial neural network (ANN) is an information processing paradigm that is
inspired by the biological nervous system.
• It is composed of a large number of highly interconnected processing elements known as
neurons.
• ANNs are used in applications we rely on every day, e.g. Google Translate and Google
Assistant.
• It is the study of making computers take sensible decisions and learn from ordinary
experience, as we do.
CONT..

• In an ANN we want to build machines that implement artificial neurons, analogous to the
neurons in the brain.
• Input → neural network (algorithm) → meaningful output, with learning and improvement
of the network over time.
CONT..

• The human brain consists of about 100 billion neurons and 100 trillion connections
between them.
• Here is what a typical neuron looks like:
SINGLE LAYER NETWORK.

• A single-layer neural network, also known as a perceptron, is the simplest type of artificial
neural network. It consists of just one layer of neurons (nodes) where each neuron takes
some input values, processes them, and produces an output. Here's a simplified breakdown (a short code sketch follows this list):
• Inputs: The network receives several inputs, like numbers representing data.
• Weights: Each input is multiplied by a weight, which adjusts the importance of that input.
• Summation: All the weighted inputs are added together.
• Activation Function: The sum is passed through an activation function, which decides the
final output of the neuron (like yes/no or true/false).
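• Putting those four steps together, here is a minimal Python sketch of a single-layer perceptron; the example inputs, weights, and bias below are illustrative assumptions, not values from the slides.

def perceptron_output(inputs, weights, bias):
    # Summation: weighted sum of the inputs plus a bias term
    net_input = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Activation: a simple step function decides the final yes/no output
    return 1 if net_input > 0 else 0

# Hypothetical example with three inputs
print(perceptron_output([1.0, 0.5, -0.2], [0.4, -0.6, 0.9], bias=0.1))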
PERCEPTRON ALGORITHM

• Input Layer: multiple input nodes (features); each node represents a feature (e.g., x1, x2, x3).
• Weight Connections: each input node is connected to the output node; each connection has a weight (w1, w2, w3).
• Output Node (Neuron): receives the weighted sum of the inputs.
• Output: binary classification (0 or 1, yes or no).
• Summing Function: The summing function, also known as the net input function, calculates the weighted sum of the inputs. It
takes the input values, multiplies them by the corresponding weights, and adds them together. The formula for the summing
function is (a worked numeric example follows this list):
• net_input = w1*x1 + w2*x2 + … + wn*xn
• where:
• net_input is the weighted sum of the inputs
• w1, w2, …, wn are the weights
• x1, x2, …, xn are the input values
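• A quick numeric check of the summing formula in Python; the input values and weights below are hypothetical, chosen only to illustrate the calculation.

weights = [0.2, 0.5, -0.3]   # w1, w2, w3
inputs = [1.0, 2.0, 3.0]     # x1, x2, x3
# net_input = w1*x1 + w2*x2 + ... + wn*xn
net_input = sum(w * x for w, x in zip(weights, inputs))
print(net_input)  # 0.2*1.0 + 0.5*2.0 + (-0.3)*3.0 ≈ 0.3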
CONT….

• Activation Function: The activation function, also known as the transfer function, takes
the output of the summing function and maps it to an output value between 0 and 1. The
most common activation functions used in the Perceptron are the step (threshold) function and the sigmoid function (see the sketch below).
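• As a rough illustration, here are both activation functions in Python; the threshold default of 0.0 is an assumed example value.

import math

def step(net_input, threshold=0.0):
    # Step (threshold) activation: 1 if the net input exceeds the threshold, else 0
    return 1 if net_input > threshold else 0

def sigmoid(net_input):
    # Sigmoid activation: squashes the net input to a value between 0 and 1
    return 1.0 / (1.0 + math.exp(-net_input))

print(step(0.3), sigmoid(0.3))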
MULTI LAYER NETWORK.
CONT…

• A multilayer neural network, also known as a feedforward neural network or a deep


neural network, is a type of artificial neural network with multiple layers of artificial
neurons, including at least one hidden layer between the input and output layers. These
networks are capable of learning and representing complex patterns and relationships in
data, making them suitable for a wide range of machine learning tasks, including image
recognition, natural language processing, and more.
CONT……

• Input Layer: The input layer consists of input neurons, each representing a feature or
attribute of the input data. These neurons pass the input data to the hidden
layers for processing.
• Hidden Layers: Multilayer neural networks have one or more hidden layers positioned
between the input and output layers. Each hidden layer contains multiple artificial
neurons (also called nodes or units). These neurons perform weighted sum and activation
functions, processing the information from the previous layer. The activation functions in
hidden layers are typically nonlinear, such as the sigmoid (logistic) function, ReLU
(Rectified Linear Unit), or others.
CONT….

• Output Layer: The output layer produces the final predictions or results of the network.
The number of neurons in the output layer depends on the specific task. For example, in a
binary classification task, there may be two output neurons representing two classes. In a
regression task, there could be a single output neuron for a continuous prediction.
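• To tie the input, hidden, and output layers together, here is a minimal NumPy forward-pass sketch with one hidden layer; the layer sizes, random weights, and choice of ReLU and sigmoid activations are illustrative assumptions, not values from the slides.

import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: 3 input features, 4 hidden neurons, 1 output neuron
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def relu(z):
    # Nonlinear activation used in the hidden layer
    return np.maximum(0.0, z)

def sigmoid(z):
    # Output activation producing a value between 0 and 1
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    hidden = relu(x @ W1 + b1)          # hidden layer: weighted sum + activation
    output = sigmoid(hidden @ W2 + b2)  # output layer: final prediction
    return output

print(forward(np.array([0.5, -1.0, 2.0])))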
THANK YOU…
