ML Video
Linear Regression:
● Hypothesis Function: hθ(x) = θᵀx = θ₀ + θ₁x₁ + … + θₙxₙ
● Objective: Fit the data by minimizing the error between predictions and actual values.
Cost Function:
● J(θ) = (1/2m) Σ (hθ(x^(i)) − y^(i))²: measures the squared error between predicted and actual values, averaged over the m training examples.
Gradient Descent:
Compute the partial derivative of the cost function with respect to each parameter and repeatedly step each parameter downhill to minimize the error: θj := θj − α · ∂J(θ)/∂θj.
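The update rule above can be sketched in NumPy; the function name, learning rate, and toy data are illustrative, not from the source:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for linear regression (illustrative sketch).
    X: (m, n) matrix whose first column is all ones; y: (m,) targets."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        error = X @ theta - y                  # h_theta(x) - y for every example
        theta -= (alpha / m) * (X.T @ error)   # simultaneous update of all theta_j
    return theta

# Fit y = 1 + 2x on toy data: theta should converge near [1, 2]
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = gradient_descent(X, y)
```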
Feature Scaling:
● Standardizes input features for faster convergence during optimization.
● Method: Mean Normalization: x := (x − μ) / s, where μ is the feature's mean and s its range (max − min) or standard deviation.
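Mean normalization can be sketched as follows (the helper name is illustrative; the range is used as the denominator here, with standard deviation being the other common choice):

```python
import numpy as np

def mean_normalize(X):
    """Scale each feature to roughly [-1, 1]: subtract its mean, divide by its range."""
    mu = X.mean(axis=0)
    rng = X.max(axis=0) - X.min(axis=0)
    return (X - mu) / rng, mu, rng

# Toy housing-style features: size in sq ft, number of bedrooms
X = np.array([[2104.0, 3.0], [1600.0, 2.0], [1416.0, 4.0]])
X_norm, mu, rng = mean_normalize(X)
```

After scaling, each feature has zero mean and a range of exactly 1, so gradient descent takes comparably sized steps in every direction.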
Learning Rate: α controls the size of each gradient descent step; too small and convergence is slow, too large and the cost may diverge.
Normal equation: θ = (XᵀX)⁻¹ Xᵀ y computes the optimal θ in closed form, with no iteration or learning rate needed.
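A minimal sketch of the normal equation in NumPy (the toy data is illustrative; `solve` is used instead of explicitly inverting XᵀX because it is numerically more stable):

```python
import numpy as np

# Normal equation: theta = (X^T X)^{-1} X^T y -- a closed-form solution
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # leading ones = intercept term
y = np.array([1.0, 3.0, 5.0])                        # exactly y = 1 + 2x
theta = np.linalg.solve(X.T @ X, X.T @ y)
```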
Logistic Regression:
Hypothesis:
● The hypothesis hθ(x) models the relationship between inputs and output.
● hθ(x) = 1/(1 + e^(−θᵀx)) (the sigmoid function, so the output always lies in (0, 1)).
Decision Boundary:
● Separates data classes based on the hypothesis.
● If hθ(x) >= 0.5, predict y = 1.
● If hθ(x) < 0.5, predict y = 0.
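The sigmoid hypothesis and the 0.5 threshold can be sketched as follows (the parameter vector is a made-up example defining the boundary x₁ + x₂ = 3):

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, x):
    """h_theta(x) = sigmoid(theta^T x); threshold at 0.5 gives the class label."""
    h = sigmoid(theta @ x)
    return 1 if h >= 0.5 else 0

# Hypothetical parameters: decision boundary is the line x1 + x2 = 3
theta = np.array([-3.0, 1.0, 1.0])  # first component multiplies the intercept x0 = 1
below = predict(theta, np.array([1.0, 1.0, 1.0]))  # point (1, 1), below the line
above = predict(theta, np.array([1.0, 2.0, 2.0]))  # point (2, 2), above the line
```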
One-vs-All Classification:
One-vs-All trains a separate binary classifier for each class.
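At prediction time, one-vs-all runs every per-class classifier and picks the most confident one. A sketch with made-up, already-trained parameter vectors:

```python
import numpy as np

def one_vs_all_predict(classifiers, x):
    """Score x with each class's binary (logistic) classifier and return the
    label whose classifier outputs the highest probability."""
    scores = {label: 1.0 / (1.0 + np.exp(-(theta @ x)))
              for label, theta in classifiers.items()}
    return max(scores, key=scores.get)

# Three hypothetical trained parameter vectors, one per class
classifiers = {
    0: np.array([1.0, -2.0]),
    1: np.array([0.0, 1.0]),
    2: np.array([-1.0, 3.0]),
}
label = one_vs_all_predict(classifiers, np.array([1.0, 2.0]))
```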
Octave:
A language that makes it easy to plot data and perform the matrix operations used in machine learning algorithms.
Neural Networks:
● Neural networks are models inspired by the human brain, designed to process inputs,
learn patterns, and make predictions.
● They are capable of learning both linear and non-linear relationships in data.
Non-Linear Hypothesis
Neural networks create non-linear hypotheses, enabling them to handle complex problems like
classifying patterns.
Model Representation:
● Receives raw features.
● Performs transformations using weights, biases, and activation functions.
● Produces predictions.
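The three steps above amount to forward propagation. A sketch with one hidden layer (the network size and random weights are illustrative, just to show the shapes involved):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Layer by layer: weighted sum plus bias, then a non-linear activation."""
    a1 = sigmoid(W1 @ x + b1)   # hidden-layer activations
    a2 = sigmoid(W2 @ a1 + b2)  # output prediction in (0, 1)
    return a2

# Hypothetical 2-input, 3-hidden-unit, 1-output network
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)
W2, b2 = rng.normal(size=(1, 3)), rng.normal(size=1)
out = forward(np.array([1.0, 0.0]), W1, b1, W2, b2)
```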
Examples and Intuitions:
Binary Classification: Spam vs. Not Spam.
Non-Linear Problems: Solving tasks like XOR, where linear models fail (AND and OR are linearly separable, but XOR is not).
Multi-Class Classification: Recognizing handwritten digits
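The XOR intuition can be made concrete with hand-set weights, in the spirit of the classic construction that combines an AND unit and a NOR unit (the specific weight values here are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xor_net(x1, x2):
    """One hidden layer computes XOR, which no single linear unit can."""
    h1 = sigmoid(20 * x1 + 20 * x2 - 30)     # approximately AND(x1, x2)
    h2 = sigmoid(-20 * x1 - 20 * x2 + 10)    # approximately NOR(x1, x2)
    return sigmoid(-20 * h1 - 20 * h2 + 10)  # NOT(OR(h1, h2)) = XOR(x1, x2)
```

Rounding the output reproduces the XOR truth table: 0, 1, 1, 0 for the inputs (0,0), (0,1), (1,0), (1,1).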
Multi-Class Classification:
Multi-class classification is the task of predicting one label from three or more possible classes.
Cost Function
The cost function measures how well the neural network's predictions match the actual labels.
● Binary Classification: Measures the error for outputs in [0,1].
● Multi-Class Classification: Sums the binary cost over all K output units, one per class.
Backpropagation Algorithm:
Backpropagation calculates the gradients of the cost function with respect to every weight and bias:
● Perform forward propagation to compute predictions.
● Calculate errors at the output layer.
● Propagate the error backward through the layers, adjusting weights and biases.
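The three steps above can be sketched for a single training example and one hidden layer (network size, learning rate, and toy data are all illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, y, W1, b1, W2, b2, alpha=0.5):
    """One forward + backward pass for a single example (illustrative sketch)."""
    # 1. Forward propagation to compute the prediction
    a1 = sigmoid(W1 @ x + b1)
    a2 = sigmoid(W2 @ a1 + b2)
    # 2. Error at the output layer (cross-entropy loss with a sigmoid output)
    delta2 = a2 - y
    # 3. Propagate the error backward through the hidden layer...
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)
    # ...adjusting weights and biases by their gradients
    W2 -= alpha * np.outer(delta2, a1)
    b2 -= alpha * delta2
    W1 -= alpha * np.outer(delta1, x)
    b1 -= alpha * delta1
    return W1, b1, W2, b2

# Hypothetical tiny network: 2 inputs, 3 hidden units, 1 output
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
x, y = np.array([1.0, 0.0]), np.array([1.0])
for _ in range(50):
    W1, b1, W2, b2 = backprop_step(x, y, W1, b1, W2, b2)
pred = sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)[0]
```

After a few dozen steps the prediction for this example moves close to its target of 1.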
Gradient checking
Gradient checking ensures the correctness of backpropagation by comparing analytically
computed gradients to numerical approximations.
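The numerical side of that comparison is a two-sided finite difference. A sketch, checked here against a function whose analytic gradient is known (the quadratic is a made-up stand-in for a network's cost function):

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Approximate dJ/dtheta_i as (J(theta + eps*e_i) - J(theta - eps*e_i)) / (2*eps),
    perturbing one component at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad

# For J(theta) = sum(theta^2), the analytic gradient is 2*theta
theta = np.array([1.0, -2.0, 0.5])
num = numerical_gradient(lambda t: np.sum(t ** 2), theta)
analytic = 2 * theta
```

In practice the check is run once on a small network and then turned off, since it is far too slow to use during training.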
Example:
Neural network used for lane detection, object recognition, and path planning in autonomous
vehicles.
Model Selection:
● Training Set:60%
● Cross validation set:20%
● Test Set: 20%
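The 60/20/20 split above can be sketched as follows (the function name and seed are illustrative; shuffling before splitting avoids any ordering bias in the data):

```python
import numpy as np

def split_dataset(X, y, seed=0):
    """Shuffle, then split 60/20/20 into training / cross-validation / test sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    m = len(X)
    tr, cv = int(0.6 * m), int(0.8 * m)
    train, val, test = idx[:tr], idx[tr:cv], idx[cv:]
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

X, y = np.arange(20).reshape(10, 2), np.arange(10)
(train, val, test) = split_dataset(X, y)
```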
Bias (Underfit): both training and cross-validation errors are high; the model is too simple.
Variance (Overfit): training error is low but cross-validation error is high; the model is too complex.
Regularization: adds a penalty on large parameter values, trading a little bias for reduced variance.
Learning Curves:
If a learning algorithm is suffering from high bias, getting more training data will not help much.
Error Metrics:
F1 = 2 * (P * R) / (P + R), where P is precision and R is recall.
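A worked example of the F1 formula (the counts are made up; F1 is preferred over raw accuracy on skewed classes):

```python
def f1_score(tp, fp, fn):
    """F1 is the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)  # of the positive predictions, how many were right
    recall = tp / (tp + fn)     # of the actual positives, how many were found
    return 2 * precision * recall / (precision + recall)

# 80 true positives, 20 false positives, 20 false negatives:
# precision = recall = 0.8, so F1 = 0.8
score = f1_score(80, 20, 20)
```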
Support Vector Machines (SVM):
● SVM tries to find a hyperplane that maximizes the margin between classes.
● The points closest to the hyperplane are called support vectors.
Clustering:
K-Means Algorithm: Alternates between assigning points to the nearest centroid and
updating centroids.
Elbow Method: look for the point where adding more clusters doesn't significantly reduce the within-cluster variance.
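The alternating assign/update loop can be sketched as follows (initialization from random data points and the two-blob toy data are illustrative choices):

```python
import numpy as np

def kmeans(X, k, iters=10, seed=0):
    """Minimal K-means sketch: alternate the assignment and update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # start from k data points
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated blobs around (0, 0) and (10, 10)
X = np.array([[0.0, 0.0], [0.1, 0.2], [-0.1, 0.1],
              [10.0, 10.0], [10.2, 9.9], [9.9, 10.1]])
centroids, labels = kmeans(X, 2)
```

On this data the two clusters are recovered regardless of which points seed the centroids, since each update step pulls a centroid toward the blob that claimed it.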