Accuracy Assessment and Confusion Matrix
Module1_3
Regression and Evaluation Metrics
Topics to be covered
Confusion Matrix
F1 Score
• F1-Score is the harmonic mean of the precision and recall values for a classification problem. The formula for F1-Score is as follows:
F1 = 2 × (Precision × Recall) / (Precision + Recall)
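A minimal sketch of this relationship, assuming scikit-learn and using made-up labels that are not from the slides:

# Sketch: F1 as the harmonic mean of precision and recall
# (y_true and y_pred are illustrative values only).
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

precision = precision_score(y_true, y_pred)
recall = recall_score(y_true, y_pred)

# Harmonic mean of precision and recall
f1_manual = 2 * precision * recall / (precision + recall)

# Matches scikit-learn's built-in F1-Score
print(f1_manual, f1_score(y_true, y_pred))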
Area Under the ROC curve (AUC – ROC)
• The biggest advantage of the ROC curve is that it is independent of changes in the proportion of responders, i.e. the class distribution.
• Let's first understand what the ROC (Receiver Operating Characteristic) curve is. For a probabilistic model, the confusion matrix, and therefore the value of every metric derived from it, depends on the classification threshold we choose, as the sketch below illustrates.
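A small sketch of this threshold dependence, assuming scikit-learn and using illustrative scores that are not from the slides:

# Sketch: the confusion matrix of a probabilistic model changes with the threshold.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.45, 0.7, 0.2, 0.55])  # predicted probabilities

for threshold in (0.3, 0.5, 0.7):
    y_pred = (scores >= threshold).astype(int)   # binarize at this threshold
    print("threshold =", threshold)
    print(confusion_matrix(y_true, y_pred))      # different matrix for each threshold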
Area Under the ROC curve (AUC – ROC)
The ROC curve is a plot of sensitivity against (1 - specificity). (1 - specificity) is also known as the false positive rate, and sensitivity is also known as the true positive rate. Following is the ROC curve for the case in hand; a sketch of how such a curve can be produced is shown below.
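A minimal sketch of plotting the ROC curve and computing the AUC, assuming scikit-learn and matplotlib and using the same illustrative scores as above:

# Sketch: ROC curve (TPR vs. FPR) and AUC for illustrative scores.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.45, 0.7, 0.2, 0.55])

fpr, tpr, thresholds = roc_curve(y_true, scores)   # FPR = 1 - specificity, TPR = sensitivity
auc = roc_auc_score(y_true, scores)

plt.plot(fpr, tpr, label=f"AUC = {auc:.2f}")
plt.plot([0, 1], [0, 1], linestyle="--")           # chance line
plt.xlabel("False Positive Rate (1 - specificity)")
plt.ylabel("True Positive Rate (sensitivity)")
plt.legend()
plt.show()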
Polynomial Regression
• A regression equation is a polynomial regression equation if the power of the independent variable is greater than 1, for example y = a + b·x + c·x².
• In this regression technique, the best-fit line is not a straight line; it is a curve that fits the data points, as in the sketch below.
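A minimal sketch of fitting a degree-2 polynomial regression, assuming scikit-learn and using synthetic data created only for illustration:

# Sketch: degree-2 polynomial regression (a curve, not a straight line).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 1.5 * X.ravel() ** 2 - 2.0 * X.ravel() + rng.normal(scale=1.0, size=50)  # synthetic data

# Expand x into [x, x^2], then fit ordinary least squares on the expanded features
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print(model.predict([[1.0]]))   # prediction from the fitted curve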
Underfitting/Overfitting