Compute The Mean For Each Class


1. Compute the mean for each class:


For Class 1:
\mu_1 = \left( \frac{1+2+3}{3}, \frac{2+3+4}{3} \right) = (2, 3)
For Class 2:
\mu_2 = \left( \frac{5+6+7}{3}, \frac{6+7+8}{3} \right) = (6, 7)
2. Compute the overall mean:
To compute the overall mean, we take the mean of the class means.
\text{Overall mean} = \left( \frac{2+6}{2}, \frac{3+7}{2} \right) = (4, 5)
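As a quick check, these two steps can be reproduced in R (a minimal sketch; it uses the same class1 and class2 matrices that appear later in this conversation):

# Class data as used throughout this conversation
class1 <- matrix(c(1, 2, 2, 3, 3, 4), ncol = 2, byrow = TRUE)
class2 <- matrix(c(5, 6, 6, 7, 7, 8), ncol = 2, byrow = TRUE)

mu1 <- colMeans(class1)   # class 1 mean: c(2, 3)
mu2 <- colMeans(class2)   # class 2 mean: c(6, 7)
mu  <- (mu1 + mu2) / 2    # overall mean c(4, 5); a simple average of the
                          # class means works because both classes have 3 samples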
3. Compute the difference between each class mean and the overall mean:
For Class 1:
\text{Difference}_1 = \mu_1 - \text{Overall mean} = (2-4, 3-5) = (-2, -2)
For Class 2:
\text{Difference}_2 = \mu_2 - \text{Overall mean} = (6-4, 7-5) = (2, 2)
4. Compute the between-class scatter matrix:
The between-class scatter matrix is the sum of the outer product of each difference vector with
itself, weighted by the number of samples in each class.
For Class 1:
\text{Difference}_1 \cdot \text{Difference}_1^T = \begin{pmatrix} -2 \\ -2 \end{pmatrix} \begin{pmatrix} -2 & -2 \end{pmatrix} = \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix}
Since there are 3 samples in Class 1, the contribution of Class 1 to S_B is
3 \times \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix} = \begin{pmatrix} 12 & 12 \\ 12 & 12 \end{pmatrix}
For Class 2:
\text{Difference}_2 \cdot \text{Difference}_2^T = \begin{pmatrix} 2 \\ 2 \end{pmatrix} \begin{pmatrix} 2 & 2 \end{pmatrix} = \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix}
Since there are 3 samples in Class 2, the contribution of Class 2 to S_B is
3 \times \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix} = \begin{pmatrix} 12 & 12 \\ 12 & 12 \end{pmatrix}
Finally, sum these contributions:
S_B = \begin{pmatrix} 12 & 12 \\ 12 & 12 \end{pmatrix} + \begin{pmatrix} 12 & 12 \\ 12 & 12 \end{pmatrix} = \begin{pmatrix} 24 & 24 \\ 24 & 24 \end{pmatrix}

So, the between-class scatter matrix S_B for class1 and class2 is:

S_B = \begin{pmatrix} 24 & 24 \\ 24 & 24 \end{pmatrix}
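Continuing the R sketch from step 2, steps 3 and 4 can be checked numerically (again a sketch, not part of the original hand computation):

d1 <- mu1 - mu                        # c(-2, -2)
d2 <- mu2 - mu                        # c( 2,  2)

# Weight each outer product by the number of samples in its class
SB <- nrow(class1) * outer(d1, d1) +
      nrow(class2) * outer(d2, d2)
SB                                    # 2 x 2 matrix with every entry 24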

You
how to compute within class scatter matrix for class1 <- matrix(c(1, 2, 2, 3, 3, 4), ncol = 2, byrow = TRUE) class2 <- matrix(c(5, 6, 6, 7, 7, 8), ncol = 2, byrow = TRUE) manually without r code
ChatGPT
To compute the within-class scatter matrix manually, we need to find the spread of the data
within each class. Here's how to do it step by step for the given classes class1 and class2:

1. Compute the mean for each class:


For Class 1:
\mu_1 = \left( \frac{1+2+3}{3}, \frac{2+3+4}{3} \right) = (2, 3)
For Class 2:
\mu_2 = \left( \frac{5+6+7}{3}, \frac{6+7+8}{3} \right) = (6, 7)
2. Compute the scatter matrix for each class:
For Class 1:
S_1 = \sum_{i=1}^{3} \left( x_i^{(1)} - \mu_1 \right) \left( x_i^{(1)} - \mu_1 \right)^T
For each sample in Class 1:
- Sample 1: (1, 2)
\text{Difference} = (1-2, 2-3) = (-1, -1)
\text{Outer product} = \begin{pmatrix} -1 \\ -1 \end{pmatrix} \begin{pmatrix} -1 & -1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}
- Sample 2: (2, 3)
\text{Difference} = (2-2, 3-3) = (0, 0)
\text{Outer product} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \begin{pmatrix} 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}
- Sample 3: (3, 4)
\text{Difference} = (3-2, 4-3) = (1, 1)
\text{Outer product} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}
Add the outer products:
S_1 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} + \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix}
For Class 2, the deviations from \mu_2 = (6, 7) are the same: (-1, -1), (0, 0), and (1, 1). The identical computation therefore gives S_2 = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix}.
3. Sum the scatter matrices:
S_W = S_1 + S_2 = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} + \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} = \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix}

So, the within-class scatter matrix S_W for class1 and class2 is:

S_W = \begin{pmatrix} 4 & 4 \\ 4 & 4 \end{pmatrix}
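The same numeric check works here (a minimal R sketch using the matrices from the question; scatter() is just an illustrative helper name):

# Sum of outer products of row deviations from the class mean
scatter <- function(x) {
  centered <- sweep(x, 2, colMeans(x))   # subtract the class mean from each row
  crossprod(centered)                    # t(centered) %*% centered
}

S1 <- scatter(class1)   # every entry 2
S2 <- scatter(class2)   # every entry 2
SW <- S1 + S2           # every entry 4, matching the result above

Note that this toy S_W is singular (each class's points lie exactly on a line), so the S_W^{-1} needed by Fisher's discriminant does not exist here; real, noisy data would normally give an invertible S_W.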

Creating a PowerPoint presentation on the Fisher Linear Discriminant Function sounds like a great
idea. Here's a suggested outline for your presentation:

Slide 1: Title Slide

- Title: Understanding Fisher Linear Discriminant Function

- Subtitle: A Powerful Tool for Classification

- Your Name/Institution

- Date

Slide 2: Introduction to Classification

- Definition of Classification

- Importance of Classification in Data Analysis

Slide 3: Overview of Fisher Linear Discriminant Function (FLD)

- What is Fisher Linear Discriminant Function?

- Brief History and Background

- Purpose: Discriminant Analysis for Classification

Slide 4: Understanding Linear Discriminant Analysis (LDA)

- Relationship between FLD and LDA

- Applications of LDA in Data Science

Slide 5: Mathematical Foundation

- Formulation of FLD

- Objective Function: Maximizing Between-Class Variance and Minimizing Within-Class Variance
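If a formula is wanted on this slide, the objective function can be stated explicitly (this is the standard Fisher criterion, written with the S_B and S_W computed earlier in this conversation):

J(w) = \frac{w^T S_B w}{w^T S_W w}

FLD chooses the projection direction w that maximizes J(w); for two classes this gives w \propto S_W^{-1}(\mu_1 - \mu_2).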

Slide 6: Assumptions of FLD

- Assumptions in FLD

- Implications of Assumptions

Slide 7: Steps in Fisher Linear Discriminant Analysis

1. Data Preprocessing

2. Computation of Class Means and Scatter Matrices

3. Solving Eigenvalue Problem

4. Selecting Discriminant Functions

5. Classification using FLD
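These five steps could be backed by speaker notes with a short R sketch (the data below are made up purely for illustration and the variable names are hypothetical; the within-class scatter must be invertible for the solve() call):

# Hypothetical 2-D, two-class data
set.seed(1)
c1 <- matrix(rnorm(20, mean = 0), ncol = 2)   # class 1: 10 samples
c2 <- matrix(rnorm(20, mean = 3), ncol = 2)   # class 2: 10 samples

# Step 2: class means and scatter matrices
mu1 <- colMeans(c1)
mu2 <- colMeans(c2)
mu  <- colMeans(rbind(c1, c2))
scatter <- function(x, m) crossprod(sweep(x, 2, m))  # sum of outer products
Sw <- scatter(c1, mu1) + scatter(c2, mu2)            # within-class scatter
Sb <- nrow(c1) * outer(mu1 - mu, mu1 - mu) +
      nrow(c2) * outer(mu2 - mu, mu2 - mu)           # between-class scatter

# Steps 3-4: the leading eigenvector of Sw^{-1} Sb is the discriminant
# direction (Re() drops negligible imaginary parts from eigen())
w <- Re(eigen(solve(Sw) %*% Sb)$vectors[, 1])

# Step 5: assign each point to the class whose projected mean is nearest
proj <- as.numeric(rbind(c1, c2) %*% w)
pred <- ifelse(abs(proj - sum(mu1 * w)) < abs(proj - sum(mu2 * w)), 1, 2)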

Slide 8: Example Application

- Illustrative Example: Classification of Iris Flowers

- Input Features and Classes

- Visualization of Data

- Results and Performance Metrics
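For this slide, the iris example could be demonstrated with lda() from the MASS package (a minimal sketch; it assumes MASS is installed and reports only in-sample accuracy):

library(MASS)   # provides lda()

fit  <- lda(Species ~ ., data = iris)            # fit LDA on all four features
pred <- predict(fit)$class                       # in-sample predicted species

table(Predicted = pred, Actual = iris$Species)   # confusion matrix for the slide
mean(pred == iris$Species)                       # training accuracy

For the performance-metrics bullet, cross-validated accuracy (see Slide 12) would be a fairer number than the training accuracy shown here.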

Slide 9: Advantages of Fisher Linear Discriminant Function

- High Computational Efficiency

- Dimensionality Reduction

- Robustness to Noise

Slide 10: Limitations and Challenges

- Sensitivity to Outliers

- Assumption of Gaussian Distribution

- Curse of Dimensionality

Slide 11: Comparison with Other Classification Techniques

- FLD vs. Logistic Regression

- FLD vs. Support Vector Machines

- FLD vs. Decision Trees

Slide 12: Practical Tips and Best Practices

- Feature Selection and Dimensionality Reduction

- Handling Imbalanced Classes

- Cross-Validation and Model Evaluation

Slide 13: Conclusion

- Summary of Key Points

- Importance of FLD in Classification Tasks

Slide 14: References

- List of References and Further Reading

Slide 15: Q&A

- Open Floor for Questions and Discussion
