CS8082U4L01 - K-Nearest Neighbour Learning
• Understand the need for machine learning in solving various problems
• Study the various supervised, semi-supervised and unsupervised learning
algorithms in machine learning
Unit Outcomes
Prerequisite
The diagram on the right side of the figure shows the shape of the decision
surface induced by 1-NN over the entire instance space.
The decision surface is a combination of convex polyhedra surrounding each
of the training examples.
For each training example, the polyhedron indicates the set of query points
whose classification will be completely determined by that training example.
Query points outside the polyhedron are closer to some other
training example.
This kind of diagram is often called the Voronoi diagram of the set of
training examples.
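The 1-NN rule described above can be sketched in a few lines: each query is assigned the label of its single closest training example, which is exactly the behaviour that carves the instance space into the Voronoi cells shown in the figure. The function name and the tiny data set below are made up for illustration.

```python
import numpy as np

def nearest_neighbor_classify(query, examples, labels):
    """Return the label of the training example closest to the query.

    Every query point falling inside a training example's Voronoi cell
    is classified by that example alone, as in the figure.
    """
    dists = np.linalg.norm(examples - query, axis=1)  # Euclidean distances
    return labels[int(np.argmin(dists))]

# Tiny illustrative training set (hypothetical values).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
y = ["a", "b", "a"]
print(nearest_neighbor_classify(np.array([0.9, 0.8]), X, y))  # nearest point is [1, 1] -> "b"
```

Any query closer to [1.0, 1.0] than to the other two points lands in that example's polyhedron and receives its label, regardless of how the rest of the training set votes.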
k-Nearest Neighbor Learning
The only disadvantage of considering all examples is that our classifier will run
more slowly.
If all training examples are considered when classifying a new query instance,
we call the algorithm a global method.
If only the nearest training examples are considered, we call it a local method.
When the above rule is applied as a global method, using all training examples,
it is known as Shepard's method.
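The global (Shepard's) variant above can be sketched as follows: every training example casts a vote for its label, weighted by the inverse square of its distance to the query, so that nearby examples dominate but all examples contribute. This is a minimal sketch under the usual 1/d² weighting; the function name and data are illustrative, and an exact match with a training point is returned directly to avoid division by zero.

```python
import numpy as np
from collections import defaultdict

def shepard_classify(query, examples, labels):
    """Distance-weighted vote over ALL training examples (a global method).

    Each example votes for its label with weight 1 / d^2. If the query
    coincides with a training point, that point's label is returned
    directly (its weight would otherwise be infinite).
    """
    votes = defaultdict(float)
    for x, label in zip(examples, labels):
        d = np.linalg.norm(x - query)
        if d == 0.0:
            return label  # exact match dominates the vote
        votes[label] += 1.0 / d**2
    return max(votes, key=votes.get)

# Tiny illustrative training set (hypothetical values).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
y = ["a", "b", "a"]
print(shepard_classify(np.array([0.1, 0.1]), X, y))  # close to [0, 0] -> "a"
```

Restricting the loop to the k nearest examples instead of all of them would turn this into the corresponding local method.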
2. Locally weighted regression
TEXT BOOKS:
1. Tom M. Mitchell, "Machine Learning", McGraw-Hill Education (India) Private Limited, 2013.
REFERENCES: