HiveBrain v1.2.0

When should I move beyond k nearest neighbour?

Submitted by: @import:stackexchange-cs
Tags: nearest, move, neighbour, should, when, beyond

Problem

For many machine learning projects that we do, we start with the k nearest neighbour classifier. It is an ideal starting classifier, as we usually have sufficient time to calculate all distances and the number of parameters is limited (k, the distance metric, and the weighting).

However, this often has the effect that we stick with the kNN classifier, because later in the project there is no room to switch to another one. What would be good reasons to try a new classifier? Obvious ones are memory and time constraints, but are there cases where another classifier can actually improve the accuracy?
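For reference, the baseline described above fits in a few lines. This is a minimal sketch of such a classifier (not code from the original question); the function name and the inverse-distance weighting scheme are illustrative choices for the three knobs mentioned: k, the distance metric, and the weighting.

```python
import math
from collections import Counter

def knn_predict(X_train, y_train, x, k=3, weighted=False):
    """Classify x by a vote among its k nearest training points."""
    # Euclidean distance to every training point: the O(n) scan that
    # makes kNN trivial to "train" but costly to query.
    nearest = sorted(
        (math.dist(p, x), label) for p, label in zip(X_train, y_train)
    )[:k]
    votes = Counter()
    for d, label in nearest:
        # Inverse-distance weighting lets closer neighbours count more;
        # the small epsilon guards against division by zero.
        votes[label] += 1.0 / (d + 1e-9) if weighted else 1.0
    return votes.most_common(1)[0][0]

X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (5.0, 5.0), (5.0, 6.0), (6.0, 5.0)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (0.5, 0.5), k=3))  # "a": all 3 nearest are class a
print(knn_predict(X, y, (5.5, 5.5), k=3))  # "b"
```

Swapping the distance function or the weighting rule changes the classifier without touching anything else, which is part of why kNN is such a convenient starting point.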

Solution

kNN is useful for large data samples.

However, its disadvantages are:

  • Biased by the value of k.
  • Computational complexity.
  • Memory limitations.
  • Being a supervised lazy learning algorithm.
  • Easily fooled by irrelevant attributes.
  • Prediction accuracy can quickly degrade as the number of attributes increases.
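The bias from the value of k is easy to see on a toy dataset. In this hypothetical sketch (not part of the original answer), the same query point flips class depending on k: with k=1 the lone nearby point wins, while with k=3 the larger cluster beyond it outvotes it.

```python
from collections import Counter

def knn_vote(X, y, q, k):
    """Plain unweighted kNN vote over 1-D points (absolute distance)."""
    nearest = sorted(zip((abs(p - q) for p in X), y))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# One "a" point sits alone near the query; a cluster of "b" lies just beyond.
X = [0.0, 1.0, 1.1, 1.2]
y = ["a", "b", "b", "b"]

print(knn_vote(X, y, 0.4, k=1))  # "a": the single closest point wins
print(knn_vote(X, y, 0.4, k=3))  # "b": the larger cluster outvotes it
```

Neither answer is wrong per se, which is the point: k is a genuine modelling decision, and a poor choice silently biases every prediction.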
It is usually only effective when the training data is large. Training itself is very fast, since the lazy algorithm merely stores the data, but that cost is paid back at prediction time.

Context

StackExchange Computer Science Q#21977, answer score: 3
