# Manhattan Distance in KNN (Super Simple Guide)
## Table of Contents
- What is KNN?
- Why Distance Matters
- What is Manhattan Distance?
- Formula
- Step-by-Step Example
- Manhattan Distance in KNN
- Code Example
- CLI Output
- When to Use It
- Key Takeaways
- Final Thought
## What is KNN?
K-Nearest Neighbors (KNN) is a simple machine learning algorithm. To classify a new point, it looks at the k closest training points and takes a majority vote:
If most nearby points belong to Class A → the new point is also Class A.
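Here is a minimal sketch of that majority vote in Python (the neighbour classes are made up for illustration):

```python
from collections import Counter

# Classes of the 3 nearest neighbours of some new point (made-up values)
neighbour_classes = ["A", "A", "B"]

# The most common class among the neighbours wins
prediction = Counter(neighbour_classes).most_common(1)[0][0]
print(prediction)  # -> "A"
```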
## Why Distance Matters
KNN is driven entirely by distance: it has to decide which training points count as "nearest".
Different distance metrics can rank the neighbors differently, and therefore produce different predictions, as the sketch below shows.
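As a quick illustration (the two training points here are made up so that the metrics disagree), the same query point gets a different class depending on the metric:

```python
from sklearn.neighbors import KNeighborsClassifier

# A = (3, 0) is closer to the origin under Manhattan distance (3 vs 4);
# B = (2, 2) is closer under Euclidean distance (~2.83 vs 3).
X = [[3, 0], [2, 2]]
y = [1, 2]

for metric in ("manhattan", "euclidean"):
    model = KNeighborsClassifier(n_neighbors=1, metric=metric)
    model.fit(X, y)
    print(metric, model.predict([[0, 0]]))  # manhattan -> [1], euclidean -> [2]
```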
## What is Manhattan Distance?
Manhattan Distance measures how far apart two points are when you can only move horizontally or vertically.
No diagonal movement is allowed.
Think of moving in a city:
- Go left/right
- Go up/down
- No shortcuts
## Formula
Distance = sum of absolute differences, one term per coordinate:

Distance = |x2 - x1| + |y2 - y1|

The absolute value removes negative signs, so the distance is always non-negative.
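The same formula extends to any number of coordinates. Here is a minimal sketch as a Python function (the name `manhattan_distance` is just for illustration):

```python
def manhattan_distance(p, q):
    """Sum of absolute differences, one term per coordinate."""
    return sum(abs(a - b) for a, b in zip(p, q))
```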
## Step-by-Step Example
Point A = (3, 5) and Point B = (1, 9):
- |3 - 1| = 2
- |5 - 9| = 4
- Total = 2 + 4 = 6
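You can verify this arithmetic in plain Python:

```python
a, b = (3, 5), (1, 9)
print(abs(a[0] - b[0]) + abs(a[1] - b[1]))  # |3-1| + |5-9| = 2 + 4 = 6
```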
## Manhattan Distance in KNN
Now let’s use this inside KNN.
New point C = (2, 7). Training points:
- X = (1, 5) → Class 1
- Y = (3, 8) → Class 2
- Z = (4, 6) → Class 1

Manhattan distances from C:
- C → X = |2 - 1| + |7 - 5| = 3
- C → Y = |2 - 3| + |7 - 8| = 2
- C → Z = |2 - 4| + |7 - 6| = 3
Closest point = Y (distance 2).
If k = 2, take the 2 closest points and let the majority class win. (With an even k a tie is possible, which is why odd values of k are usually preferred; see the sketch below.)
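Here is a small from-scratch sketch of this walkthrough (the variable names are mine):

```python
from collections import Counter

def manhattan(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

# Training points X, Y, Z with their classes, plus the new point C
points = [((1, 5), 1), ((3, 8), 2), ((4, 6), 1)]
c = (2, 7)

# Sort neighbours by distance to C
dists = sorted((manhattan(coords, c), label) for coords, label in points)
print(dists)  # [(2, 2), (3, 1), (3, 1)] -- Y is closest

# k = 2: take the two closest and vote (note the 1-1 tie with even k)
k = 2
votes = Counter(label for _, label in dists[:k])
print(votes)  # Counter({2: 1, 1: 1})
```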
## Code Example
```python
from sklearn.neighbors import KNeighborsClassifier

# Training points X, Y, Z and their classes
X = [[1, 5], [3, 8], [4, 6]]
y = [1, 2, 1]

# KNN with k = 2 and Manhattan distance
model = KNeighborsClassifier(n_neighbors=2, metric='manhattan')
model.fit(X, y)

print(model.predict([[2, 7]]))  # classify the new point C
```
## CLI Output

```
[1]
```

Prediction = Class 1. (With k = 2 the vote here is actually tied, one vote per class; scikit-learn breaks such ties in favor of the lower class label, which is why Class 1 is returned.)
## When to Use Manhattan Distance
- Grid-based movement (maps, city routes)
- When diagonal movement doesn’t make sense
- High-dimensional data, where Manhattan distance often behaves better than Euclidean because squaring exaggerates large coordinate differences
## Key Takeaways

- KNN classifies a new point by a majority vote among its k nearest neighbors.
- Manhattan Distance is the sum of absolute coordinate differences: no diagonal shortcuts.
- The distance metric you choose can change which neighbors count as "nearest", and therefore the prediction.
- Prefer odd values of k to avoid tied votes.
## Final Thought
Manhattan Distance is simple but powerful. It teaches an important idea: “How you measure distance changes your result.”