In k-means or kNN, we use Euclidean distance to calculate the distance between nearest neighbours. Why not Manhattan distance?


#### Best Answer

KNN is not tied to Euclidean distance; it is generic, and you can use any valid metric you want. For example, cosine distance is another metric that is used frequently. The `scikit-learn` implementation lets you choose among several built-in distance options via its `metric` parameter, and you can also define your own metric to use.
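To make the "pluggable metric" idea concrete, here is a minimal stdlib-only kNN classifier sketch; the function names (`knn_predict`, `euclidean`, `manhattan`) are illustrative choices for this answer, not scikit-learn's API:

```python
import math
from collections import Counter

def euclidean(a, b):
    # L2 distance between two points given as equal-length tuples
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    # L1 (city-block) distance
    return sum(abs(x - y) for x, y in zip(a, b))

def knn_predict(train, labels, query, k=3, metric=euclidean):
    """Classify `query` by majority vote among its k nearest training points.

    Any callable taking two points and returning a number works as `metric`,
    which is the point: kNN only needs pairwise distances, never a mean.
    """
    nearest = sorted(range(len(train)), key=lambda i: metric(train[i], query))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```

Swapping `metric=euclidean` for `metric=manhattan` (or any custom callable) changes nothing else in the algorithm, which is why kNN accepts arbitrary metrics so easily.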

K-means is slightly different. It genuinely relies on Euclidean distance (the cluster mean minimizes the sum of squared Euclidean distances), and the problem becomes harder for generic metrics. There are variations, though: k-medians uses the L1 (Manhattan) distance, updating each center to the component-wise median, and k-medoids generalizes further by choosing each cluster center among the data points themselves, which lets you use any metric with it.
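To sketch why medoids make arbitrary metrics workable: because centers are restricted to actual data points, the update step only ever needs pairwise distances, never a mean. Below is a minimal stdlib-only Voronoi-iteration k-medoids; the function name, the naive deterministic initialization, and the fixed iteration cap are illustrative assumptions, not a production implementation (real ones use smarter initialization and swap-based updates such as PAM):

```python
def manhattan(a, b):
    # L1 distance; any other metric callable would work identically
    return sum(abs(x - y) for x, y in zip(a, b))

def k_medoids(points, k, metric, n_iter=20):
    """Voronoi-iteration k-medoids: centers are always actual data points,
    so only pairwise distances are needed and any metric can be plugged in."""
    medoids = list(points[:k])  # naive deterministic initialization (assumption)
    for _ in range(n_iter):
        # assignment step: attach each point to its nearest medoid
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: metric(p, medoids[j]))
            clusters[j].append(p)
        # update step: within each cluster, pick the point minimizing the
        # total distance to the rest of the cluster (no mean is computed)
        new_medoids = []
        for j, cluster in enumerate(clusters):
            if not cluster:
                new_medoids.append(medoids[j])
                continue
            new_medoids.append(min(cluster, key=lambda c: sum(metric(c, p) for p in cluster)))
        if new_medoids == medoids:
            break
        medoids = new_medoids
    return medoids
```

Contrast this with k-means, whose update step computes an arithmetic mean, an operation that is only the right minimizer for squared Euclidean distance.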
