I am running k-NN on a large dataset of around 168,000 records, using the built-in k-NN implementation in Python to predict continuous values. Of the 168,000 records, 70% were used for training and 30% for testing. With Euclidean distance as the metric, k-NN finishes in about 2 minutes, whereas with wminkowski as the distance metric it takes around 52 minutes.
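For reference, here is roughly how I am setting this up. This is a minimal sketch with placeholder data, feature count, and weights; I am assuming scikit-learn's KNeighborsRegressor, which is where the wminkowski metric name comes from:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Placeholder data with the question's proportions: ~168,000 rows, 70/30 split.
rng = np.random.default_rng(0)
n_features = 10                         # assumed; my real feature count differs
X_train = rng.random((117_600, n_features))
y_train = rng.random(117_600)
X_test = rng.random((50_400, n_features))

w = np.ones(n_features)                 # per-feature weights (placeholders)

# Fast case: plain Euclidean can use KD-tree / BLAS-accelerated brute force.
knn_euclid = KNeighborsRegressor(n_neighbors=5, metric="euclidean")

# Slow case: wminkowski cannot use those specialized fast paths and falls
# back to a generic per-pair distance computation. ('wminkowski' is the older
# scikit-learn/SciPy spelling; newer versions take metric="minkowski" with
# metric_params={"w": ...} instead.)
knn_wmink = KNeighborsRegressor(
    n_neighbors=5, metric="wminkowski", p=2, metric_params={"w": w}
)

for model in (knn_euclid, knn_wmink):
    model.fit(X_train, y_train)
    model.predict(X_test)               # the timing gap shows up here
```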
Can anyone please explain the reason behind such a vast time difference? Is there any way to reduce the time taken when using wminkowski?
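One idea I had (please correct me if the math is wrong): the old wminkowski convention multiplies each coordinate difference by its weight before raising it to the power p, so for p = 2 it should be identical to plain Euclidean distance on features pre-scaled by the weights, which would keep the fast Euclidean path. A sketch with placeholder data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Old wminkowski convention: d(x, y) = (sum_i |w_i * (x_i - y_i)|^p)^(1/p).
# For p = 2 that equals the Euclidean distance between w*x and w*y, so the
# weighting can be baked into the data once rather than into every distance call.
rng = np.random.default_rng(0)
X_train = rng.random((117_600, 10))     # placeholder data as above
y_train = rng.random(117_600)
X_test = rng.random((50_400, 10))
w = rng.random(10)                      # per-feature weights (placeholders)

knn = KNeighborsRegressor(n_neighbors=5, metric="euclidean")
knn.fit(X_train * w, y_train)           # w broadcasts across the feature columns
predictions = knn.predict(X_test * w)
```

Would this give the same predictions as metric="wminkowski" with p=2, while avoiding the slow path?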