Getting distance between two points based on latitude/longitude
I tried implementing the haversine formula from http://andrew.hedges.name/experiments/haversine/ The applet there gives the correct result for the two points I am testing, yet my code does not:
from math import sin, cos, sqrt, atan2

R = 6373.0

lat1 = 52.2296756
lon1 = 21.0122287
lat2 = 52.406374
lon2 = 16.9251681

dlon = lon2 - lon1
dlat = lat2 - lat1

a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
c = 2 * atan2(sqrt(a), sqrt(1 - a))

distance = R * c

print("Result:", distance)
print("Should be:", 278.546)
The distance it returns is 5447.05546147. Why?
It's because in Python, the math module's trigonometric functions expect radians, not degrees. You can either convert the numbers to radians manually, or use the radians function from the math module:
from math import sin, cos, sqrt, atan2, radians

# Approximate radius of earth in km
R = 6373.0

lat1 = radians(52.2296756)
lon1 = radians(21.0122287)
lat2 = radians(52.406374)
lon2 = radians(16.9251681)

dlon = lon2 - lon1
dlat = lat2 - lat1

a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
c = 2 * atan2(sqrt(a), sqrt(1 - a))

distance = R * c

print("Result:", distance)
print("Should be:", 278.546, "km")
The distance now comes out as the correct value of approximately 278.546 km, matching the expected result.
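If you need this in more than one place, it helps to wrap the conversion and the formula in a small function, so the degrees-to-radians step can't be forgotten. A minimal sketch (the name haversine_km and its signature are mine, not part of the original answer):

from math import sin, cos, sqrt, atan2, radians

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6373.0):
    # Convert every input from degrees to radians before any trig.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
    c = 2 * atan2(sqrt(a), sqrt(1 - a))
    return radius_km * c

print(haversine_km(52.2296756, 21.0122287, 52.406374, 16.9251681))  # ~278.546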
Edit: Just as a note, if you've stumbled across this post because you need a quick and easy way of finding the distance between two points, I recommend using the approach in Kurt's answer below instead -- see his post for the rationale.
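For reference, that quicker route typically means a dedicated geodesy library rather than hand-rolled trigonometry. A minimal sketch, assuming the third-party geopy package (whether that's exactly what Kurt's answer uses is an assumption here):

# Assumes the third-party geopy package is installed: pip install geopy
from geopy.distance import geodesic

point_1 = (52.2296756, 21.0122287)  # (lat, lon) in degrees
point_2 = (52.406374, 16.9251681)

# geodesic() uses an ellipsoidal earth model (WGS-84 by default), so its
# result differs slightly from the spherical haversine value above.
print(geodesic(point_1, point_2).km)  # roughly 279.35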