I'm setting up a small program that takes two geographical coordinates from a user and calculates the distance between them, taking into account the curvature of the Earth. I looked up the formula on Wikipedia and set up my Python function based on it; this is what I came up with:
    import math

    def geocalc(start_lat, start_long, end_lat, end_long):
        start_lat = math.radians(start_lat)
        start_long = math.radians(start_long)
        end_lat = math.radians(end_long)
        end_long = math.radians(end_long)
        d_lat = start_lat - end_lat
        d_long = start_long - end_long
        EARTH_R = 6372.8
        c = math.atan((math.sqrt( (math.cos(end_lat)*d_long)**2 +( (math.cos(start_lat)*math.sin(end_lat)) - (math.sin(start_lat)*math.cos(end_lat)*math.cos(d_long)))**2)) / ((math.sin(start_lat)*math.sin(end_lat)) + (math.cos(start_lat)*math.cos(end_lat)*math.cos(d_long))) )
        return EARTH_R*c
The problem is that the results come out really inaccurate. I'm new to python so some help or advice would be greatly appreciated!
You've got 4 or 5 or 6 problems:
(1)

    end_lat = math.radians(end_long)

should be

    end_lat = math.radians(end_lat)
(2) you are missing a piece of the formula (the sin() around d_long in the first term under the square root), as somebody already mentioned, most probably because

(3) your code is illegible (line far too long, redundant parentheses, 17 pointless instances of "math.")
(4) you didn't notice the remark in the Wikipedia article about using atan2()
(5) you may have been swapping lat and lon when entering your coordinates
(6) d_lat (the delta of the latitudes) is computed unnecessarily; it doesn't appear in the formula.

Putting it all together: