Subject: Python using compass sensor input: Stuck on calculation
I’m taking a compass reading in degrees named “start”, turning roughly 360 degrees, and taking a second reading, “end”. I want the delta between start and end: not how many degrees I turned, but how far end is from start.
degrees_start = 0
degrees_end = 359
#degrees_diff = degrees_end - degrees_start
degrees_diff = (degrees_start-degrees_end) % 360
print(degrees_diff)
'''
test set
degrees_start, degrees_end, deg_diff (expected), deg_diff (observed)
 10,  20, +10,  +10
 20,  10, -10,  -10
350,  10, +20, -340
 10, 350, -20, +340
  0, 359,  -1, +359
359,   0,  +1, -359
'''
The algorithm seems easy: delta = end - start.
But I’m stuck at the wrap-around boundary between 359 and 0. For example, start 10, end 350, or start 350, end 10. I’ve tried many arithmetic combinations but haven’t come up with a formula that is always correct.
Any suggestions? Thanks.
Tests of some answers below:
# test 10,350 -> correct answer -> -20 i.e. 20 deg short of full circle
#degrees_diff = degrees_end - degrees_start # test 10,350 -> 340
#degrees_diff = (degrees_start-degrees_end) % 360 # test 10,350 -> 20
#degrees_diff = (degrees_end - degrees_start) % 360 # test 10,350 -> 340
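For reference, one closed-form expression that reproduces the expected column for every row of the test set is the usual wrap-around normalization: shift by 180, reduce modulo 360, shift back. A sketch (the name `signed_diff` is just illustrative):

```python
def signed_diff(degrees_start, degrees_end):
    """Signed shortest rotation from start to end, in [-180, 180).

    Relies on Python's % always returning a non-negative result
    for a positive modulus. An exact half-turn comes out as -180.
    """
    return ((degrees_end - degrees_start + 180) % 360) - 180
```

For example, `signed_diff(10, 350)` gives -20 and `signed_diff(350, 10)` gives +20.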
The answer depends on the direction of rotation. If you always want the shortest angle between end and start, then you'll have to calculate both directions and compare.
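The "calculate both directions and compare" idea can be sketched like this (the helper name `shortest_delta` is assumed; positive means clockwise in the compass convention):

```python
def shortest_delta(start, end):
    """Return the shorter of the two rotations from start to end."""
    cw = (end - start) % 360   # clockwise travel, in 0..359
    ccw = cw - 360             # same rotation the other way, in -360..-1
    # Pick whichever direction covers fewer degrees; ties go clockwise.
    return cw if cw <= abs(ccw) else ccw
```

For start 10, end 350, the clockwise travel is 340 and the counterclockwise travel is -20, so the function returns -20.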
Also, there are two conventions for direction: (1) the direction of the second hand on a clock, or compass angles, where zero falls on the +y axis (north), 90 degrees falls on the +x axis (east), and increasing angles go clockwise; (2) the traditional mathematical convention, where zero falls on the +x axis, +90 degrees falls on the +y axis, and increasing angles go counterclockwise.
I'll use the first definition/convention.
which gives output: