I am trying to find a way to measure the speed at which the user is moving. I understand that you can use the Geolocation API to watch the position of the user's device at regular intervals:
t=0: (x0,y0)
t=1: (x1,y1)
where x, y are the GPS coordinates, and then simply apply the Pythagorean theorem to calculate the distance between the two fixes and divide by the elapsed time to get the speed.
However, the GPS measurement might not be accurate enough when the user is moving at slow speeds (for example, running), because the sensor's positional error becomes large relative to the distance actually covered in each interval.
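To make the problem concrete: if the device reports a position accuracy of ±10 m and a runner covers only ~3 m per one-second interval, the noise dwarfs the signal. One simple sanity check is to compare the distance covered against the reported uncertainty of the two fixes (the Geolocation API does expose an `accuracy` field on `coords`; the filter below and its names are just an illustrative sketch, not a standard technique):

```javascript
// Reject a speed estimate when the combined position uncertainty of the
// two fixes exceeds the distance actually covered between them.
// accuracy0/accuracy1 are the reported accuracies of each fix, in meters.
function isReliable(distanceMeters, accuracy0, accuracy1) {
  const worstCaseError = accuracy0 + accuracy1;
  return distanceMeters > worstCaseError;
}
```

With ±10 m accuracy on both fixes, a 3 m step is rejected while a 50 m step passes, which matches the intuition that short intervals at low speed are the hard case.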
How, then, can we get a more accurate measure of the user's movement at slow speeds?