I'm running into something I don't understand here.
I have a UDP-based application that implements an Automatic Repeat reQuest (ARQ) mechanism. The idea is to transmit real-time data with retransmission and reordering of received packets (a temporary buffer manages received and recovered packets), roughly as in the sketch below.
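For context, this is approximately how the receive-side reordering buffer works. It's a minimal sketch, not my exact code; the class and field names are just illustrative, and it assumes each packet carries a sequence number.

```python
class ReorderBuffer:
    """Temporary buffer that releases packets to the application in sequence order."""

    def __init__(self):
        self.next_seq = 0   # next sequence number the application expects
        self.pending = {}   # out-of-order (or retransmitted) packets, keyed by seq

    def push(self, seq, payload):
        """Store a received or recovered packet and return any in-order run."""
        if seq < self.next_seq:
            return []       # duplicate of an already-delivered packet, drop it
        self.pending[seq] = payload
        ready = []
        while self.next_seq in self.pending:
            ready.append(self.pending.pop(self.next_seq))
            self.next_seq += 1
        return ready
```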
To evaluate this application I implemented a function that calculates the one-way packet delay, in order to investigate the influence of the retransmission mechanism on per-packet delay and average delay.
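The measurement works roughly like this: the sender stamps each packet with its send time and the receiver subtracts that timestamp from its own receive time. Again a simplified sketch, not my exact code; the header layout and function names are illustrative, and it assumes the two clocks are synchronized (e.g., via NTP), otherwise the one-way figures are not meaningful.

```python
import socket
import struct
import time

# Hypothetical 12-byte header: 32-bit sequence number + 64-bit send timestamp (seconds)
HEADER = struct.Struct("!Id")

def send_probe(sock, addr, seq, payload=b""):
    """Sender side: stamp each packet with the wall-clock send time."""
    sock.sendto(HEADER.pack(seq, time.time()) + payload, addr)

def receive_and_measure(sock):
    """Receiver side: one-way delay = receive time - embedded send time."""
    data, _ = sock.recvfrom(2048)
    recv_time = time.time()
    seq, send_time = HEADER.unpack_from(data)
    delay_ms = (recv_time - send_time) * 1000.0
    return seq, delay_ms
```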
Now the problem: during experiments in an ad-hoc network with only two nodes (transmitter and receiver), I noticed that the average delay also increases when using plain UDP, with no retransmission at all, once the distance between transmitter and receiver reaches 60 m over 802.11g.
I am wondering: should the delay increase with distance even when using plain UDP? If so, what is the cause?
Put differently: if I send 100 packets over UDP and measure, say, an average delay of 5 ms at 10 m, is it plausible to measure 17 ms at 60 m? Does that make sense?