I am using a custom Python script to create and send RTP packets to ffmpeg, which then converts them into s16le audio frames that are read from stdout. For this I use
```
ffmpeg -i rtp://0.0.0.0:9000 -f s16le -ar 8000 -acodec pcm_s16le -ac 1 -loglevel debug -
```
on the receiving side.
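For context, this is roughly how the decoded output can be read and counted from Python; the subprocess wrapper, chunk size, and accounting below are an illustrative sketch, not my exact script:

```python
# Minimal sketch: launch the ffmpeg command above and count the raw
# s16le samples it writes to stdout. Chunk size is an arbitrary choice.
import subprocess

FFMPEG_CMD = [
    "ffmpeg",
    "-i", "rtp://0.0.0.0:9000",
    "-f", "s16le",
    "-ar", "8000",
    "-acodec", "pcm_s16le",
    "-ac", "1",
    "-loglevel", "debug",
    "-",
]

proc = subprocess.Popen(FFMPEG_CMD, stdout=subprocess.PIPE)

total_bytes = 0
while True:
    chunk = proc.stdout.read(4096)
    if not chunk:  # ffmpeg closed stdout, stream is done
        break
    total_bytes += len(chunk)

# s16le mono at 8000 Hz: 2 bytes per sample, 8000 samples per second
print(f"received {total_bytes // 2} samples "
      f"(~{total_bytes / 2 / 8000:.2f} s of audio)")
```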
The received audio sounds fine, but the number of packets ffmpeg reports as received varies from run to run.
I understand that UDP can lose packets (though loss in every 2nd to 5th run of a 50-second file streamed on the same machine seems high), but:
- there are no warning or error messages
- when I intentionally skip (drop) a packet in the python script, I get a message
- when I had not yet worked out the packet timing, I got messages
- when I swap the sequence numbers of two packets (so that `seq=1001` is sent before `seq=1000`), I get a message (see the RTP header sketch after this list)
- tcpdump shows all packets, in the correct order
- the machine has 48 cores and was relatively idle
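For reference, this is roughly how an RTP header with a sequence number and timestamp can be built in Python; the SSRC, payload type, and silent payload here are illustrative assumptions, not my actual script:

```python
# Minimal sketch of the sending side: build RTP v2 packets by hand and
# send them over UDP to the port ffmpeg is listening on.
import socket
import struct

def build_rtp_packet(seq, timestamp, payload, ssrc=0x12345678, payload_type=0):
    # RTP v2 header: V=2, P=0, X=0, CC=0 -> first byte 0x80;
    # second byte is M=0 plus the 7-bit payload type.
    header = struct.pack(
        "!BBHII",
        0x80,
        payload_type & 0x7F,
        seq & 0xFFFF,
        timestamp & 0xFFFFFFFF,
        ssrc,
    )
    return header + payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

seq, ts = 1000, 0
samples_per_packet = 160              # 20 ms at 8000 Hz
for _ in range(5):
    payload = bytes(samples_per_packet)   # silence as a stand-in payload
    sock.sendto(build_rtp_packet(seq, ts, payload), ("127.0.0.1", 9000))
    seq += 1                          # skipping or swapping these values is
    ts += samples_per_packet          # what normally triggers ffmpeg's messages
```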
My question is: why do I get packet loss when there are no errors, warnings, or other messages, and no issues are visible in the network dump?