I have a TCP client-server application implemented in C. The client and the server are two separate processes, and all communication goes through sockets bound to the loopback interface (local address 127.0.0.1).
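For context, the setup follows the standard pattern sketched below; this is illustrative, not my actual code (the port number in particular is made up):

    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(void)
    {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        if (srv < 0) { perror("socket"); return 1; }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof addr);
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5000);                   /* port is illustrative */
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK); /* 127.0.0.1 only */

        if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind"); return 1;
        }
        if (listen(srv, 1) < 0) { perror("listen"); return 1; }

        int conn = accept(srv, NULL, NULL); /* blocking accept of one client */
        if (conn < 0) { perror("accept"); return 1; }

        /* ... recv()/send() loop ... */
        close(conn);
        close(srv);
        return 0;
    }

The client side is the mirror image: it calls connect() to 127.0.0.1 on the same port.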
When the communication lasts a long time (for example, when either the client or the server sends many MB of data), I've noticed that one of the two processes ends up waiting for a packet that never arrives, and the timeout is reached. I tried increasing the timeout to 10000 ms (10 s), but it didn't help.
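By "timeout" I mean a receive timeout on the socket. A sketch of one standard way to express it (my question applies equally if the timeout is implemented with select() or poll() instead):

    #include <sys/socket.h>
    #include <sys/time.h>

    /* Set a 10 s receive timeout on a connected socket fd.
       The 10000 in my description is milliseconds, i.e. tv_sec = 10 here. */
    static int set_recv_timeout(int fd)
    {
        struct timeval tv = { .tv_sec = 10, .tv_usec = 0 };
        return setsockopt(fd, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof tv);
    }

    /* With this set, a blocked recv() returns -1 with errno equal to
       EAGAIN/EWOULDBLOCK once 10 s pass with no data arriving. */

So after 10 s with no incoming data, the blocked side gives up, which is what I'm observing.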
I'm wondering why, even for two processes on the same computer, a packet sent through the socket by either the client or the server can be lost, given that the traffic never leaves the loopback interface.