I am currently studying exam questions but am stuck on this one; I hope someone can help me understand it.
Question: Assume that we have paged virtual memory with a page size of 4 KiB. Assume that each process has four segments (for example: code, data, stack, extra) and that these can be of arbitrary but given size. How much will the operating system lose to internal fragmentation?
The answer is: Each segment will on average give rise to 2 KiB of fragmentation. On average this means 8 KiB per process. If, for example, we have 100 processes, this is a total loss of 800 KiB.
My questions:
- How does the answer arrive at 2 KiB of fragmentation for each segment? How can we calculate that size when the segment sizes are arbitrary? Am I missing something here?
- If we lose 8 KiB per process, that would not even fit in a single 4 KiB page; isn't that actually external fragmentation?
This is academic BS designed to make things confusing.
They are saying that, probabilistically, the last page of each segment will on average be only half full. You can't compute that waste exactly for a given executable; they are just taking an expected value, assuming a segment's size modulo the page size is uniformly distributed. That also presumes the linker/loader places each segment on its own page boundary, so the unused tail of one segment's last page can't be used by the next segment.
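If it helps, here is a minimal Python sketch (my own illustration, not part of the exam answer) that checks the expected-value argument: draw many segment sizes, compute the unused tail of each segment's last 4 KiB page, and average it. The uniform size distribution and the 10-page upper bound are assumptions made purely for the demonstration.

```python
import random

PAGE_SIZE = 4 * 1024  # 4 KiB pages, as in the question

def internal_fragmentation(segment_size, page_size=PAGE_SIZE):
    """Unused bytes in the last page allocated to a segment."""
    remainder = segment_size % page_size
    return 0 if remainder == 0 else page_size - remainder

# Draw many "arbitrary but given" segment sizes (assumed uniform here)
# and average the waste in the final page of each segment.
random.seed(0)
samples = [random.randint(1, 10 * PAGE_SIZE) for _ in range(100_000)]
avg_waste = sum(internal_fragmentation(s) for s in samples) / len(samples)

print(f"average waste per segment: {avg_waste / 1024:.2f} KiB")   # ~2 KiB
print(f"per process (4 segments):  {4 * avg_waste / 1024:.2f} KiB")  # ~8 KiB
print(f"for 100 processes:         {400 * avg_waste / 1024:.2f} KiB")  # ~800 KiB
```

Run it and you should see roughly 2 KiB per segment, 8 KiB per process, and 800 KiB for 100 processes, matching the given answer. Note that this waste sits inside pages that are already allocated to the process (the half-empty last page of each segment), which is exactly why it counts as internal rather than external fragmentation; the 8 KiB is a sum over four segments, not a single block that has to fit in one page.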