I have written a Java program that first allocates about two thirds of the heap (60% of maxMemory) to a byte array and then allocates the remaining third (30%) to another byte array. I have two versions, one of which throws an OutOfMemoryError while the other does not. Below are my programs.

Version 1 - throws an OutOfMemoryError

public class MemoryTest
{
    public static void main(String[] args)
    {
        System.out.println( Runtime.getRuntime().totalMemory() );
        {
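            // b1 goes out of scope at the end of this block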
            byte[] b1 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.6)];
        }


        byte[] b2 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.3)];

        System.out.println( Runtime.getRuntime().totalMemory() );
    }
}

Version 2 - works fine

public class MemoryTest
{
    public static void main(String[] args)
    {
        System.out.println( Runtime.getRuntime().totalMemory() );
        {
            byte[] b1 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.6)];
        }

        int i = 0; // declaring and initializing this int resolves the OutOfMemoryError; I have no idea why it makes a difference

        byte[] b2 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.3)];

        System.out.println( Runtime.getRuntime().totalMemory() );
    }
}

I have monitored the GC: in Version 2 it clears b1's memory while b2 is being allocated, but the same does not happen in Version 1.
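
For anyone who wants to reproduce this, the collections can be watched with the standard GC logging flag, for example:

    java -verbose:gc MemoryTest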

I cannot work out how declaring and initializing an int makes a difference. Can someone explain why Version 1 throws an OutOfMemoryError and Version 2 does not?
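
In case it narrows things down: if the cause is that b1 is somehow still reachable in Version 1, I would expect the following variant, which drops the reference explicitly instead of declaring an int, to behave like Version 2. That is only my guess; I have not verified it:

    public class MemoryTest
    {
        public static void main(String[] args)
        {
            System.out.println( Runtime.getRuntime().totalMemory() );

            byte[] b1 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.6)];
            b1 = null; // explicitly drop the only reference to the first array

            byte[] b2 = new byte[(int) (Runtime.getRuntime().maxMemory() * 0.3)];

            System.out.println( Runtime.getRuntime().totalMemory() );
        }
    }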

OS: Windows 7

Java Version: 1.8
