Loading a large hprof into jhat


I have a 6.5GB hprof file that was dumped by a 64-bit JVM using the -XX:+HeapDumpOnOutOfMemoryError option. I have it sitting on a 16GB 64-bit machine and am trying to load it into jhat, but jhat keeps running out of memory. I have tried passing JVM arguments for the minimum and maximum heap sizes, but jhat rejects any minimum setting, and it seems to run out of memory before reaching the maximum.
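For reference, the kind of invocation I have been trying looks like this (the dump name and heap sizes are illustrative, not my exact values):

```
jhat -J-d64 -J-Xms2g -J-Xmx15g java_pid1234.hprof
```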

It seems kind of silly that a JVM that runs out of memory dumps a heap so large that it can't be loaded on a box with twice as much RAM. Is there any way to get this working, or to amortize the analysis somehow?


There are 4 answers

broschb (best answer)

I would take a look at the Eclipse Memory Analyzer (MAT). This tool is great, and I have looked at several multi-gigabyte heaps with it. The nice thing about the tool is that it builds indexes on the dump, so the whole heap does not have to be in memory at once.
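If the GUI runs out of memory on a dump this size, you can also raise MAT's own heap and build the indexes headlessly. A sketch, assuming a recent standalone MAT install on Linux; the dump path is illustrative:

```
# In MemoryAnalyzer.ini, raise MAT's heap, e.g.:
#   -vmargs
#   -Xmx12g
# Then index the dump and produce the leak-suspects report without the GUI:
./ParseHeapDump.sh /path/to/myheap.hprof org.eclipse.mat.api:suspects
```

Once the index files exist next to the .hprof, opening the dump in the GUI afterwards is comparatively fast and cheap.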

Kevin

What flags are you passing to jhat? Make sure that you're in 64-bit mode and you're setting the heap size large enough.

Joel Hoff

Use the equivalent of jhat -J-d64 -J-mx16g myheap.hprof to launch jhat; this will start jhat in 64-bit mode with a maximum heap size of 16 gigabytes. (-J passes the option through to the JVM that runs jhat, and -mx is the old spelling of -Xmx.)

If the JVM on your platform defaults to 64-bit-mode operation, then the -J-d64 option should be unnecessary.
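Put together, with myheap.hprof standing in for your file; once jhat finishes reading the dump, it serves its HTML interface on port 7000 by default:

```
jhat -J-d64 -J-mx16g myheap.hprof
# then browse to http://localhost:7000/
```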

vimil

I had to load an 11 GB hprof file and couldn't do it with Eclipse Memory Analyzer. What I ended up doing was writing a program that reduced the size of the hprof file by randomly removing instance information. Once I got the hprof file down to 1GB, I could open it with Eclipse Memory Analyzer and get a clue about what was causing the memory leak.
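vimil's program isn't shown, but the top level of the hprof binary format is simple: a null-terminated version string, a u4 identifier size, a u8 timestamp, and then a stream of records, each framed as u1 tag / u4 timestamp / u4 body length. Below is a minimal Java sketch of that framing (the class name is mine); it only scans a dump and reports where the bytes are. The instance records you would actually sample live inside the heap-dump records (tags 0x0c and 0x1c), whose sub-records a real filter would then have to parse:

```java
import java.io.*;

// Scans the top-level record stream of an .hprof file and reports bytes per
// record tag -- a starting point for a size-reducing filter like the one
// described above. Not vimil's actual program.
public class HprofScan {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(new FileInputStream(args[0])))) {
            // Header: null-terminated version string, u4 id size, u8 timestamp.
            StringBuilder version = new StringBuilder();
            for (int b = in.read(); b > 0; b = in.read()) version.append((char) b);
            int idSize = in.readInt();
            in.readLong(); // dump timestamp, unused here
            System.out.println(version + ", id size " + idSize);

            long[] bytesPerTag = new long[256];
            int tag;
            while ((tag = in.read()) != -1) {          // u1 record tag, -1 at EOF
                in.readInt();                          // u4 microseconds since header
                long len = in.readInt() & 0xffffffffL; // u4 body length, unsigned
                bytesPerTag[tag] += len;
                skipFully(in, len);                    // skip the record body
            }
            for (int t = 0; t < 256; t++)
                if (bytesPerTag[t] > 0)
                    System.out.printf("tag 0x%02x: %,d bytes%n", t, bytesPerTag[t]);
        }
    }

    // InputStream.skip may skip fewer bytes than requested; loop until done.
    private static void skipFully(InputStream in, long n) throws IOException {
        while (n > 0) {
            long skipped = in.skip(n);
            if (skipped <= 0) {
                if (in.read() == -1) throw new EOFException();
                skipped = 1;
            }
            n -= skipped;
        }
    }
}
```

On a typical dump, nearly all of the bytes land under tags 0x0c/0x1c, which is why sampling instance sub-records, as described above, is where the savings come from.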