How to avoid OutOfMemoryErrors when processing big analysis reports?


Running SonarQube 5.6.6 from Jenkins on CentOS 7.3, I got the following error:

2017.09.01 19:05:16 ERROR [o.s.s.c.t.CeWorkerCallableImpl] Failed to execute task AV485bp0qXlQ-QPWWE9A
java.lang.OutOfMemoryError: Java heap space
2017.09.01 19:05:17 ERROR [o.s.s.c.t.CeWorkerCallableImpl] Executed task | project=PP::Symphony3M | type=REPORT | id=AV485bp0qXlQ-QPWWE9A | time=74089ms

sonar.ce.javaOpts is set like below:

sonar.ce.javaOpts=-Xmx60g -Xms1g -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true

How much heap space should I give SonarQube when analyzing a one-million-LOC project? Or is there another way to avoid Java heap space issues?


1 Answer

Answered by Jeel

The maximum heap you can allocate depends on the free RAM on your server; the free command can help you check the stats. Based on the free RAM, you can set your Xmx value.
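
For example, on Linux you could check available memory and then adjust the Compute Engine heap in sonar.properties (the 4g below is only an illustration, not a recommended value; size it to the RAM you actually have free):

free -h
# then, in conf/sonar.properties:
sonar.ce.javaOpts=-Xmx4g -Xms1g -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true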

BTW, make sure the code compiles on the server. Increasing the heap will only help if the code compiles but the scan still fails.
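
For instance, if the project builds with Maven (an assumption; adapt the commands to your build tool), verify the compilation step before triggering the analysis:

mvn clean verify     # confirm the project compiles on this server
mvn sonar:sonar      # run the SonarQube analysis only after a successful build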