We are troubleshooting the performance of a Java application. Recently we have been using an Application Performance Diagnostics tool for it, which also measures CPU and memory (we don't know the exact method it uses; we would have to ask the supplier). Besides that, we have been watching CPU usage in the Windows process/resource monitor. The two readings are quite different and we have no clue why.
The Java application (OpenJDK 8) runs on a Windows Server 2012 machine. When we measure CPU from the Java app we constantly see peaks from 100% up to 500%, while the virtual machine has 12 CPUs and Windows Resource Monitor shows CPU usage below 10-15% the whole time. I have a few questions about this discrepancy:
1 - Why is there a difference between the CPU measurement of the application and of Windows Resource Monitor?
2 - How is it possible that it can burst to 500%?
3 - Where in Windows can I adjust the CPU allocated to the application, so that those 500% are directly available to it as a CPU resource?
One difference between the two tools seems to be what counts as 100%:
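Most JVM-side tools (and many APM agents, like Linux `top`) report process CPU relative to a single core, so on a 12-core machine the theoretical maximum is 1200% and a reading of 500% simply means roughly five cores' worth of work. Windows Resource Monitor reports relative to the whole machine, so that same sustained load would show as about 500 / 12 ≈ 42% there. As a minimal sketch of the two views (assuming OpenJDK 8 / HotSpot, which exposes `com.sun.management.OperatingSystemMXBean`; the class name is just illustrative):

```java
import com.sun.management.OperatingSystemMXBean;
import java.lang.management.ManagementFactory;

public class CpuScaleDemo {
    public static void main(String[] args) throws InterruptedException {
        // HotSpot's extended MXBean; the cast works on OpenJDK/Oracle JDK.
        OperatingSystemMXBean os = (OperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean();
        int cores = Runtime.getRuntime().availableProcessors();

        for (int i = 0; i < 10; i++) {
            Thread.sleep(1000);
            // Fraction of the WHOLE machine (0.0 - 1.0) used by this JVM process.
            // May be negative on the very first sample if no data is available yet.
            double machineFraction = os.getProcessCpuLoad();
            if (machineFraction < 0) continue;

            double machinePercent = machineFraction * 100.0;          // Resource Monitor style
            double perCorePercent = machineFraction * cores * 100.0;  // profiler/top style, up to 1200% here
            System.out.printf("process CPU: %.1f%% of machine = %.1f%% per-core style%n",
                    machinePercent, perCorePercent);
        }
    }
}
```

So a "500%" peak in the APM tool and a "~42%" (or lower) reading in Resource Monitor can describe exactly the same moment.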
The other difference might be the sampling / averaging period:
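Resource Monitor averages over its refresh interval (and its graph over an even longer history), so a burst that keeps five cores busy for a fraction of a second can disappear into a 10-15% average, while a tool that samples every second or more often still catches the 500% spike. A rough sketch of how the window length changes the number, again assuming the HotSpot `com.sun.management` bean; the busy-loop thread is only a stand-in workload:

```java
import com.sun.management.OperatingSystemMXBean;
import java.lang.management.ManagementFactory;

public class CpuAveragingDemo {
    // Process CPU usage in per-core style percent, averaged over the given window.
    static double cpuPercentOver(long windowMillis) throws InterruptedException {
        OperatingSystemMXBean os = (OperatingSystemMXBean)
                ManagementFactory.getOperatingSystemMXBean();
        long cpuBefore = os.getProcessCpuTime();   // CPU nanoseconds used by the JVM so far
        long wallBefore = System.nanoTime();
        Thread.sleep(windowMillis);
        long cpuAfter = os.getProcessCpuTime();
        long wallAfter = System.nanoTime();
        // CPU time / wall time: 100% == one core fully busy, 1200% == all 12 cores busy.
        return 100.0 * (cpuAfter - cpuBefore) / (wallAfter - wallBefore);
    }

    public static void main(String[] args) throws Exception {
        // Illustrative background load so there is something to measure.
        Thread burner = new Thread(() -> { while (true) { Math.sqrt(Math.random()); } });
        burner.setDaemon(true);
        burner.start();

        // A short window catches spikes; a long window averages them away.
        System.out.printf("1 s window : %.0f%%%n", cpuPercentOver(1_000));
        System.out.printf("30 s window: %.0f%%%n", cpuPercentOver(30_000));
    }
}
```

With a real application the effect is the same: short peaks dominate the 1-second reading, while the 30-second (or longer) average that Resource Monitor shows stays low.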