The Java process is invoked as follows:
/usr/local/openjdk-8/bin/java -XX:+UseG1GC -Dlog4j.debug
-Dlog4j.configuration=log4j.properties
-Djava.util.logging.config.file=/etc/metrics/conf/logging.properties
-javaagent:/prometheus/jmx_prometheus_javaagent.jar=8090:/etc/metrics/conf/prometheus.yaml
-Dspark.driver.port=7078 -Dspark.driver.blockManager.port=7079 -Xms1g -Xmx1g
-cp /etc/hadoop/conf::/opt/spark/jars/*:/etc/hadoop/conf:/opt/hadoop/share/hadoop/common/lib/*:/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/*:/opt/hadoop/share/hadoop/hdfs/*:/opt/hadoop/share/hadoop/mapreduce/lib/*:/opt/hadoop/share/hadoop/mapreduce/*:/opt/hadoop/share/hadoop/yarn:/opt/hadoop/share/hadoop/yarn/lib/*:/opt/hadoop/share/hadoop/yarn/*:/etc/hadoop/conf:/opt/hadoop/share/hadoop/common/lib/*:/opt/hadoop/share/hadoop/common/*:/opt/hadoop/share/hadoop/hdfs:/opt/hadoop/share/hadoop/hdfs/lib/*:/opt/hadoop/share/hadoop/hdfs/*:/opt/hadoop/share/hadoop/mapreduce/lib/*:/opt/hadoop/share/hadoop/mapreduce/*:/opt/hadoop/share/hadoop/yarn:/opt/hadoop/share/hadoop/yarn/lib/*:/opt/hadoop/share/hadoop/yarn/* org.apache.spark.executor.CoarseGrainedExecutorBackend
--driver-url spark://CoarseGrainedScheduler@realtimesummariesjob-f1c1217b63219e7f-driver-svc.spark-apps.svc:7078
--executor-id 1 --cores 1 --app-id spark-fcab57dc38254481a12bdd9dc79b7a98 --hostname 10.244.1.23
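In this command the -javaagent flag attaches jmx_prometheus_javaagent.jar using the exporter's documented <port>:<config.yaml> argument format, i.e. it listens on port 8090 and reads /etc/metrics/conf/prometheus.yaml. That config file is not shown here; a minimal config of this kind (an assumption for illustration, not the actual file) could look like:

# hypothetical minimal jmx_exporter config: expose every MBean with default naming
rules:
- pattern: ".*"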
So the Prometheus JMX exporter is set up as a javaagent, and its logging is configured according to https://github.com/prometheus/jmx_exporter#debugging.
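For reference, the debugging section of that README suggests a java.util.logging configuration roughly like the sketch below; the actual /etc/metrics/conf/logging.properties used here may differ, and the shaded logger name is an assumption based on how the agent jar repackages its classes:

# send all log records to the console at the finest level
handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=ALL
# enable the exporter's loggers (both plain and shaded package names)
io.prometheus.jmx.level=ALL
io.prometheus.jmx.shaded.io.prometheus.jmx.level=ALL

This is the file referenced by the -Djava.util.logging.config.file flag in the command above.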
However, no log output from the Prometheus JMX exporter is observed. The exporter itself works correctly and serves correctly formatted metrics on the exposed port.
How can logging be enabled for code that runs as a javaagent?