ELK + log4j configuration


I installed ELK on an Ubuntu 14.04 server, and now I want to send all my JBoss server logs to it (using log4j).

Logstash configuration, input conf file:

input {
    log4j {
        type => "log4j"
        port => 5000
    }
}

The filter conf file:

filter {
    if [type] == "log4j" {
        grok {
            match => {"message" => MY_GROK_PARSE}
        }
    }
}
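MY_GROK_PARSE is a placeholder for my actual pattern; purely as an illustration, a match for the %d %-5p [%c{1}] %m%n layout shown further down could look roughly like this:

filter {
    if [type] == "log4j" {
        grok {
            # hypothetical pattern for a "%d %-5p [%c{1}] %m%n" layout
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} +\[%{JAVACLASS:category}\] %{GREEDYDATA:msg}" }
        }
    }
}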

And the output file:

output {
    elasticsearch {
        embedded => true
    }
}
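embedded => true uses the embedded Elasticsearch that ships with older Logstash 1.x releases; if you run a standalone Elasticsearch instead, the output would point at it directly, roughly like this (host and port are placeholders):

output {
    elasticsearch {
        hosts => ["localhost:9200"]
    }
}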

And finally, the log4j appender:

<appender name="LOGSTASH" class="org.apache.log4j.net.SocketAppender"> 
    <param name="Port" value="5000"/>
    <param name="RemoteHost" value="XXX.XXX.XXX.XXX"/> <!-- There is a real adress here ;-) -->
    <param name="ReconnectionDelay" value="50000"/> 
    <param name="LocationInfo" value="true"/> 
    <layout class="org.apache.log4j.PatternLayout">
     <param name="ConversionPattern" value="%d %-5p [%c{1}] %m%n" />
    </layout>
</appender> 

But nothing happens with this configuration, so I don't know what I'm misunderstanding. My other appenders (console and local file) work fine, and the Elasticsearch log doesn't show any information or activity.

Edit: more about my jboss-log4j.xml:

<appender name="Async" class="org.apache.log4j.AsyncAppender">
    <appender-ref ref="FILE" />
    <appender-ref ref="CONSOLE" />
    <appender-ref ref="LOGSTASH" />
</appender>

<root>
    <priority value="INFO" />
    <appender-ref ref="Async" />
</root>

There are 2 answers

Answered by dimethyl:

I know it's an old post, but someone may find it useful: log4j's SocketAppender can't use a layout; see the docs for SocketAppender:

SocketAppenders do not use a layout. They ship a serialized LoggingEvent object to the server side.
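So the appender from the question would be trimmed down to something like this, with the layout removed and the port matched to the Logstash log4j input below:

<appender name="LOGSTASH" class="org.apache.log4j.net.SocketAppender">
    <param name="Port" value="4560"/>
    <param name="RemoteHost" value="XXX.XXX.XXX.XXX"/>
    <param name="ReconnectionDelay" value="50000"/>
    <param name="LocationInfo" value="true"/>
</appender>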

You also don't need an additional filter in the Logstash configuration; a minimal Logstash log4j plugin configuration is sufficient:

input {
   log4j {
      data_timeout => 5
      host => "0.0.0.0"
      mode => "server"
      port => 4560
      debug => true
      type => "log4j"
   }
   ... 
}
Answered by Marcelo Grossi:

You can send it directly to Elasticsearch in this case; there's no reason to go through Logstash first. You can easily use a filter to drop the messages you're not interested in.

I've written this appender, the Log4J2 Elastic REST Appender, if you want to use it. It can buffer log events based on time and/or the number of events before sending them to Elasticsearch (using the _bulk API, so that everything is sent in one go). It has been published to Maven Central, so it's pretty straightforward to set up.
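For context, the _bulk API takes newline-delimited JSON where each action line is followed by the document to index, so a whole buffer of log events goes out in a single HTTP request. A rough illustration (index name and fields are made up):

POST /_bulk
{ "index" : { "_index" : "logs", "_type" : "log" } }
{ "timestamp" : "2016-01-01T12:00:00Z", "level" : "INFO", "message" : "Application started" }
{ "index" : { "_index" : "logs", "_type" : "log" } }
{ "timestamp" : "2016-01-01T12:00:01Z", "level" : "WARN", "message" : "Disk space is low" }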