Logstash. Get fields by position number

Background

I have the following setup: logs from my app go through rsyslog to a central log server, and from there to Logstash and Elasticsearch. The logs from the app are pure JSON, but rsyslog prepends "timestamp", "app name" and "server name" fields, so each log line ends up looking like this:

timestamp app-name server-name [JSON]

Question

How can I remove the first three fields with Logstash filters? Can I refer to fields by position number (like in awk) and do something like:

filter {
  somefilter_name {
      remove_field => $1, $2, $3 
  }
}

Or is my approach completely wrong and I should be doing this another way?

Thank you!

1 Answer

Alain Collins (accepted answer)

Use grok{} to match those leading fields (they may be useful on their own!) and put the remainder of the event back into the [message] field.

Given input like:

2015-06-16 13:37:30 myApp myServer { "jsonField": "jsonValue" }

And this config:

filter {
  grok {
    # Parse the rsyslog prefix into fields and keep only the JSON payload in [message]
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:app} %{WORD:server} %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }

  json {
    # Parse the remaining JSON string and merge its keys into the event
    source => "message"
  }
}

Will produce this document:

{
   "message" => "{ \"jsonField\": \"jsonValue\" }",
   "@version" => "1",
   "@timestamp" => "2015-06-16T20:38:55.658Z",
   "host" => "0.0.0.0",
   "timestamp" => "2015-06-16 13:37:30",
   "app" => "myApp",
   "server" => "myServer",
   "jsonField" => "jsonValue"
}
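
If you also want to drop the prefix fields themselves (per the original "remove the first three fields" question), a mutate filter placed after the json filter can remove them. This is a minimal sketch, not part of the accepted answer; the field names come from the grok pattern above:

filter {
  # ... grok and json filters from the answer above ...

  # Optional: drop the fields parsed from the rsyslog prefix once they are
  # no longer needed. Skip this if you want to keep them in Elasticsearch.
  mutate {
    remove_field => [ "timestamp", "app", "server" ]
  }
}

Note that this only removes the "timestamp" string parsed from the prefix; the event's @timestamp field is separate and unaffected.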