Grok parse error when using custom pattern definitions


I'm trying to use a grok filter in logstash version 1.5.0 to parse several fields of data from a log file.

I'm able to parse a simple WORD field with no issues, but when I try to define a custom pattern and add that in as well, the grok parse fails.
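
For reference, the stripped-down version of the filter that does parse correctly looks something like this (a rough sketch of what I'm running, not copied verbatim):

    grok
    {
        match => {"message" => "%{WORD:LogLevel}"}
    }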

I've tried a couple of grok debuggers that have been recommended elsewhere to track down the issue:

http://grokconstructor.appspot.com/do/match

and

http://grokdebug.herokuapp.com/

Both say that my regex should be fine and return the fields I want, but when I add it to my logstash.conf, grok fails to parse the log line and simply passes the raw data through to Elasticsearch.

My sample line is as follows:

APPERR [2015/06/10 11:28:56.602] C1P1405 S39 (VPTestSlave002_001)| 8000B Connect to CGDialler DB (VPTest - START)| {39/A612-89A0-A598/60B9-1917-B094/9E98F46E} Failed to get DB connection: SQLConnect failed. 08001 (17) [Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or access denied.

My logstash.conf grok config looks like this:

    grok
    {
        patterns_dir => ["D:\rt\Logstash-1.5.0\bin\patterns"]
        match => {"message" => "%{WORD:LogLevel} \[%{KERNELTIMESTAMP:TimeStamp}\]"}
    }

and the contents of my custom pattern file are:

KERNELTIMESTAMP %{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?
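
As an aside, I believe the same timestamp could also be written inline as an Oniguruma-style named capture instead of going through patterns_dir; something like the sketch below (untested, and it assumes the stock YEAR/MONTHNUM/MONTHDAY/HOUR/MINUTE/SECOND/ISO8601_TIMEZONE patterns that ship with logstash). I mention it only as an equivalent way of writing the same regex.

    grok
    {
        match => {"message" => "%{WORD:LogLevel} \[(?<TimeStamp>%{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?)\]"}
    }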

I am expecting this to return the following set of data:

{
  "LogLevel": [
    [
      "APPERR"
    ]
  ],
  "TimeStamp": [
    [
      "2015/06/10 11:28:56.602"
    ]
  ],
  "YEAR": [
    [
      "2015"
    ]
  ],
  "MONTHNUM": [
    [
      "06"
    ]
  ],
  "MONTHDAY": [
    [
      "10"
    ]
  ],
  "HOUR": [
    [
      "11",
      null
    ]
  ],
  "MINUTE": [
    [
      "28",
      null
    ]
  ],
  "SECOND": [
    [
      "56.602"
    ]
  ],
  "ISO8601_TIMEZONE": [
    [
      null
    ]
  ]
}

Can anyone tell me where my issue is?
