I was assigned the task of creating a log analyzer with Elasticsearch. I'm using an ingest pipeline with a grok pattern, but some multiline logs break the parsing.
I'm using an ingest pipeline with the following grok pattern to parse the logs:

%{TIMESTAMP_ISO8601:msg_date} %{DATA:msg_hostname} %{DATA:msg_logger} %{LOGLEVEL:msg_level} %{DATA:msg_process}\[%{INT:msg_process_port}\] %{INT:msg_thread}: %{GREEDYDATA:msg_message}
And these are some logs from the given file:
2023-11-03T13:02:35.3999532Z WMD109618 Service INFO App.Service[6348] 1: App.Service started
2023-11-03T13:02:45.4906583Z WMD109618 Service INFO App.Service[6348] 3: User logged in
2023-11-03T13:02:50.8901594Z WMD109618 Scheduler INFO App.Service[6348] 16: Modify task 847d18d3-a719-4fbf-9we1-93cf363bdd7c, type=Update
2023-11-03T13:02:50.9799602Z WMD109618 Scheduler INFO App.Service[6348] 16:
FunkVersorgungOk
Cron expression :
03/11/2023 14:02:50 +01:00
03/11/2023 14:02:51 +01:00
03/11/2023 14:02:52 +01:00
03/11/2023 14:02:53 +01:00
03/11/2023 14:02:54 +01:00
03/11/2023 14:02:55 +01:00
03/11/2023 14:02:56 +01:00
03/11/2023 14:02:57 +01:00
03/11/2023 14:02:58 +01:00
03/11/2023 14:02:59 +01:00
03/11/2023 14:03:00 +01:00
03/11/2023 14:03:01 +01:00
03/11/2023 14:03:02 +01:00
2023-11-03T13:02:51.0250407Z WMD109618 Scheduler INFO App.Service[6348] 13: Send scheduled task FunkVersorgungOk
2023-11-03T13:02:51.0504035Z WMD109618 ChannelManager INFO App.Service[6348] 13: Type:Alpha|Address:0004504(a)A|PlainText:
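To double-check that the pattern itself is fine, I reproduced it as a plain Python regex (a rough equivalent only; TIMESTAMP_ISO8601 is simplified here). Single-line events match, but the continuation lines of the multiline event do not match at all:

```python
import re

# Rough Python-regex equivalent of the grok pattern above
# (TIMESTAMP_ISO8601 simplified to the format seen in these logs)
pattern = re.compile(
    r"(?P<msg_date>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?Z?) "
    r"(?P<msg_hostname>\S+) "
    r"(?P<msg_logger>\S+) "
    r"(?P<msg_level>[A-Z]+) "
    r"(?P<msg_process>[^\[]+)\[(?P<msg_process_port>\d+)\] "
    r"(?P<msg_thread>\d+): "
    r"(?P<msg_message>.*)"
)

# A normal single-line event parses cleanly:
line = "2023-11-03T13:02:35.3999532Z WMD109618 Service INFO App.Service[6348] 1: App.Service started"
m = pattern.match(line)
print(m.group("msg_level"))    # -> INFO
print(m.group("msg_message"))  # -> App.Service started

# A continuation line of the multiline event has no timestamp prefix,
# so the pattern fails entirely:
print(pattern.match("FunkVersorgungOk"))  # -> None
```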
So far every log has been parsed without problems, except the one followed by all the additional timestamps. I have tried multiple variations of my grok pattern and added other processors, but nothing handles this case. Is there a way to get all the additional timestamps into the same field using only Kibana, Elasticsearch, and Filebeat?
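For reference, this is the kind of Filebeat multiline setup I have been experimenting with, which joins any line that does not start with a timestamp onto the previous event before it reaches the ingest pipeline (a sketch; the log path is a placeholder for my actual path):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log   # placeholder path
    # Lines NOT starting with an ISO8601 timestamp are treated as
    # continuations and appended to the previous line's event.
    multiline.type: pattern
    multiline.pattern: '^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}'
    multiline.negate: true
    multiline.match: after
```

With this, GREEDYDATA should capture the continuation lines as part of msg_message, but I'm not sure it is the right approach for splitting the extra timestamps out into their own field.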