I have a Fluentd client that forwards logs to Logstash, and the logs are ultimately viewed in Kibana. I've tried several configurations to set the event timestamp to the time entry from the log files, but with no luck so far.
Here is the parse section of my Fluentd config file:
<parse>
  # @type none
  @type multiline
  format_firstline /^\[([\w]+)\s*\]\s*(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/
  format1 /\[(?<loglevel>[\w]+)\s*\]\s*(?<logtime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s*\[(?<thread>[^\]]*)\]\s*(?<class>[^ ]*)\s+-\s*(?<message>.*)/
  time_key logtime
  time_type string
  time_format %Y-%m-%d %H:%M:%S
  keep_time_key true
</parse>
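For context, a log line that the format_firstline/format1 patterns above are meant to match would look like this (hypothetical example, not from my actual logs):

```
[INFO ] 2018-01-15 10:23:45 [main] com.example.MyService - Application started
```

Here loglevel captures INFO, logtime captures the date-time string, thread captures main, class captures com.example.MyService, and message captures the rest.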
Why is this configuration not working? The timestamp in Kibana is still fluent_time, not my logtime.
I have also tried adding time_key in the 'inject' section within 'match' as shown below, but this didn't work either. Reference: How to add timestamp & key for elasticsearch using fluent
<inject>
  time_key @logtime
  time_type string
  time_format %Y-%m-%d %H:%M:%S
</inject>
Apart from that, I've also tried record_transformer as given below, but that didn't work either. Reference: https://docs.fluentd.org/v0.12/articles/filter_record_transformer#renew_time_key-(optional,-string-type)
<filter pattern>
  @type record_transformer
  <record>
    renew_time_key ${record["logtime"]}
  </record>
</filter>
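Incidentally, per the v0.12 docs linked above, renew_time_key is a parameter of the filter itself, not a field inside <record>, and it names a record key whose value must be a Unix timestamp. A corrected sketch (with a hypothetical logtime_unix field holding a Unix time) would look like:

```
<filter pattern>
  @type record_transformer
  # renew_time_key sits at the filter level and names a record field;
  # the field's value must be a Unix timestamp, not a formatted string
  renew_time_key logtime_unix
</filter>
```

This means a string like "2018-01-15 10:23:45" in logtime would not work here without converting it to a Unix time first.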
I've also tried with and without the '@' prefix for logtime in each attempt, and I've also tried adding the time_key settings within the 'source' section, but that didn't work either.
Fluentd and Ruby versions: fluentd 1.0.2, ruby 2.4.2
At this point, I'm stumped. What am I missing? Any suggestions are appreciated!
Honestly, I'm not sure how this resolved the issue, but adding the inject tag below, alongside my time_key configuration in parse, seems to have done the trick. The timestamp in Kibana is now based on logtime.
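The exact block was not preserved in the post; based on the earlier attempt, it would have been an inject section inside the match block along these lines (the forward output and server details are assumptions, not from the original):

```
<match pattern>
  @type forward                # assumed output plugin (logs are forwarded to Logstash)
  <inject>
    time_key logtime
    time_type string
    time_format %Y-%m-%d %H:%M:%S
  </inject>
  <server>
    host logstash.example.com  # hypothetical host
    port 24224
  </server>
</match>
```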