I have a containerised app that writes two categories of information to STDOUT as single-line JSON. The first category is diagnostic logs (ERR/WARN etc.) and the second is content access logs (CONTENT_ID/USER_ID etc.). I intend to run a Filebeat sidecar to harvest this output into Elasticsearch. I want the diagnostic log entries in one index and the content access logs in another.
Can I run Filebeat with multiple inputs that point to the same file and configure `include_lines` on each input to pick up the different lines and send them to the correct indexes? Is there a better way to do this? Or should I not try to do this at all, and perhaps have the containerised app explicitly write the content access logs to a separate file?
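
Roughly what I have in mind (untested; the path, patterns, and index names are just placeholders, and I'm not sure whether pointing two inputs at the same file is even supported):

```yaml
filebeat.inputs:
  # Input 1: only the diagnostic log lines
  - type: log
    paths:
      - /var/log/app/stdout.log                 # placeholder path to the container's stdout file
    include_lines: ['"level":"(ERR|WARN)']      # placeholder pattern matching diagnostic lines
    index: "app-diagnostics-%{+yyyy.MM.dd}"

  # Input 2: only the content access log lines
  - type: log
    paths:
      - /var/log/app/stdout.log
    include_lines: ['"CONTENT_ID"']             # placeholder pattern matching access lines
    index: "app-content-access-%{+yyyy.MM.dd}"

output.elasticsearch:
  hosts: ["https://elasticsearch:9200"]
```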
You could do this with Logstash, but that is probably overkill and not necessary.
Can you use the `dissect` processor in Filebeat to break something out of the different log lines to differentiate between them? Use conditions and `add_labels` as needed to set a field with the intended index name. Then you can use that field in the Elasticsearch output's `index` setting to write to different indices (the docs show an example with `%{[fields.log_type]}`).
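
As a rough sketch of the single-input approach: since your lines are already JSON, I've used the `log` input's `json` decoding and conditional `add_fields` processors here rather than `dissect`; the path and the field names (`level`, `CONTENT_ID`) are assumptions you would adjust to your actual output.

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/stdout.log       # placeholder path to the container's stdout file
    json.keys_under_root: true        # decode each single-line JSON event into top-level fields
    json.add_error_key: true

processors:
  # Tag diagnostic lines (assumes they carry a "level" field such as ERR/WARN)
  - add_fields:
      when:
        has_fields: ["level"]
      target: fields
      fields:
        log_type: diagnostics
  # Tag content access lines (assumes they carry a CONTENT_ID field)
  - add_fields:
      when:
        has_fields: ["CONTENT_ID"]
      target: fields
      fields:
        log_type: content-access

output.elasticsearch:
  hosts: ["https://elasticsearch:9200"]
  # Route each event to an index named after the field set above
  index: "app-%{[fields.log_type]}-%{+yyyy.MM.dd}"

# Overriding the index also requires pointing Filebeat at a template
setup.template.name: "app"
setup.template.pattern: "app-*"
```

You would also want to handle lines that match neither condition, e.g. set a default `log_type` or add a `drop_event` processor, so the index format string always resolves.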