I have a local server running in a Docker container which is set to use fluentd as its log driver. I have a docker-compose file which runs fluentd, nginx, Elasticsearch and Kibana, each in its own container. fluentd takes the logs from my server and passes them to Elasticsearch, where they are displayed in Kibana.
My question is: how do I parse my logs in fluentd (or in Elasticsearch or Kibana, if it is not possible in fluentd) into new tags (fields), so I can sort them and navigate more easily?
This is the current log line displayed in Kibana. I want this log string to be 'broken up' into new fields. In this case:
2017/01/04 13:26:56.574909 UTC (Example deployment.web) [INFO] [GET] /api/device/ 200 10.562379ms
to
date: 2017/01/04
time: 13:26:56.574909 UTC
message: (Example deployment.web)
logType: [INFO]
other: [GET] /api/device/ 200 10.562379ms
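For reference, one way to produce exactly that split in fluentd is a parser filter with named capture groups. A minimal sketch, assuming the raw line arrives in the log field and a parser filter plugin is available (the field names and the regex below are assumptions derived from the sample line, not a tested configuration):

```
<filter docker.**>
  @type parser
  key_name log
  reserve_data true
  # named groups become fields: date, time, message, logType, other
  format /^(?<date>\d{4}\/\d{2}\/\d{2}) (?<time>\d{2}:\d{2}:\d{2}\.\d+ \S+) (?<message>\([^)]+\)) (?<logType>\[[A-Z]+\]) (?<other>.*)$/
</filter>
```

reserve_data true keeps the original fields (including log) alongside the newly extracted ones.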
My docker-compose.yml:
version: "2"
services:
  fluentd:
    image: fluent/fluentd:latest
    ports:
      - "24224:24224"
    volumes:
      - ./fluentd/etc:/fluentd/etc
    command: /fluentd/etc/start.sh
    networks:
      - lognet

  elasticsearch:
    image: elasticsearch
    ports:
      - "9200:9200"
      - "9300:9300"
    volumes:
      - /usr/share/elasticsearch/data:/usr/share/elasticsearch/data
    networks:
      - lognet

  kibana:
    image: kibana
    restart: always
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_URL=http://localhost:9200
    networks:
      - lognet

  nginx:
    image: nginx
    ports:
      - "8084:80"
    logging:
      driver: fluentd
    networks:
      - lognet

networks:
  lognet:
    driver: bridge
My fluent.conf file, with no parsing included, just a simple forward:
<source>
  type forward
</source>

<match *.*>
  type elasticsearch
  host elasticsearch
  logstash_format true
  flush_interval 10s
</match>
My attempt with a regexp, where I try to parse the logType out:
<source>
  @type forward
</source>

<match *.*>
  type stdout
</match>

<filter docker.**>
  @type parser
  format /(?<logType>\[([^\)]+)\])/
  key_name log
  reserve_data false
</filter>
I tried other configurations, but none of them resulted in my logs being parsed.
For anyone having a similar issue, I found a solution that works for me.
In the fluent.conf file, new filter blocks are added. For instance, if I want to create a new field called severity, the first step is to capture its value with a regex and record it in the new field.
An example value is [DEBU].
It is afterwards deleted from the original message.
The main part is the record definition: severity is the name of the new field, and record["log"] is the original log string in which the regex match is found and appended to the new field.
A second directive then modifies the log field, substituting the matched string with an empty string, i.e. deleting it.
NOTE: The order is important, since we first have to append to the new field and only then delete the string from the original log message (if needed).
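The two steps above can be sketched as a pair of filters. This is a minimal sketch assuming the record_transformer filter plugin with enable_ruby, and assuming the raw line lives in the log field; the exact regex is also an assumption, since the original snippets were not preserved:

```
# 1) capture e.g. [DEBU] or [INFO] into the new severity field
<filter docker.**>
  @type record_transformer
  enable_ruby
  <record>
    severity ${record["log"][/\[[A-Z]+\]/]}
  </record>
</filter>

# 2) afterwards delete the matched string from the original message
<filter docker.**>
  @type record_transformer
  enable_ruby
  <record>
    log ${record["log"].sub(/\[[A-Z]+\]\s*/, "")}
  </record>
</filter>
```

The filters run in the order they appear in the file, which is why the capture block must come before the block that strips the match from log.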