Issue with Filebeat 8.12.2 output to Logstash and then to Elasticsearch with a custom index name


I'm experiencing issues setting up a logging pipeline with Filebeat, Logstash, and Elasticsearch (version 8.12.2), following the configurations and examples provided in the elkninja/elastic-stack-docker-part-one GitHub repository. My goal is to configure Filebeat to collect logs, send them to Logstash, and from there, output to Elasticsearch with a custom index name. However, this setup is not functioning as expected.

Here is the relevant part of my filebeat.yml:

logging.level: debug
filebeat.inputs:
  - type: filestream
    id: filestream0
    paths:
      - ingest_data/s0.log

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

processors:
  - add_docker_metadata: ~

setup.kibana:
  host: ${KIBANA_HOSTS}
  username: ${ELASTIC_USER}
  password: ${ELASTIC_PASSWORD}

output.logstash:
  hosts: ${LOGSTASH_HOSTS}
  ssl.enabled: true
  ssl.certificate_authorities: "certs/ca/ca.crt"

And my logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
}

output {
  elasticsearch {
    hosts => "${ELASTIC_HOSTS}"
    user => "${ELASTIC_USER}"
    password => "${ELASTIC_PASSWORD}"
    cacert => "certs/ca/ca.crt"
    index => "trenara-%{+YYYY.MM.dd}"
  }
}
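A minimal way to check whether events reach Logstash at all would be to temporarily replace the elasticsearch output with the bundled stdout plugin (a debugging sketch only, not part of the intended setup):

output {
  stdout {
    codec => rubydebug
  }
}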

Additional context:

  1. I deployed the ELK stack following the https://github.com/elkninja/elastic-stack-docker-part-one repo. When Filebeat is configured to output directly to Elasticsearch, everything works as expected (the working output block is sketched just below this list).
  2. However, when I change Filebeat's output to Logstash, no data reaches Elasticsearch. After executing echo "log content" >> s0.log, no logs appear, and no indices or data streams are visible in Kibana (the exact check I run against Elasticsearch is included at the end of this post). Filebeat, Logstash, and Elasticsearch all log nothing that points to a problem.
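For reference, the direct-to-Elasticsearch output section that does work looks roughly like this (reconstructed from the repo's example, so treat the exact values as approximate):

output.elasticsearch:
  hosts: ${ELASTIC_HOSTS}
  username: ${ELASTIC_USER}
  password: ${ELASTIC_PASSWORD}
  ssl.enabled: true
  ssl.certificate_authorities: "certs/ca/ca.crt"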

Are there any obvious misconfigurations in my setup that could be causing this issue, and how can I get it working?
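For completeness, this is roughly how I check for the custom index from the host (assuming Elasticsearch is published on localhost:9200 as in the repo's compose file):

curl --cacert certs/ca/ca.crt \
  -u "${ELASTIC_USER}:${ELASTIC_PASSWORD}" \
  "https://localhost:9200/_cat/indices/trenara-*?v"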
