Hello everyone, good evening!
I had never touched the ELK stack before, and now I have a mission: read a CSV from an S3 bucket with the S3 input plugin and send the output to Elasticsearch.
I already configured it, and it worked properly for a file, in /etc/logstash/conf.d/example1.conf.
I'll just leave the input here.
The content of example1.conf is:
input {
  s3 {
    bucket => "my-bucket-s3"
    region => "sa-east-1"
    id => "myotherid"
    prefix => ""  # empty prefix, so this input picks up every object in the bucket
    role_arn => "myrole"
    type => "s3"
    sincedb_path => "/dev/null"
    codec => "plain"
    interval => 10
  }
}
Now I have created another file in the same directory, example2.conf.
The content of example2.conf is:
input {
  s3 {
    bucket => "my-bucket-s3"
    prefix => "securityhub-results/myarchive.csv"  # full key of the file I actually want
    region => "sa-east-1"
    type => "s3"
    id => "myid-SecHub"
    role_arn => "myrole"
    sincedb_path => "/dev/null"
    codec => "plain"
    interval => 30
  }
}
filter {
  csv {
    separator => ","
    columns => ["CheckID", "TotalChecks", "PassedChecks", "FailedChecks", "Score"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "index-aws_security_hub"
    user => "myuser"
    password => "mypass"
    document_id => "%{CheckID}"
    manage_template => false
    ilm_enabled => false
    codec => "json"
  }
  stdout {
    codec => rubydebug
  }
}
The index is being created properly, but it is not picking up the contents of myarchive.csv; instead it indexes the contents of the other CSV at the root of the bucket, which makes no sense to me.
Initially this bucket only has these 2 objects.
Before, myarchive.csv was also in the root, but I wasn't able to filter on it. I decided to create a folder to make the prefix easier to use, but it still didn't work.
I don't know if having both files inside logstash/conf.d could be causing some conflict (see the sketch below).
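From what I've read, Logstash concatenates every .conf file in conf.d into a single pipeline unless they are split in pipelines.yml, so events from the example1 input would also flow through the example2 filter and output. If that is what is happening, this is a minimal sketch of how I imagine tagging could separate them (the "sechub" tag is my own invention, just to mark events from the second input):
input {
  s3 {
    bucket => "my-bucket-s3"
    prefix => "securityhub-results/myarchive.csv"
    region => "sa-east-1"
    role_arn => "myrole"
    sincedb_path => "/dev/null"
    codec => "plain"
    interval => 30
    tags => ["sechub"]  # hypothetical tag to tell these events apart
  }
}
filter {
  if "sechub" in [tags] {
    csv {
      separator => ","
      columns => ["CheckID", "TotalChecks", "PassedChecks", "FailedChecks", "Score"]
    }
  }
}
output {
  if "sechub" in [tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "index-aws_security_hub"
      user => "myuser"
      password => "mypass"
      document_id => "%{CheckID}"
      manage_template => false
      ilm_enabled => false
    }
  }
}
I assume example1.conf's own filter/output would need a matching guard too, so its events don't mix in either.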
I already tried both of these:
prefix => "securityhub-results/myarchive.csv"
prefix => "/securityhub-results/myarchive.csv"
I read the Elastic docs: https://www.elastic.co/guide/en/logstash/7.17/plugins-inputs-s3.html#plugins-inputs-s3-prefix
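If I'm reading that page right, prefix is compared against the raw object key, and S3 keys never start with "/", so the version with the leading slash should match nothing at all. A folder-style prefix would then look like this (just my reading of the docs, not something I've confirmed works):
input {
  s3 {
    bucket => "my-bucket-s3"
    region => "sa-east-1"
    role_arn => "myrole"
    # raw key prefix: no leading slash; the trailing "/" limits it to that "folder"
    prefix => "securityhub-results/"
    sincedb_path => "/dev/null"
    codec => "plain"
  }
}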
I also did other research on the internet, but the situations described there were different.
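Alternatively, would splitting the two files into separate pipelines be the cleaner fix? Something like this in pipelines.yml (the pipeline ids are my own invention):
- pipeline.id: example1
  path.config: "/etc/logstash/conf.d/example1.conf"
- pipeline.id: sechub
  path.config: "/etc/logstash/conf.d/example2.conf"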