I have created two very similar pipelines in pipelines.yml:
- pipeline.id: ERP_PROD
  pipeline.workers: 1
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/ERP_PROD.conf"
- pipeline.id: ERP_DEV
  pipeline.workers: 1
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/ERP_DEV.conf"
The .conf files differ only in two parameters, input.path and filter.translate.dictionary_path. One of them looks like this:
input {
  file {
    codec => multiline {
      pattern => "^{%{DATESTAMP_EVENTLOG}"
      what => "previous"
      negate => true
    }
    path => "/var/data/1CLog/ERP_PROD/1Cv8Log/*.lgp"
    ignore_older => "432000"
    sincedb_path => "/etc/logstash/file_input_tracking/ERP_PROD.sincedb"
    start_position => "beginning"
    stat_interval => 120
    type => "ERP_PROD"
  }
}
filter {
  mutate {
    remove_field => [ "host", "@version", "@timestamp" ]
    rename => [ "type", "Database" ]
  }
  grok {
    match => [ "message", '{%{DATESTAMP_EVENTLOG:Date},%{DATA:StatusTransaction},{%{DATA:Transaction},%{DATA:NumberTransaction}},%{INT:UserId},%{INT:ComputerId},%{INT:NameApplicationId},%{INT:Connection},%{INT:EventId},%{DATA:Importance},\"%{DATA:Comment}\",%{INT:MetadataId},{\"%{WORD:ArrayDataType}\",\"%{DATA:Data1}\"},\"%{DATA:RepresentationData}\",%{INT:WorkServerId},%{INT:MainIpPort},%{INT:SecondIpPort},%{INT:Session},%{INT:MoreMetadata}' ]
  }
  translate {
    source => "[UserId]"
    target => "[User]"
    dictionary_path => "/etc/logstash/conf.d/ERP_PROD_.yml"
    refresh_interval => 104
    fallback => "Nothing to match!"
  }
}
output {
  opensearch {
    hosts => ["https://localhost:9200"]
    user => "logstash"
    password => "**********"
    index => "log-%{+YYYY.MM.dd}"
  }
}
I'm afraid the filters might be applied to events from both pipelines at once, because I never check the event's type anywhere.
Can you tell me whether I need to add a type check, or are the two pipelines independent and unaware of each other?
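In case it matters, the kind of check I have in mind is a conditional like the sketch below. It is not in my current config, and I realize my mutate filter renames type to Database, so the field name might need adjusting:

```
filter {
  # Only apply the PROD dictionary to events tagged by the PROD input
  if [type] == "ERP_PROD" {
    translate {
      source => "[UserId]"
      target => "[User]"
      dictionary_path => "/etc/logstash/conf.d/ERP_PROD_.yml"
    }
  }
}
```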