Heap space error while executing logstash with a large dictionary (translate filter)


I get the following error while running Logstash with a large (353 MB) dictionary in the translate filter:

java.lang.OutOfMemoryError: Java heap space

I use the dictionary to do a lookup on my input data.

I tried to allow the JVM to use more memory (with java -Xmx2048m), but I suppose I'm doing it wrong because it has no effect.

I tested my config file with a smaller dictionary and it worked fine. Any help, please? How do I give Logstash enough RAM so it doesn't die?

My config file looks like this:

input {
  file {
    type => "MERGED DATA"
    path => "C:\logstash-1.4.1\bin\..."
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => [ "message", "..." ]
  }

  if (...) {
    translate {
      dictionary_path => "C:\logstash-1.4.1\bin\DICTIONARY.yaml"
      field => "Contact_ID"
      destination => "DATA"
      fallback => "no match"
      refresh_interval => 60
    }

    grok {
      match => [ "DATA", "..." ]
    }

    mutate { remove_field => ... }
  }

  else if ...

  else if ...

  mutate { ... }
}

output {
  if [rabbit] == "INFO" {
    elasticsearch {
      host => "localhost"
    }
    stdout {}
  }
}

There are 3 answers

morallo (accepted answer)

To increase the heap size, set the LS_HEAP_SIZE environment variable before launching Logstash.

LS_HEAP_SIZE=2048m
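
For example, on Linux/macOS the Logstash 1.x launcher script reads this variable from the environment; the config file name below is a placeholder, and the Windows variant assumes the .bat launcher honors LS_HEAP_SIZE the same way the shell script does:

# Linux / macOS
export LS_HEAP_SIZE=2048m
bin/logstash -f your_config.conf

# Windows (assumption: logstash.bat reads LS_HEAP_SIZE like bin/logstash does)
set LS_HEAP_SIZE=2048m
logstash.bat -f C:\logstash-1.4.1\bin\your_config.conf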

stouch

I had a very large conf file, and this solution worked in my case: https://discuss.elastic.co/t/changing-logstash-heap-size/69542/4

LS_JAVA_OPTS="-Xmx2g -Xms1g" bin/logstash -e ""
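
In newer Logstash versions (5.x and later) the heap can also be set persistently in config/jvm.options rather than through the environment; a minimal sketch, assuming the default file layout and adjusting the values to your available RAM:

# config/jvm.options (excerpt)
-Xms1g
-Xmx2g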
onlyme

I was having a similar issue. Mine looked like this:

logstash <Sequel::DatabaseError: Java::JavaLang::OutOfMemoryError: Java heap space 

To solve this, I had to add some settings to my Logstash config file. I added the settings below to the jdbc input section:

jdbc_paging_enabled => true
jdbc_page_size => 200000
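
For context, here is a minimal sketch of where those options sit inside a jdbc input; the connection string, credentials, driver, and SQL statement are placeholders, not from the original answer:

input {
  jdbc {
    # hypothetical connection details -- replace with your own
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM my_table"
    # fetch results in pages so the whole result set is not held in heap at once
    jdbc_paging_enabled => true
    jdbc_page_size => 200000
  }
}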

You can have a look at the related discussion thread for more details.