logstash - JDBC input plugin doesn't work with prepared statements enabled on large data volumes


I have a task to import data from an Oracle database into Elasticsearch. Naturally, I'm using the JDBC input plugin for it.

For performance reasons I need to enable prepared statements for the plugin (this reduces read operations on the DB and ensures proper index usage).

My configuration looks as follows:

input {
  jdbc {
    jdbc_fetch_size => 999
    schedule => "* * * * *"
    use_prepared_statements => true
    prepared_statement_name => "foo"
    # :sql_last_value is bound to the single "?" placeholder in the statement
    prepared_statement_bind_values => [":sql_last_value"]
    statement => " SELECT
      ......
      FROM table_name tbl
      JOIN ......
      JOIN ...
      LEFT JOIN ......
      WHERE tbl.id > ?
    "
    # remember the id of the last row processed; it becomes :sql_last_value on the next run
    use_column_value => true
    tracking_column => "id"
  }
}

But here I hit a problem. After activating it:

  • no events are transmitted by Logstash
  • no new documents are created in the ELK indexes
  • CPU usage and memory consumption hit 100%
  • after some time Logstash crashes with the following error: java.lang.BootstrapMethodError: call site initialization exception

A few important remarks:

  • it doesn't matter whether I set jdbc_fetch_size to a smaller or larger value (it only affects how fast memory is consumed)
  • on a smaller amount of data everything works fine: documents are created in the ELK indexes, but with a slight delay that doesn't occur when prepared statements are disabled
  • when prepared statements are disabled everything works fine, even with large data and without any delay (see the equivalent configuration sketched below)
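
For reference, the disabled-prepared-statements case corresponds to a configuration along these lines (a minimal sketch; the column list and joins are elided exactly as above, and the plugin substitutes :sql_last_value into the statement itself instead of binding it to a ? placeholder):

input {
  jdbc {
    jdbc_fetch_size => 999
    schedule => "* * * * *"
    # no prepared statement: :sql_last_value is interpolated by the plugin
    statement => " SELECT
      ......
      FROM table_name tbl
      JOIN ......
      LEFT JOIN ......
      WHERE tbl.id > :sql_last_value
    "
    use_column_value => true
    tracking_column => "id"
  }
}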

I've tested it on two versions, 7.8.0 and 7.9.0; the result is the same on both: it doesn't work.

Am I dealing here with a bug?


There are 2 answers

e1m7bo

What value are you using for jdbc_driver_class?

Java::oracle.jdbc.driver.OracleDriver (driver itself is oracle:oracle:11.2.0.4-jdk6-sp1)

What happens if you execute the query directly against the DB?

It works (it takes long because of the number of entries returned), but it doesn't crash, and the statements are continuously visible (checked in the Oracle console).

How many records is that SQL statement returning from the DB?

193 million records

Have you tried to return only a batch of rows from the DB?

Yes, with a limited number of records it works, but that is not an option for me: I need to fetch all the records, not only part of them (using rownum doesn't guarantee that).

The problem with prepared statements here is the fact that it crashes; I'm not complaining about the long-running import (I'm prepared for that, given the large number of records).
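
One way to make batching deterministic despite the rownum caveat (a sketch only, assuming tbl.id is unique and monotonically increasing, with the same elided column list as in the question) is to order the subquery by the tracking column before applying the ROWNUM limit:

SELECT *
FROM (
    SELECT ......                 -- same columns and joins as in the question
    FROM table_name tbl
    JOIN ......
    LEFT JOIN ......
    WHERE tbl.id > ?              -- bound to :sql_last_value
    ORDER BY tbl.id
)
WHERE ROWNUM <= 100000

Since tracking_column => "id" would then advance sql_last_value to the highest id returned by each run, every scheduled execution picks up where the previous batch ended, so all rows are eventually fetched even though each run is limited to 100000 rows.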

Luis Goncalves

What value are you using for jdbc_driver_class?

What happens if you execute the query directly against the DB?

How many records is that SQL statement returning from the DB? Have you tried to return only a batch of rows from the DB? For that, you can change your SQL statement to:

select * from (
 SELECT
      ......
      FROM table_name tbl
      JOIN ......
      JOIN ...
      LEFT JOIN ......
      WHERE tbl.id > ?
) where rownum <= 100000
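
Note that rownum <= 100000 on its own returns an arbitrary set of rows; for repeated scheduled runs to walk through all records you would also want an ORDER BY on the tracking column inside the subquery (as sketched under the first answer above), so that sql_last_value advances deterministically between batches.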