I have a task to import data from an Oracle database into Elasticsearch. Naturally, I'm using the JDBC input plugin for it.
For performance reasons I need to enable prepared statements in the plugin (they reduce read operations on the DB and ensure proper index usage).
My configuration looks as follows:
input {
  jdbc {
    jdbc_fetch_size => 999
    schedule => "* * * * *"
    use_prepared_statements => true
    prepared_statement_name => "foo"
    prepared_statement_bind_values => [":sql_last_value"]
    statement => " SELECT
                     ......
                   FROM table_name tbl
                   JOIN ......
                   JOIN ...
                   LEFT JOIN ......
                   WHERE tbl.id > ?
                 "
    use_column_value => true
    tracking_column => "id"
  }
}
But here I hit a problem. After activating it:
- no events are emitted in Logstash
- no new documents are created in ELK
- CPU usage and memory consumption hit 100%
- after some time Logstash crashes with the following error: java.lang.BootstrapMethodError: call site initialization exception
A few important remarks:
- it doesn't matter whether I set jdbc_fetch_size to a smaller or larger value (it only affects how fast memory is consumed)
- on a smaller amount of data everything works fine: documents are created in the ELK indexes, but with a slight delay that doesn't occur when prepared statements are disabled
- when prepared statements are disabled, everything works fine even with large data and without any delay
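For comparison, this is a minimal sketch of the configuration that works for me with prepared statements disabled; the `jdbc_paging_enabled` / `jdbc_page_size` settings are an assumption on my side (standard options of the JDBC input plugin), and the page size is a placeholder to tune:

```
input {
  jdbc {
    # Prepared statements off; let the plugin paginate the query instead.
    jdbc_paging_enabled => true
    jdbc_page_size => 100000   # placeholder, tune for your data volume
    jdbc_fetch_size => 999
    schedule => "* * * * *"
    statement => "SELECT ... FROM table_name tbl WHERE tbl.id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
  }
}
```

With paging enabled the plugin wraps the statement in a subquery per page, which on Oracle is emitted via ROWNUM-style limits by the underlying Sequel library.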
I've tested it on two versions, 7.8.0 and 7.9.0; the result is the same on both: not working.
Am I dealing here with a bug?
The driver is Java::oracle.jdbc.driver.OracleDriver (the driver artifact itself is oracle:oracle:11.2.0.4-jdk6-sp1).
It works (it takes long because of the number of entries returned), but it didn't crash, and the statements were constantly displayed (checked in the Oracle console).
193 million records.
Yes, with a limited number of records it works, but that is not an option for me: I need to fetch all records, not only part of them (using rownum doesn't ensure that). The problem with prepared statements is the fact that it crashes; I'm not complaining about the long-running import (I'm prepared for that, since the large number of records makes it unavoidable).