I am using Apache Spark together with Spring's JdbcTemplate to commit data to a Microsoft SQL Server database. Inside Spark's mapPartitions, each partition sends its rows to the database in batches via Spring's batchUpdate, wrapped in a programmatic transaction. The transaction completes, but the data never gets written to the database.
Does anybody know what's wrong here?
.mapPartitions(....) {
    call(..) {
        // start a new transaction for this partition's batch
        DefaultTransactionDefinition paramTransactionDefinition =
                new DefaultTransactionDefinition();
        TransactionStatus status =
                transactionManager.getTransaction(paramTransactionDefinition);
        .....
        ..... code that builds finalSql and jdbcArgs .....
        .....
        // execute the batch inside the transaction, then commit
        _jdbcTemplate.batchUpdate(finalSql, jdbcArgs);
        transactionManager.commit(status);
    } // call ends
} // mapPartitions ends
After the commit, status.isCompleted() returns true.
This code works perfectly fine when I run it in Spark's local[*] mode, but when I run the same code in Spark's clustered/distributed mode with 3 workers, no data is written to the database.