Universal Connection Pool Memory Management


I have been trying to upgrade my ojdbc code from ojdbc14-10.2.0.1.0 to ojdbc6-11.1.0.7.0. We had been using OracleConnectionCacheImpl for datasource connections and have now moved to the Universal Connection Pool (UCP) with OracleDataSource as the connection factory. Here is how we currently have it configured in Spring:

<bean id="myDatasource" class="oracle.ucp.jdbc.PoolDataSourceFactory" factory-method="getPoolDataSource">
        <property name="URL" value="@JDBC_URL@"/>
        <property name="user" value="@JDBC_USERNAME@"/>
        <property name="password" value="@JDBC_PASSWORD@"/>
        <property name="connectionFactoryClassName" value="oracle.jdbc.pool.OracleDataSource"/>
        <property name="connectionPoolName" value="MFR_RTE_POOL"/>
        <property name="minPoolSize" value="5"/>
        <property name="maxPoolSize" value="100"/>
        <property name="validateConnectionOnBorrow" value="true" />
        <property name="connectionWaitTimeout" value="30"/>
        <property name="connectionHarvestMaxCount" value="25"/>
        <property name="connectionHarvestTriggerCount" value="5"/>
        <property name="maxStatements" value="100"/>
 </bean>
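For anyone wiring this up outside Spring, the same pool can be built programmatically. This is a sketch only: the URL and credentials are placeholders, and it assumes the UCP and ojdbc jars are on the classpath.

```java
import java.sql.SQLException;

import oracle.ucp.jdbc.PoolDataSource;
import oracle.ucp.jdbc.PoolDataSourceFactory;

public class UcpConfig {

    // Builds a pool equivalent to the Spring bean above.
    public static PoolDataSource createPool() throws SQLException {
        PoolDataSource pds = PoolDataSourceFactory.getPoolDataSource();
        pds.setConnectionFactoryClassName("oracle.jdbc.pool.OracleDataSource");
        pds.setURL("jdbc:oracle:thin:@//dbhost:1521/SERVICE"); // placeholder
        pds.setUser("user");                                   // placeholder
        pds.setPassword("password");                           // placeholder
        pds.setConnectionPoolName("MFR_RTE_POOL");
        pds.setMinPoolSize(5);
        pds.setMaxPoolSize(100);
        pds.setValidateConnectionOnBorrow(true);
        pds.setConnectionWaitTimeout(30);
        pds.setConnectionHarvestMaxCount(25);
        pds.setConnectionHarvestTriggerCount(5);
        pds.setMaxStatements(100);
        return pds;
    }
}
```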

It took a while to get it running without closed-connection errors, but now I have a memory-management issue. I ran JConsole against an application that uses a thread pool: ThreadPoolExecutor tasks create fee requests from the data in a file, and a single file can contain hundreds of thousands of fee requests. My problem is that the old generation of the heap fills up and never releases objects. In the performance test I set up, the old generation fills in about 20-25 minutes and is never freed. The application eventually hits the "GC overhead limit exceeded" error and grinds to a halt.
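For illustration, a leak-safe version of such a worker task might look like the sketch below. The class, table, and column names are hypothetical, not the actual application code; with hundreds of thousands of tasks, even a small per-task leak of statements or connections will fill the old generation, so each task should return everything it borrows. (try-with-resources needs Java 7; on Java 6 the same shape applies with finally blocks.)

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hypothetical worker: processes one fee request per task.
public class FeeRequestTask implements Runnable {

    private final DataSource dataSource; // the UCP pool, injected by Spring
    private final long feeRequestId;     // hypothetical request key

    public FeeRequestTask(DataSource dataSource, long feeRequestId) {
        this.dataSource = dataSource;
        this.feeRequestId = feeRequestId;
    }

    @Override
    public void run() {
        // try-with-resources guarantees the statement is closed and the
        // connection is returned to the pool even if the query fails.
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(
                     "SELECT fee FROM fee_requests WHERE id = ?")) {
            ps.setLong(1, feeRequestId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // process the fee row ...
                }
            }
        } catch (SQLException e) {
            // log and decide whether to retry or fail the request
        }
    }
}
```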

When I run the same test using the old OracleConnectionCacheImpl class, it runs with no problem. Granted, the thread pool and all accompanying code were written against an older Spring (1.2.6) and the old ojdbc driver, but is there really that big a difference between how OracleConnectionCacheImpl works and Universal Connection Pooling? Am I looking at rewriting my domain model to accommodate the latest version of Oracle's JDBC driver? I tried a plain OracleDataSource connection and it failed miserably with NullPointerExceptions after working on several files concurrently. I then went to UCP (at the suggestion of another post in this forum), which works fine in all but this one application. At this point I'm trying to figure out whether I can further optimize the Spring config bean for my datasource or whether I need to start thinking about upgrading the code base. As stated previously, this code runs very well against the old ojdbc class, but I have had issues at every step trying to implement UCP. I'm starting to wonder if it's even worth upgrading.

There is 1 answer.

Answer by JavaDev03:

This problem had bugged me for months, I hope what I came up with helps someone else out there:

I did finally figure out a solution to my issue. Instead of using OracleDataSource as the connection factory:

<property name="connectionFactoryClassName" value="oracle.jdbc.pool.OracleDataSource"/>

I would suggest trying OracleConnectionPoolDataSource:

 <property name="connectionFactoryClassName" value="oracle.jdbc.pool.OracleConnectionPoolDataSource"/>

OracleConnectionPoolDataSource extends OracleDataSource and seems to fare better in applications where multiple resources need to open multiple connections. In my case the application processes multiple batch files: the same SQL runs over and over, but the application needs a new connection for each file. Under these circumstances OracleDataSource often produced connection errors of some sort (e.g. SQLException: closed connection, NullPointerException: connection closed, with or without UCP) and led to garbage-collection problems: the old generation would fill up and GC would ultimately fail no matter how much memory I added to the JVM.
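For illustration, the connection-per-file pattern described above might look like the sketch below. The class and method names are hypothetical; the pool is whatever javax.sql.DataSource Spring exposes over the UCP bean, and the point is simply to borrow a fresh connection per file and return it promptly.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;
import javax.sql.DataSource;

// Hypothetical driver loop: one pooled connection per batch file.
public class BatchFileProcessor {

    private final DataSource pool; // the UCP pool behind the scenes

    public BatchFileProcessor(DataSource pool) {
        this.pool = pool;
    }

    public void processAll(List<String> filePaths) {
        for (String path : filePaths) {
            // Borrow a connection for this file only; closing it in
            // try-with-resources returns it to the pool immediately.
            try (Connection con = pool.getConnection()) {
                processFile(con, path);
            } catch (SQLException e) {
                // log and move on to the next file
            }
        }
    }

    private void processFile(Connection con, String path) throws SQLException {
        // run the per-file fee SQL here ...
    }
}
```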

I found OracleDataSource to work well in applications that do not do a lot of batch processing. For instance, another application I have is also a file processor, but it only works on one file at a time, and OracleDataSource works great there. It also seems to work well for web applications: we installed OracleDataSource in a web app nine months ago and it has had no issues.

I'm sure there are ways to make OracleDataSource work as well as OracleConnectionPoolDataSource, but this is what worked for me.