We have the same stored procedure in Oracle 10g and 19c, with the same data and setup. The procedure does a lot of data fetching and manipulation. When we execute it with the same data set (say, 10,000 records), it completes quickly in 10g, but in 19c it takes much longer and eventually throws a "maximum open cursors exceeded" (ORA-01000) error. We did a basic comparison of the OPEN_CURSORS and SESSION_CACHED_CURSORS settings in both databases, and they are the same.
What other server-side parameters or settings can we compare to resolve this issue?
I can't tell you what is causing your maximum-open-cursors problem, but I can tell you how to find the cause by identifying the related sessions and SQL statements using GV$OPEN_CURSOR.

If you're lucky, you can find the problem immediately with a simple query that counts the number of open cursors per session. There are a lot of columns in the below query; use an IDE so you can easily browse all the data. In my experience, just glancing at columns like USER_NAME and SQL_TEXT is enough to identify the culprit.
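A minimal version of such a query might look like this (a sketch: it assumes you have SELECT privileges on GV$OPEN_CURSOR, and uses an analytic count to rank sessions by how many cursors they hold open):

```sql
-- Count open cursors per session; sessions at the top of the result
-- are the likely culprits. Browse the remaining columns, especially
-- USER_NAME and SQL_TEXT, to identify the offending code.
select count(*) over (partition by inst_id, sid) cursors_per_session,
       open_cursor.*
from gv$open_cursor open_cursor
order by cursors_per_session desc, inst_id, sid;
```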
Keep in mind that there will be many strange queries in that view that may make the counts larger than you anticipated. With all the recursive and cached queries, it's not unusual to have a "boring" session use 50 cursors. You're looking for sessions with hundreds of open cursors. (Unless someone foolishly lowered the parameter value below the default.)
Unfortunately, GV$OPEN_CURSOR does not contain historical data, and these problems can start and stop quickly if an exception inside a tight loop rapidly opens lots of cursors. The below PL/SQL block runs until it finds a session with a large number of open cursors, stores the data, and exits. This block is expensive and will use up an entire session while it polls for the right moment, so only use it while tracking down the problem.

If you want to test the monitoring, you can use the below PL/SQL block to open a large number of cursors.