We currently have performance issues, as I'm sure most data-driven systems do. They basically fall into two categories that I think a single solution could address:
1. Stored procedures sometimes get automatically recompiled with a bad plan, which makes them run very slowly. This happens when the parameter values in play at recompile time are not representative of typical usage, so the procedure keeps running slowly until it is recompiled again and picks up a better plan (see the recompile sketch after this list).
2. Because of the dynamic nature of how SQL Server works, as a table grows and different parts of the system query it in new ways, the indexes need to change, or a code change is required to remove sub-optimal coding such as ORs or functions in WHERE conditions (see the SARGability example below).
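For the first category, when this happens we end up doing something like the following to knock the bad plan out of the cache. This is a minimal sketch, assuming a hypothetical procedure dbo.usp_GetOrders; sp_recompile and DBCC FREEPROCCACHE are the standard tools here:

    -- Mark the procedure for recompilation; the next execution compiles a fresh plan.
    EXEC sp_recompile N'dbo.usp_GetOrders';

    -- Alternatively, evict only that procedure's cached plan:
    DECLARE @plan_handle varbinary(64);

    SELECT @plan_handle = ps.plan_handle
    FROM sys.dm_exec_procedure_stats AS ps
    WHERE ps.object_id = OBJECT_ID(N'dbo.usp_GetOrders')
      AND ps.database_id = DB_ID();

    IF @plan_handle IS NOT NULL
        DBCC FREEPROCCACHE (@plan_handle);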
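And for the second category, a small illustration of the kind of sub-optimal coding meant above, against a hypothetical dbo.Orders table: wrapping the column in a function blocks an index seek, while a plain range predicate allows one.

    -- Non-SARGable: the function on the column prevents an index seek.
    SELECT OrderID FROM dbo.Orders WHERE YEAR(OrderDate) = 2016;

    -- SARGable rewrite: a range on the bare column can use an index on OrderDate.
    SELECT OrderID
    FROM dbo.Orders
    WHERE OrderDate >= '20160101'
      AND OrderDate <  '20170101';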
Are there any system tables that track the cost of stored procedures?
We need to create a script that runs every hour for a full week (7 days) and stores data about each stored procedure (execution time, cost, and so on). From that data we can identify the worst-performing, longest-running stored procedures and generate a list, then tune those procedures to improve performance.
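Something along these lines is what we have in mind: a minimal snapshot sketch, assuming a hypothetical logging table dbo.ProcStatsLog and relying on the sys.dm_exec_procedure_stats DMV, which exposes cumulative timings for cached procedure plans. The hourly scheduling itself would live in a SQL Agent job; note that the counters reset whenever a plan leaves the cache, so deltas between snapshots are what matter.

    -- Create the hypothetical logging table on first run.
    IF OBJECT_ID(N'dbo.ProcStatsLog') IS NULL
        CREATE TABLE dbo.ProcStatsLog
        (
            capture_time        datetime2 NOT NULL DEFAULT SYSDATETIME(),
            database_name       sysname   NOT NULL,
            procedure_name      sysname   NOT NULL,
            execution_count     bigint    NOT NULL,
            total_elapsed_ms    bigint    NOT NULL,
            total_worker_ms     bigint    NOT NULL,
            total_logical_reads bigint    NOT NULL
        );

    -- Snapshot cumulative stats for procedures cached in the current database.
    INSERT INTO dbo.ProcStatsLog
        (database_name, procedure_name, execution_count,
         total_elapsed_ms, total_worker_ms, total_logical_reads)
    SELECT
        DB_NAME(ps.database_id),
        OBJECT_NAME(ps.object_id, ps.database_id),
        ps.execution_count,
        ps.total_elapsed_time / 1000,   -- microseconds -> milliseconds
        ps.total_worker_time / 1000,
        ps.total_logical_reads
    FROM sys.dm_exec_procedure_stats AS ps
    WHERE ps.database_id = DB_ID();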
Start here: sp_BlitzFirst from Brent Ozar Unlimited, or the BrentOzarULTD/SQL-Server-First-Responder-Kit repository on GitHub.
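A minimal usage sketch; @Seconds and the @Output* parameters come from the First Responder Kit's documentation, and the DBAtools database name is a placeholder. Writing results to tables from a scheduled job gives exactly the kind of hourly history you describe:

    EXEC dbo.sp_BlitzFirst
        @Seconds = 30,                        -- sample interval
        @OutputDatabaseName = N'DBAtools',    -- hypothetical logging database
        @OutputSchemaName = N'dbo',
        @OutputTableName = N'BlitzFirstResults';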
Quoting Kendra Little from her page on performance monitoring: