I have a stored procedure that searches for "bad" data in our application database (SQL Server 2016 Enterprise, ~500 GB of data). "Bad" in a business/process sense.
It normally takes 10-30 seconds to execute. After a few days, it suddenly takes 5+ minutes.
My current workaround is to update the statistics, after which the execution time is low again:
EXECUTE dbo.IndexOptimize
    @Databases = 'myDB',
    @FragmentationLow = NULL,    -- no rebuild/reorganize actions at all,
    @FragmentationMedium = NULL, -- regardless of fragmentation level
    @FragmentationHigh = NULL,
    @FragmentationLevel1 = 5,
    @FragmentationLevel2 = 30,
    @UpdateStatistics = 'ALL',         -- update index and column statistics
    @OnlyModifiedStatistics = 'Y';     -- but only those modified since the last update
(This is IndexOptimize from Ola Hallengren's maintenance solution.)
Obviously, regenerating the stats leads to a new and better query plan. Is there a targeted way to identify the misleading statistics object that introduces the slow query plan? Or how can I find the cause of this in general? The tables, indexes, statistics, and the stored procedure itself are complex, so I cannot guess which one it is. Could I perhaps programmatically compare the statistics "before" and "after" the update?
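What I had in mind for the before/after comparison is roughly this: snapshot sys.dm_db_stats_properties into a scratch table, run the stats update, snapshot again, and diff the two sets. A minimal sketch; the table dbo.StatsSnapshot and the label values are names I made up:

DECLARE @Label sysname = N'before';  -- change to N'after' on the second run

IF OBJECT_ID(N'dbo.StatsSnapshot') IS NULL
    CREATE TABLE dbo.StatsSnapshot
    (
        SnapshotLabel       sysname      NOT NULL,
        SchemaName          sysname      NOT NULL,
        TableName           sysname      NOT NULL,
        StatsName           sysname      NOT NULL,
        HasFilter           bit          NOT NULL,
        LastUpdated         datetime2(3) NULL,
        [Rows]              bigint       NULL,
        RowsSampled         bigint       NULL,
        UnfilteredRows      bigint       NULL,
        ModificationCounter bigint       NULL
    );

INSERT dbo.StatsSnapshot
SELECT  @Label,
        OBJECT_SCHEMA_NAME(s.object_id),
        OBJECT_NAME(s.object_id),
        s.name,
        s.has_filter,
        sp.last_updated,
        sp.rows,
        sp.rows_sampled,
        sp.unfiltered_rows,
        sp.modification_counter
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1;

Joining the 'before' and 'after' rows on schema/table/stats name would then show which statistics were actually rebuilt (last_updated changed) and how far their modification_counter had drifted, which should narrow down the candidates. Is this a reasonable approach, or is there something more direct?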
We have many filtered indexes which are usually very tiny, so the 20% modification threshold for automatic statistics updates may be crossed by them frequently. A query to check this is sketched below.
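For those, I could at least list which filtered statistics have crossed the threshold since their last update. The arithmetic below assumes the legacy 500 + 20% rule; under compatibility level 130 the dynamic threshold applies instead, so this is only an approximation:

-- Filtered statistics whose modification counter exceeds 500 + 20 % of rows
-- (legacy auto-update threshold; an assumption about which rule applies here).
SELECT  OBJECT_SCHEMA_NAME(s.object_id) AS SchemaName,
        OBJECT_NAME(s.object_id)        AS TableName,
        s.name                          AS StatsName,
        s.filter_definition,
        sp.rows,
        sp.modification_counter,
        sp.last_updated
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.has_filter = 1
  AND   sp.modification_counter > 500 + 0.20 * sp.rows
ORDER BY sp.modification_counter DESC;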
The indexes are optimized weekly.
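Since this is SQL Server 2016, another idea I am considering is enabling Query Store on the database and pulling the fast and the slow plan side by side the next time the regression happens. dbo.MySearchBadDataProc is a placeholder for the real procedure name:

-- Prerequisite (once): ALTER DATABASE myDB SET QUERY_STORE = ON;
-- One row per plan and runtime-stats interval; avg_duration is in microseconds.
SELECT  q.query_id,
        p.plan_id,
        CAST(rs.avg_duration / 1000.0 AS decimal(18, 1)) AS avg_duration_ms,
        rs.count_executions,
        p.last_execution_time,
        TRY_CONVERT(xml, p.query_plan) AS query_plan
FROM    sys.query_store_query AS q
JOIN    sys.query_store_plan AS p
        ON p.query_id = q.query_id
JOIN    sys.query_store_runtime_stats AS rs
        ON rs.plan_id = p.plan_id
WHERE   q.object_id = OBJECT_ID(N'dbo.MySearchBadDataProc')
ORDER BY rs.avg_duration DESC;

Diffing the two plan XMLs should show where the plan shape flipped (e.g. a seek plus nested loops turning into a scan), which would point at the table and index whose statistics are misleading the optimizer.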