
I have a stored proc which searches for "bad" data in our application database (SQL Server 2016 Enterprise, ~500 GB data). "Bad" in a business/process sense.

It normally takes 10–30 s to execute. After some days, it suddenly takes 5+ minutes to execute.

My current workaround is to update all statistics, after which the execution time is low again:

EXECUTE dbo.IndexOptimize @Databases = 'myDB',
 @FragmentationLow = NULL,
 @FragmentationMedium = NULL,
 @FragmentationHigh = NULL,
 @FragmentationLevel1 = 5,
 @FragmentationLevel2 = 30,
 @UpdateStatistics = 'ALL',
 @OnlyModifiedStatistics = 'Y'

Index Optimize by Ola Hallengren

Obviously, regenerating the stats leads to a new and better query plan. Is there a targeted way to identify the misleading statistic that introduces the slow query plan? Or how can I find the cause of this? The tables, indexes, stats, and this stored proc are complex, so I cannot guess it. Can I maybe programmatically compare the statistics "before" and "after" the update?
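One way to attempt such a before/after comparison is to snapshot the statistics metadata from `sys.dm_db_stats_properties` into a table before the maintenance run, then diff it afterwards. This is only a sketch; the snapshot table name `dbo.StatsSnapshot` is an example, not anything from the original post:

```sql
-- Before the maintenance run: snapshot per-statistic metadata.
SELECT
    OBJECT_NAME(s.object_id)  AS table_name,
    s.name                    AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.rows_sampled,
    sp.modification_counter
INTO dbo.StatsSnapshot
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1;

-- After the run: list statistics that were actually rebuilt,
-- ordered by how many modifications had accumulated.
SELECT
    snap.table_name,
    snap.stats_name,
    snap.modification_counter AS mods_before_update,
    snap.rows_sampled         AS sampled_rows_before,
    sp.rows_sampled           AS sampled_rows_after
FROM dbo.StatsSnapshot AS snap
JOIN sys.stats AS s
    ON s.name = snap.stats_name
   AND OBJECT_NAME(s.object_id) = snap.table_name
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE sp.last_updated > snap.last_updated
ORDER BY snap.modification_counter DESC;
```

Statistics near the top of that list (many accumulated modifications, or a low sample rate before the update) are the most likely candidates for having misled the optimizer. For per-bucket detail you can follow up with `DBCC SHOW_STATISTICS` on the suspects.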

We have many filtered indexes which are usually very tiny, so the 20% auto-update threshold may be crossed by them frequently.
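To check whether the tiny filtered statistics are the ones drifting, one could query `sys.stats` (which exposes `has_filter`) together with `sys.dm_db_stats_properties` and flag statistics whose modification counter has crossed a chosen fraction of their row count. The 20% figure below mirrors the classic threshold mentioned above and is just a parameter to tune:

```sql
-- Filtered statistics whose accumulated modifications exceed
-- 20% of the rows the statistic covers (threshold is illustrative).
SELECT
    OBJECT_NAME(s.object_id)  AS table_name,
    s.name                    AS stats_name,
    s.filter_definition,
    sp.rows,
    sp.modification_counter,
    sp.last_updated
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.has_filter = 1
  AND sp.rows > 0
  AND sp.modification_counter >= 0.20 * sp.rows
ORDER BY sp.modification_counter * 1.0 / sp.rows DESC;
```

Note that under compatibility level 130, SQL Server 2016 uses a dynamic threshold (roughly `SQRT(1000 * rows)`) rather than a flat 20%, so very small filtered statistics may actually recompile more often than the 20% rule suggests.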

The indexes are optimized weekly.

fabsenet
    I think you might reach a wider audience for this at [dba.stackexchange.com](https://dba.stackexchange.com/). –  Jun 22 '18 at 14:44
  • didn't know this existed! [cross-posted there](https://dba.stackexchange.com/questions/210581/find-outlier-statistic-in-sql-server). thx – fabsenet Jun 26 '18 at 07:21

1 Answer


Not sure what the real answer is (yet), but following the suggestion of @yoonix, I moved the question over to dba.stackexchange.com.

fabsenet