
I'm running several SQL Servers, each hosting a few hundred multi-gigabyte databases for customers. They are all set up homogeneously as far as the schemas are concerned; however, customer usage of the data differs quite a lot from database to database.

What would be the best way to auto-index/profile/tune this large number of databases?

As there are at least 600 catalogs, I can't have someone manually profile and index each one according to its usage patterns.

I'm currently running SQL Server 2005 but will be moving to 2008, so solutions that work with either are fine.

Scott Pack

1 Answer


This is something you'll never be able to fully automate, and so your best bet is to use a tool to find out where your worst performers are and therefore where to best spend your time. To that end, I'd go third party for the volume you're talking about: perhaps something like Quest Foglight or Redgate SQL Response.

As for the built-in tools: when you do find a change you want to apply broadly, you can use a configuration/central management server to push it to many servers at once (you probably already have this). You can also query the missing index DMVs (dynamic management views) to find obvious places where an additional index might help performance.
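As a starting point, here's a minimal sketch of the kind of missing-index query you can run against those DMVs on SQL Server 2005 or 2008. The "improvement_measure" weighting is a common community heuristic, not an official formula, so treat the results as hints to investigate rather than indexes to create blindly.

    -- Rough missing-index report from the DMVs; the weighting below is a heuristic.
    SELECT TOP (25)
        DB_NAME(mid.database_id)                         AS database_name,
        mid.statement                                    AS table_name,
        mid.equality_columns,
        mid.inequality_columns,
        mid.included_columns,
        migs.user_seeks,
        migs.avg_user_impact,
        migs.avg_total_user_cost * migs.avg_user_impact
            * (migs.user_seeks + migs.user_scans)        AS improvement_measure
    FROM sys.dm_db_missing_index_details AS mid
    JOIN sys.dm_db_missing_index_groups AS mig
        ON mig.index_handle = mid.index_handle
    JOIN sys.dm_db_missing_index_group_stats AS migs
        ON migs.group_handle = mig.index_group_handle
    ORDER BY improvement_measure DESC;

Keep in mind these stats reset when the instance restarts, and the DMVs only see queries that have actually run, so check them after a representative workload period.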

Joel Coel