
I'm studying for the CCSP exam and am currently reviewing the Uptime Institute tiers. The tiers themselves make sense, but from a practical perspective I'm curious whether any standards/regulations explicitly require the use of a specific tier, or whether it's "good enough" to have compensating controls that make up for data center deficiencies.

For example, let's say that instead of purchasing services in a single Tier 2 or Tier 3 DC, I deployed resources across multiple Tier 1 data centers (thus potentially offsetting deficiencies such as the required interruptive maintenance, etc.). I realize this is a bit of an oversimplification, but I think it still illustrates the point: Are there standards that explicitly state "thou shalt use a minimum of Tier X datacenters!"
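
For context on why I thought multiple Tier 1 facilities might compensate, here's the rough availability math I had in mind. It's only a back-of-the-envelope sketch: the per-tier availability figures are the commonly cited values, and treating the facilities as failing independently is an idealization that real deployments rarely achieve.

```python
# Back-of-the-envelope availability comparison (not a compliance argument).
# Per-tier availability figures are the commonly cited values; assuming the
# Tier 1 facilities fail independently is an idealization (shared regions,
# utilities, or storms break that assumption).

TIER_AVAILABILITY = {1: 0.99671, 2: 0.99741, 3: 0.99982, 4: 0.99995}

def combined_availability(tier: int, copies: int) -> float:
    """Chance that at least one of `copies` independent facilities is up."""
    return 1 - (1 - TIER_AVAILABILITY[tier]) ** copies

for label, availability in [
    ("Single Tier 3 ", TIER_AVAILABILITY[3]),
    ("Two Tier 1s   ", combined_availability(1, 2)),
    ("Three Tier 1s ", combined_availability(1, 3)),
]:
    downtime_min = (1 - availability) * 365 * 24 * 60
    print(f"{label}: {availability:.5%} available, ~{downtime_min:.1f} min/yr of downtime")
```

On paper the math favors the multi-facility approach, but I realize availability and compliance are separate questions.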

Mike B

2 Answers


Are there standards that explicitly state "thou shalt use a minimum of Tier X datacenters!"

You won't find any standards or government regulations that mandate a specific tier. You might find similar language in contracts, where a business or government customer sets an expectation for you to meet, but there are no regulations on the matter.

As for the example you provided, "good enough" is determined by a risk assessment of your business and infrastructure needs. Consider this example:

You have a flagship application that is responsible for two-thirds of your company's revenue. Your teams regularly test failover scenarios across your data centers. One evening, while your team is testing a failover, a storm takes out the utility feed to the data center. The data center is Tier 1, so it doesn't have redundant power, and it's completely down until power is restored.

  • How many man-hours were lost because teams could not test?
  • What if the interrupted failover caused a database error that could not be rolled back until power came back on?
  • How many man-hours are lost because your data center team needs to go restart all the services?
  • What if the power comes back on and the surge trips a breaker?

A Tier 1 might be fine for a small business that could be down for a day and survive, but as a company's size increases, the cost of downtime goes up, not just in man-hours but also in lost revenue.
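
To put rough numbers on that, the risk assessment usually boils down to annualized downtime cost versus the price difference between tiers. Here's a minimal sketch; the revenue, staffing, and hourly-rate figures are made-up assumptions, and only the per-tier availability values are the commonly cited ones.

```python
# Minimal sketch of an annualized-downtime-cost comparison for a risk assessment.
# All business inputs (revenue, responders, hourly rate) are made-up assumptions;
# the per-tier availability figures are the commonly cited values.

HOURS_PER_YEAR = 365 * 24
TIER_AVAILABILITY = {1: 0.99671, 2: 0.99741, 3: 0.99982, 4: 0.99995}

def annual_downtime_cost(tier, revenue_per_hour, responders, hourly_rate):
    """Expected yearly downtime hours and the rough cost they carry."""
    downtime_hours = (1 - TIER_AVAILABILITY[tier]) * HOURS_PER_YEAR
    lost_revenue = downtime_hours * revenue_per_hour
    recovery_labor = downtime_hours * responders * hourly_rate
    return downtime_hours, lost_revenue + recovery_labor

for tier in (1, 3):
    hours, cost = annual_downtime_cost(
        tier,
        revenue_per_hour=10_000,  # hypothetical: the flagship app drives ~2/3 of revenue
        responders=6,             # hypothetical: staff needed to restart services
        hourly_rate=75,           # hypothetical loaded hourly cost per responder
    )
    print(f"Tier {tier}: ~{hours:.1f} h/yr expected downtime, ~${cost:,.0f}/yr of exposure")
```

If the gap in annual exposure is larger than the price difference between a Tier 1 and a Tier 3 contract, the higher tier pays for itself; if it isn't, compensating controls or simply accepting the risk may be the rational answer.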

Shane Andrie

This really ties into the organization's BIA (Business Impact Analysis) results and what that particular organization considers "critical" resources. The only real guideline (not an official standard, but consistent across study materials) is that medical and/or monitoring services "should" use a Tier 4 datacenter.

As an applicable scenario, the hypothetical media streaming service 'Netflicks' would probably be inclined to use a Tier 4 datacenter based on the user impact if services were offline (planned or unplanned). That downtime would hit their bottom line directly and would also damage the service's reputation for media streaming.

On the other hand, the CCSP study resources suggest that a Tier 1 datacenter could be used as a warm/hot site that is available for contingency purposes. In that case, it just needs to be 'ready' rather than 'always-on' (so to speak).
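
One way to make that BIA-driven selection concrete is a simple decision rule keyed on each service's RTO (recovery time objective). The thresholds and tier mappings below are purely illustrative assumptions; no standard prescribes these cut-offs.

```python
# Illustrative sketch: map a BIA-derived RTO to a minimum datacenter tier.
# The thresholds and tier assignments are assumptions for illustration only;
# no standard or regulation prescribes these cut-offs.

def minimum_tier(rto_hours: float, life_safety: bool = False) -> int:
    """Map a BIA-derived RTO to an illustrative minimum datacenter tier."""
    if life_safety:       # e.g. medical/monitoring services
        return 4
    if rto_hours < 1:     # effectively zero tolerance for interruption
        return 4
    if rto_hours < 4:
        return 3
    if rto_hours < 24:
        return 2
    return 1              # e.g. a warm site that only needs to be 'ready'

print(minimum_tier(0.5))                  # 4 - an "always-on" streaming-style service
print(minimum_tier(8))                    # 2
print(minimum_tier(72))                   # 1 - contingency site
print(minimum_tier(8, life_safety=True))  # 4
```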