
When either selecting a data center as a co-location facility or designing a new one from scratch, what would your ideal specification be?

Fundamentally, diversified power sources, multiple ISPs, redundant generators, UPS, cooling, and physical security are all desirable.

What are the additional key requirements that someone might not consider on the first pass?

What are the functional details someone might not consider during the initial high level design?

I'd like to approach this from the perspective of designing a large data center or seeking a facility that was designed perfectly from an infrastructure perspective. This question has already been addressed for smaller facilities and workspace considerations here.

Warner

6 Answers


I've been lucky enough to have built a few over the years; I'd certainly look at the following points:

  • I'd always go for multiple sites, even if that means each individual site ends up smaller.
  • You're right about multiple diversely routed/sourced power supplies and good UPSs, physical security etc.
  • I won't be buying any more AC units for my data centres; instead I'd partially filter the ambient air and use semi-sealed extraction tunnels/pipes/channels to pull the hotter air out - possibly with some form of heat exchange to recoup energy from the heated air. This approach saves a fortune, is 'greener', supports higher watts per rack and is much more reliable/available (a rough airflow sizing sketch follows after this list).
  • I'd use a solid concrete floor, not raised flooring; this will obviously support higher loads (i.e. fuller racks), with overhead caging carrying mostly OM3 fibre and a few cat5/5e/6 copper runs where they're absolutely required.
  • I'd go for fewer, faster, trunked but resilient links to my servers/switches/blades etc., rather than the old-school waterfall of lower-speed links.
  • With disk-based CCTV solutions getting cheaper and cheaper, I'd cover every row or position in the place and record everything.
  • Every site needs two non-equipment areas - an area in the server room that's fenced off from the racks with a desk and storage for kit and tools, a chair, power, networking etc. - and a second area outside the server room to make calls and get away from the noise.
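
To put rough numbers on the extraction approach above: the airflow you need scales directly with rack power and the temperature rise you allow across the rack. A minimal back-of-envelope sketch in Python, where the 10 kW rack load and 12 °C rise are illustrative assumptions rather than figures from any particular build:

```python
# Back-of-envelope extract-airflow sizing for hot-air extraction.
# The 10 kW rack load and 12 C temperature rise are illustrative
# assumptions, not figures from this answer.

AIR_DENSITY = 1.2          # kg/m^3, roughly ambient air
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)

def extract_airflow_m3_per_s(rack_power_w: float, delta_t_c: float) -> float:
    """Airflow needed to carry away rack_power_w with a delta_t_c rise:
    P = rho * cp * Q * dT  =>  Q = P / (rho * cp * dT)."""
    return rack_power_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)

flow = extract_airflow_m3_per_s(10_000, 12)   # 10 kW rack, 12 C rise
print(f"{flow:.2f} m^3/s (~{flow * 3600:.0f} m^3/h) per 10 kW rack")
```

That works out to roughly 0.7 m³/s per 10 kW rack - a fan-and-ducting problem rather than a chiller problem, which is where the savings come from.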

I hope this is of help; I might add some more later.

Chopper3
  • Have you discovered a proven solution for cooling alternatives? – Warner Mar 22 '10 at 17:25
  • Yes, but there's nothing to discover as such. 99%+ of IT kit actually PREFERS to work at ambient temperatures, yet we keep firing in cold air that costs a lot of money. The problem is the build-up of hot gases at the rear of racks - all that's needed is to remove that. Some of the largest new data centres in the world simply suck out the hot air; you need to ensure you have some degree of sealing around the rack (nothing silly), then you just have fans on the roof instead of AC units. Have a look here: http://www.theregister.co.uk/2008/05/24/switch_switchnap_rob_roy/ – Chopper3 Mar 22 '10 at 20:37

One thing that I don't see already posted is the budget to build a very good team of people.

I recently went cage shopping and found that pretty much all of them offered peering with multiple tier-1 providers, multiple diesel generators, and so on.

What made me pick the one I did was that everyone there was sharp and dedicated, there were plenty of people on location, and the sales and project managers were also great. All the generators and peerings in the world won't help if the on-site staff plug all of your feeds into the same generator, or if the remote hands don't respond when you really need them to.

So this may not fall under infrastructure, but in the end it can matter more than four vs. two redundant generators, two vs. three Internet peers, etc.
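
To put the generator comparison in perspective, here's a rough availability back-of-envelope; the 99% per-generator availability is an assumed figure for illustration, not a vendor spec:

```python
# Rough illustration of diminishing returns on generator redundancy.
# The 0.99 per-generator availability is an assumed figure, not a vendor spec.

def redundant_availability(unit_availability: float, units: int) -> float:
    """Probability that at least one of `units` independent units is up."""
    return 1 - (1 - unit_availability) ** units

for n in (1, 2, 3, 4):
    print(f"{n} generator(s): {redundant_availability(0.99, n):.8f}")

# Each extra unit buys less than the one before it -- and none of it helps
# if every feed is cabled to the same generator in the first place.
```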

Kyle Brandt
  • Yeah, that doesn't really fall under "design", except maybe accounting for high salaries and good bennies in the operating budget to attract and keep good staff. That's more of a corporate "thing", instead of DC design. – mfinni Mar 19 '10 at 20:59

There's good information in the answers to this question. Many of the questions tagged server-room are relevant, too.

Ward - Reinstate Monica

Location, power costs, water costs/supply (depending on how the facility is cooled), weather, and natural disasters are what I would add on top of your list.
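
Power cost in particular is easy to put a rough number on once you know the IT load and the facility's PUE; a minimal sketch, where the load, PUE, and tariff are all assumed example values:

```python
# Rough annual power-cost estimate from IT load, PUE and electricity tariff.
# All three inputs are assumed example values, not figures from this answer.

HOURS_PER_YEAR = 8760

def annual_power_cost(it_load_kw: float, pue: float, usd_per_kwh: float) -> float:
    """Total facility draw is the IT load times PUE, billed every hour of the year."""
    return it_load_kw * pue * HOURS_PER_YEAR * usd_per_kwh

cost = annual_power_cost(it_load_kw=500, pue=1.6, usd_per_kwh=0.10)
print(f"~${cost:,.0f} per year")   # roughly $700k/year for this example
```

The gap between a site at PUE 1.2 and one at 2.0 is what makes location and cooling strategy dominate the running costs.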

xeon

Diverse fibre/connectivity routes (including multiple, separate building entry points)

James

I'm sure that the answer will be "it depends." Are you asking for blue-sky, if you had millions of dollars and were trying to run Amazon? Is there a budget you have in mind?

Ease of expansion is one thing not on your list. If you're renting a cage, how easy is it to add another cage and get the proper wiring between them? If you're building your own, what do you do when you run out of floor space? Can you tear out offices and expand within the building, or knock down a wall and make the building bigger?

mfinni
  • I was writing up an analysis and started thinking about how I would design from scratch, and began to wonder whether smart people would have ideas that I didn't. One main site has reached physical capacity and we're expanding. Down the road, I'll likely have the opportunity to build another space from scratch using existing space. Right now, I'm picking between data centers. Amazon's scope is bigger than what I have in mind, but it's definitely not outside of consideration, as those ideas filter down. – Warner Mar 19 '10 at 19:27