67

What do I have to consider when I'm planning a new server room for a small company (30 PCs, 5 servers, a couple of switches, routers, UPS...)?

What are the most important aspects in order to protect the hardware? What things do not belong in a server closet?

Edit: You may also be interested in this question: Server Room Survival Kit.

Thank you!

splattne

19 Answers

60
  • Enough space for expansion
  • Plenty of network ports
  • Sufficient network bandwidth
  • Plenty of dedicated power sockets
  • Should not be on the ground floor (risk of flooding + less secure)
  • Fire suppression facilities + smoke alarms
  • IP KVM for remote access
  • Telephone (so the operator can call a support line while looking at the hardware)
  • Pens + paper
  • A label printer - label everything!
  • A standard printer (nice to have)
  • Spare network and power cables
  • Air conditioning (also dehumidifies)
  • Good UPS (with automated/controlled shutdown functionality; a small shutdown sketch follows this list)
  • Sufficient power to run everything (and enough for expansion)
  • Entrance security (preferably also with logging)
  • Physical security (security on windows, entrance, etc.)
  • Whiteboard (nice to have)
  • Fireproof safe (for storing backup tapes, passwords and installation media)
  • Good server racks - well maintained (cabling)
  • Enough space to work comfortably behind the servers
  • A table large enough to build/dismantle a server on (plus monitor, keyboard and mouse)
  • At least 1 chair
  • Tidy patch panel (especially if you patch to PCs and telephones in the office)
  • Good lighting
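
On the "automated/controlled shutdown" point above: a minimal watchdog sketch, assuming a UPS managed by Network UPS Tools (NUT) and named "officeups" (the name and timings are illustrative; in practice NUT's own upsmon does this job more robustly):

```python
#!/usr/bin/env python3
"""Poll a NUT-managed UPS and shut the host down after a sustained
power failure. Sketch only; "officeups" and the timings are assumptions."""
import subprocess
import time

UPS = "officeups@localhost"  # hypothetical NUT device name
GRACE_SECONDS = 300          # how long to ride out a battery event

on_battery_since = None
while True:
    # `upsc <ups> ups.status` prints e.g. "OL" (online) or "OB" (on battery)
    status = subprocess.check_output(["upsc", UPS, "ups.status"], text=True).strip()
    if "OB" in status:
        on_battery_since = on_battery_since or time.time()
        if time.time() - on_battery_since > GRACE_SECONDS:
            subprocess.run(["shutdown", "-h", "+1", "UPS on battery - shutting down"])
            break
    else:
        on_battery_since = None  # power came back, reset the timer
    time.sleep(30)
```

Pair something like this with the BIOS "power on after power loss" setting so the servers come back when mains power returns.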
Techboy
  • Re: "Plenty of network ports", "Plenty of power sockets", "Enough space to work comfortably behind server"... can't emphasize those points enough! You can't have too much. – Matt Hanson May 03 '09 at 21:17
  • +1 on the telephone jack alone. +more if I could. – RBerteig May 06 '09 at 08:04
  • Those power sockets should be dedicated; in other words, home runs back to the breaker box and not daisy-chained like regular power sockets. Dedicated power sockets are often differentiated by orange receptacles. – Scott May 07 '09 at 21:43
  • Orange receptacles are _not_, as a rule, dedicated. Orange indicates that they are isolated ground. Although, when I just checked Wikipedia (http://en.wikipedia.org/wiki/NEMA_connector#Color_code) to verify this, they say that the color of the receptacle is no longer regulated, but receptacles with an orange triangle on them are isolated ground. Still, orange receptacles are typically isolated ground, and red are used to indicate backup power. – sherbang May 23 '09 at 18:25
  • I would agree with everything on the list... BUT never, ever, put anything plastic in a fireproof safe. It probably won't survive, and it will ruin everything else that does. If it's that important, send it offsite. – Joseph Kern Jun 14 '09 at 12:17
  • And you should also add a sharpie/permanent marker. Nothing is worse than 200 unmarked CD-Rs. – Joseph Kern Jun 14 '09 at 12:18
  • @Joseph, slightly out of my range of expertise here, but I have spoken to someone in the safe business; he says there are "media safes" that will safely store CDs, DVDs, etc. without melting. I have noticed that these are becoming more standard (saw one at Costco last month). – Nathan Koop Aug 04 '09 at 20:07
  • I would also add "raised floor" so you can route cables underneath. – Nathan Koop Aug 04 '09 at 20:15
  • The safe should be fireproof *and* waterproof. Where there is fire, there is water. – chris May 03 '10 at 18:47
27

You might want to place a small shelf near the entrance to put a pair of these

[image: a pair of chainsaws]
(source: a-chainsaw.com)

Glorfindel
Peter Stuer
23

In my experience, the ideal server room will have the following:

  • Large enough to house your cabinet or rack. You should have at least 4 ft. of walking space in front and back, ideally all around. If you can get away with it, plan for the possibility of a second rack in the future.
  • Secured. You don't necessarily need an armed guard, but at least a good lock. A biometric or card-swipe lock is always good. Home Depot has locks that use touch pads so you can assign codes to unlock the door.
  • Usually, the server room is also the telco's entry point (demarc), so you'll have your T1 smart jacks there, your PBX or phone system, etc. We usually dedicate one wall and put up plywood so telcos and providers can mount their equipment.
  • Air conditioning is a given. You need to keep the room at around 65-75 °F (18-24 °C). A dedicated thermostat is preferred, since you don't want the A/C to be shut off in the server room on weekends or at night. (A small temperature-monitoring sketch follows this answer.)
  • Power is extremely important. Since your rack is most likely in the middle of the room, you will have cables going across the floor to reach the outlet if they are wall based. If you can have the outlets put on the floor, that's best. If you can't, use some cable covers to avoid tripping over wires. Get dedicated circuits put in for a clean line of electricity. Make sure you have extra outlets on all walls, in a pinch, having access to an outlet can be critical, especially if you need to plug in a laptop or other device.
  • Keep a small cabinet or shelves where you can store manuals, cables, spare cards, drives, etc. You want this in an easy to access place during installations and troubleshooting. Keep this out of the way in the room, but accessible.
  • Cable management is critical as well, both in the rack and from the plywood wall. Over time it gets very easy to just plug cables in. If the cable management is there, it's easier to keep things organized and to label/mark both ends of all wires; the last thing you want to do is trace wires when your network is down.
  • For the cabinet itself, make sure you have adequate UPSes, cabinet cable management, a good KVM, a 1U slide-out keyboard/mouse/LCD to save on rack space, and plenty of ventilation. Cabinet design is a whole dissertation in and of itself!
  • If the room is closed off, make sure you have proper ventilation for air flow. You'll need some kind of exhaust vent so hot air can escape; if needed, use a fan to suck the air out. For fresh air, you can put a vent on the door.
  • Definitely a phone near the cabinet with a list of support numbers, "911" contacts, etc.
  • If I can, I try to have a place to hook up a laptop close by so you can access tools, test against another working system, test client software, etc.
  • And there's nothing wrong with a chair for when you are waiting on hold for that tech support rep to come back on the line :)

There's a lot that can go into a server room. If you can get even most of this, your life as an admin will be so much better. The easier it is to get to equipment, trace the setup and get your problems solved, the more effective you can be. Good luck!
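
On the climate-control point above, a rough temperature-alarm sketch, assuming the psutil library and a host that exposes a temperature sensor (the addresses and threshold are placeholders); run it from cron:

```python
#!/usr/bin/env python3
"""Email an alert when the room gets too hot. Sketch only: assumes
psutil is installed and a local mail relay is listening on localhost."""
import smtplib
from email.message import EmailMessage

import psutil

THRESHOLD_C = 24.0  # roughly 75 F, the top of the range suggested above

def hottest_reading_c():
    # Works on Linux in practice; the dict may be empty on other platforms
    temps = psutil.sensors_temperatures()
    return max(
        (reading.current for readings in temps.values() for reading in readings),
        default=None,
    )

temp = hottest_reading_c()
if temp is not None and temp > THRESHOLD_C:
    msg = EmailMessage()
    msg["Subject"] = f"Server room over {THRESHOLD_C:.0f} C: now {temp:.1f} C"
    msg["From"] = "noc@example.com"   # placeholder addresses
    msg["To"] = "admin@example.com"
    msg.set_content("Check the A/C and the dedicated thermostat.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
```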

splattne
John Virgolino
15

I was just watching the film 'Eagle Eye' - apparently the perfect server room involves covering the walls with oddly-coloured fishbowls, which talk via infrared (???) to your main 'CPU', which itself moves around on a robotic arm with a glowing 'eye' set into the middle. Oh and build the whole thing over a large body of water too, this will help in some way ;)

Chopper3
7

Look at everything from a risk management point of view and everything will fall into place.

  • Physical security: What is at risk if a malicious (or ignorant) individual gains access to the server and network hardware? Who will have permission to enter? Is server hardening required? (Disable removable-drive bootup, set a BIOS password, disable USB, etc.; a small sketch follows this list.)
  • Climate control: 5 servers and 30 PCs won't make incredible heat in, say, a 20x20 room, but that's a bit much if you're stuffing it in a coat closet. Running at elevated temperatures and/or humidity will shorten the life of your hardware and lead to data loss and expensive replacements. Consider simple ventilation with a dehumidifier or possibly A/C system sized for your needs.
  • Business continuity: Battery backup? Data redundancy? Fault tolerant LAN/WAN connections? Any single points of failure in your infrastructure? Do you have enough excess power to run your infrastructure and not blow a fuse if someone plugs in a vacuum cleaner?
  • Growth: Have a contingency plan in place for when management demands you double, nay, triple your infrastructure. How will all the critical dependencies scale?
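
One hedged sketch of the hardening item above - disabling the USB mass-storage driver on a Linux server (BIOS passwords and boot order still have to be set in firmware by hand; the path is the conventional modprobe.d location, but verify it on your distribution):

```python
#!/usr/bin/env python3
"""Write a modprobe override so the usb-storage driver refuses to load.
Run as root; affects devices plugged in after the file is written."""
from pathlib import Path

conf = Path("/etc/modprobe.d/blacklist-usb-storage.conf")
conf.write_text(
    "# Block USB mass storage on this server\n"
    "install usb-storage /bin/false\n"
)
print(f"Wrote {conf}")
```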
spoulson
6

In addition to Techboy's excellent list:

  • Toolbox including cable tester, screwdrivers, pliers, hex keys and anything required for rack installations, including shifting spanners and the like.
  • Cordless phone - with speaker phone if such a beast exists
  • Trolley for moving servers
  • Ladders/steps/stepladders for working at the top of racks and on ceiling-mounted cable runs
  • Consider a trolley with a lift for installing servers near the top of racks, especially if you have strict OH&S regulations or just care about your workers.
  • Hearing protection - we're required to wear it in our DC and they supply disposable ear plugs.
  • Power points outside the racks so you can plug in laptops etc. easily without disturbing in-rack power.
  • SNMP or other remotely controlled power rails (see the sketch after this list)
  • Power rails on each side of a rack should go to different distribution panels, so that work can be done on one panel without losing power to both power supplies in your servers.
  • Video cameras for additional security
  • A row of hooks in a secure place either in or near the DC so that people who work in there regularly can leave a jacket
  • Storage for spares, blanking plates, cables and the like
  • Possibly spares - especially if your vendor(s) can be persuaded to provide you with common spares in advance of warranty claims/failures. (Some vendors/resellers, under some circumstances, will provide you with, say, a couple of hard drives and power supplies. They belong to the vendor, but you can use them if you have a failure and worry about the paperwork after the fact.)
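
A sketch of the "SNMP power rails" item above, shelling out to net-snmp's snmpset. The OID and control values follow the pattern of APC-style switched PDUs, but treat them as assumptions - check your own PDU's MIB before using anything like this:

```python
#!/usr/bin/env python3
"""Power-cycle one outlet on an SNMP-managed PDU. The hostname, the
community string and the OID below are all placeholders."""
import subprocess

PDU_HOST = "pdu1.example.com"                    # hypothetical PDU address
COMMUNITY = "private"                            # SNMP write community (placeholder)
OUTLET_OID = "1.3.6.1.4.1.318.1.1.4.4.2.1.3.4"   # e.g. outlet 4 on an APC-style PDU
OUTLET_REBOOT = "3"                              # commonly 1=on, 2=off, 3=reboot - verify in your MIB

subprocess.run(
    ["snmpset", "-v1", "-c", COMMUNITY, PDU_HOST, OUTLET_OID, "i", OUTLET_REBOOT],
    check=True,
)
```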
splattne
Jason Tan
6

From what I am reading here, most people are going for massive overkill. You have 5 servers and 30 workstations; since this sounds like a small company, I very much doubt the boss/owner will spring for a biometric scanner, pass-card scanner and video system, unless you already have these and they would be a cheap add-on for the server room.

I did one almost the same as yours: 20 workstations and 8 servers.

Here is what I found worked well and was cost-effective on the typically limited budget of a small company.

  • Phone, or at least a phone jack where you can move a phone in, should you have to call tech support while in front of the physical server.
  • A/C: even with 8 servers, my room temp was about 31 degrees Celsius. Since the room was in the center of the building and new ductwork was out of the question, we got one of the portable room units and exhausted it into the office through the side of the wall closest to the cold-air return for the building A/C unit. This worked well and dropped the temp to about 23 degrees. It does take up a lot of space, but it's really good.
  • Don't bother with a rack-mount monitor/KVM; I can think of better things to spend $1k on.
    • Get a monitor and a 10 ft VGA cable. Have a table or wall-mounted shelf next to the servers; any modern rack-mounted server has a front-mounted VGA port.
    • Get a wireless keyboard/mouse that use the same dongle. Just move the VGA cable and wireless dongle to the computer you need access to. Since you will be accessing everything remotely 90% of the time, it's not a big deal to do it this way.
  • If you can, get a rack and use rack-mounted servers, BUT make sure you get rails that allow the server to be slid out should you need to access the hardware, rather than ones that need to be unbolted.
  • If you can't get a rack, go to Home Depot and get the heavy-duty freestanding utility shelves; these will work just as well for desktop units.
  • Lock: a key or keypad one is fine. You'll have to give a key and the combo to your boss or the owner anyway; make sure the servers log who logs into them, and it matters less who is in the room.
  • If the phone gear is there, use the plywood-on-the-wall idea.
  • Put the phones on their own power-source UPS.
  • Get a UPS that is expandable for the servers.
  • Get a UPS that has load banks you can remotely switch off, should you need to kill a server remotely. (Tripp Lite has these; it saved me a few times when I had a wonky server that would lock up and needed the power killed.) Make sure you set the BIOS to power on after a power outage for all servers. The UPS will only have a few plugs, so you have to put your critical servers on it; for me it was a DC/GC and email, which allowed me to reboot those if they crashed.
  • Some shelves for parts and the other "stuff" you'll be required to keep in there by the boss (lol, unless you are the boss).
  • Dedicated power for the A/C.
  • Dedicated power for the phones.
  • Dedicated power for the servers. I had a total of six 15-amp circuits in mine, plus the lights.
  • Make any patch cables custom and have them as short as they can be; this will help keep things organized so you don't have cables dangling everywhere.
  • If you cannot put it on the second floor, put all the servers at the top of the rack and work your way down. Mine was on the second floor, so this was not an issue for me.

This is what I had in mine; the network wiring etc. was all labeled and well organized as well. How that is set up depends on where it comes in.

Biggest things, keep it organized, and make sure you have room to expand for future growth.

splattne
LEAT
  • WHAT!!!! Fill from the top down???? I hope your life insurance is paid up and doesn't have a "Darwin" clause. Unless the racks are anchored (to more than a raised floor panel) you should fill from the bottom up, to reduce the chances of tipping. – Brad Bruce May 31 '09 at 01:38
  • All racks should be anchored to the floor, period; it doesn't matter where the equipment is in them. And of course make sure the rack can take the weight, by its specs. Sorry, I forgot to mention that. It's a matter of mitigating risk: if you cannot put your server rack on the second floor, or say it's stuck in a basement (as one client I have is set up), then you don't want to start at the bottom, which would be most prone to flooding. If you cannot anchor the rack, then put the UPSes in the bottom to weight it down and the servers higher up. I think I've also seen racks with extra feet, like outriggers. – LEAT May 31 '09 at 04:24
  • You do need a KVM (preferably IP KVM) because you don't want to spend time messing about around the back of servers, and you also want the capability to work from home (e.g. evenings/weekends). – Techboy May 31 '09 at 14:56
  • We've opted for using out-of-band cards (e.g. Dell DRAC5) for our KVM needs, rather than a dedicated IP KVM or physical KVM. Too many wires cluttering up the rack with a physical USB/PS2+VGA KVM. We have a small monitor + keyboard with long cables in each environment for emergencies. – Mike Pountney Jun 06 '09 at 02:28
  • Techboy, rack-mount servers these days have front and back VGA ports and front and back USB, so there is no messing around the back of the servers. It works great; I did it for years in a server closet that didn't have room behind the rack, and adding room wasn't an option. Purchasing an 18-port KVM wasn't an option either, due to cost. And why do I need a KVM to work from home or even remotely in the office? +1 Mike, too many wires with physical KVMs. – SpaceManSpiff Jun 06 '09 at 04:39
  • Fill from the bottom up. Cold air falls, heat rises. Fill your 'empty' RUs with spacers. This keeps the hot air in the hot aisle. You do have it set up as hot/cold aisles, right? – toppledwagon Aug 28 '09 at 19:49
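
Following up on the out-of-band card comments above: a small sketch of remote power control through a BMC (for example a DRAC that speaks IPMI), using the standard ipmitool CLI. The hostname and credentials are placeholders:

```python
#!/usr/bin/env python3
"""Query or change server power state via IPMI-over-LAN. Sketch only;
assumes ipmitool is installed and the BMC has IPMI enabled."""
import subprocess

def bmc_power(host: str, action: str = "status") -> str:
    # Standard ipmitool verbs: status | on | off | cycle | reset
    out = subprocess.check_output(
        ["ipmitool", "-I", "lanplus", "-H", host,
         "-U", "root", "-P", "changeme",  # placeholder credentials
         "chassis", "power", action],
        text=True,
    )
    return out.strip()

print(bmc_power("drac5.example.com"))  # e.g. "Chassis Power is on"
```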
5

[Photos: the NOC; the submarine engines used for backup power, plus another view of the power equipment; and a map showing the layout of the data center.]

Link to original article

Rook
5

Think about DR planning from the get-go. If I had it to do all over again, I'd have separated the network gear (switches and routers) from the servers, and had a shorter rack (30U is great) so I could have just rolled the rack out the door and onto a truck that time when Verizon couldn't replace a backhoed T1 line for FOUR FREAKING DAYS and we had to move the servers to our DR location. (Not that I'm still bitter.) Also, depending on location, it's a good idea to have at least one 208 or 240 V circuit and space to put a spot cooler, for when a hurricane knocks out your HVAC in August in Texas.

user6622
  • As someone who has rolled a loaded 30U rack on to a truck, it's definitely *not* something I would want to do ever again for any reason. It really messes with the elevators having that much weight on the rack, and the moving companies aren't equipped to deal with it, even when they claim to be. We tried this when we were moving offices, and I have to say it was the scariest experience of my professional life, watching the rack containing all our expensive equipment getting loaded on to the back of a huge truck and bouncing around on the pavement in the process. Better to unrack your gear and move it. – Kamil Kisiel Jun 20 '09 at 19:39
  • IMO you still need switching in the rack with your servers. Otherwise there are too many cables going back to the switching rack, with more connectors to unplug. +1 for spare cooling too. – Criggie Jul 01 '20 at 00:56
5

A sterile environment is a must: you need acceptable flooring, and don't even go near carpet, as it collects dust. Typically I would opt for raised flooring; it's a good option, as it makes running cable easy.

Don't skimp with cheap data racks. Use a good cable-management system, and run data down one side and power down the other, as some countries' regulations don't allow mixing data and power.

Elijah Glover
  • Running cable under raised flooring is a really bad idea. The ability to hide bad practices, especially with things like wiring, encourages bad practices. Buy an overhead cable management system. – duffbeer703 May 09 '09 at 20:55
  • I agree with duffbeer703. Raised floors have fallen out of style. When our data center was last upgraded, the admin at the time INSISTED on raised floors. Now, whenever anything gets moved, half the floor has to be pulled up. We had to add an additional fire-suppression zone just for the under-floor area. Heavy equipment has to be lifted to get it into the raised area (and many more problems). After that admin was fired, the next one moved the network wiring to ladder racks and we're much happier. Not worth moving the power though; 1 outlet per rack (under the rack...). – Brad Bruce May 30 '09 at 16:41
  • +1 just for the comment on cheap data racks. I love APC racks purely because they have U-number markings on every face. It's amazing how easy it is to get that wrong. – Mike Pountney Jun 06 '09 at 02:17
  • "Sterile environment is a must" -- sterile meaning aseptic in this context? I believe you mean a low-dust environment; hospital operating rooms are nearly aseptic, and it's hard work to get them to that point. – Nov 28 '09 at 13:16
3

[Photos of Google's custom-built server]

From the CNET News article "Google uncloaks once-secret server", about Google's data centers. :)

Glorfindel
balexandre
  • If I order it, does it come with its own nuclear power plant? I think this could slightly exceed my budget. ;-) – splattne May 11 '09 at 11:58
  • Heheh :) but it's good to see what 'others' are doing; this would never have crossed my mind without seeing it and reading about it... it's like a portable data center ;) – balexandre May 11 '09 at 13:54
2

Thick glass windows so you can see when someone is in the server room and what they are doing.

Look for water stains on the floor and ceiling. Do not put your most critical equipment under said water stains. I have actually seen this on more than one occasion.

Make sure, once the room is finished, that there is no room for a desk. Otherwise a space-strapped business will turn your nice server room into someone's not-very-functional office.

Bolt your racks down. There is nothing worse than adding a 1U server and the whole rack falls over.

Do not use a 50ft network cable to connect a server to a switch 2ft away.

A lease that stipulates that the landlord will not shut down air conditioning on the weekends just to save money. Imagine walking into a 150 degree server room Monday morning.

Like other people said, label everything. Labelwriters are great for permanent labels; masking tape and sharpies work too.

2

We are a small company with 6-7 servers and 6-7 development workstations. When we moved office, we got a good solid shelving unit and put it in the middle of the room to house our servers/printers.

Boy, was this a good idea, as we have had to go around the back several times; and since the servers aren't rack-mounted, it would otherwise have been very difficult and potentially dangerous to move them.

A good air-con is a must.

Decent and plentiful power.

Dan
1

In my experience, the computer room is the only place at work where you can find some peace and no one will come to bother you, so I'd suggest you add a few creature comforts in there in case you want some seclusion.

Necessary items are a comfy chair and headphones :)

fim
1

You'll need:

  • good air conditioning, so temperature and humidity stay stable
  • good power supply, possibly redundant, and safe if your external power supply is disrupted
  • security at the entrance, so not everybody can enter
boutta
1

Perhaps take a look at Google's best practices:

They released a video about their "container approach" which, even if only very few companies have the money to do the same, has good ideas for smaller businesses (for example, focusing on very efficient power supplies and so on).

Jeff likes that subject (energy) too!

Also, Google released some information about their servers which is quite interesting; for example, the embedded UPS in each server.

paulgreg
1

Add to existing lists:

Toolkits - one larger one (power screwdrivers are always good) for server-room work, one portable one for taking around to PCs. Never mix them, and you will always have the tool you need at hand.

Filing cabinet - good for manuals, printouts of receipts, license agreements, etc.

Cable management and patch panels are a must - start your room out with everything going to patch panels and your life will be easier in the long run when you add to the system.

And lastly, take the time to set it up right the first time - what others have mentioned about spacing the racks, tables, shelves apart, cable management, etc. If you start out with a complete wreck of a room, it's probably going to just get worse over time.

Schmitty23
0

My list is:

  • Physical space
  • Physical security
  • Sufficient cooling
  • Sufficient power to run everything (a back-of-the-envelope budget sketch follows below)
  • Sufficient network bandwidth

These are the things that are hardest to change at a later date and so you need to get them right early.
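
As a worked example of the power item in the list above, a back-of-the-envelope budget. The wattages are assumptions - read the nameplate draw on your own hardware - and the 80% derating reflects common US practice for continuous loads on a breaker:

```python
# Rough power budget for a small server room (illustrative numbers).
loads_w = {
    "server": (5, 400),  # quantity, assumed watts each
    "switch": (2, 60),
    "router": (1, 40),
    "misc":   (1, 100),  # monitor, KVM, chargers...
}

total_w = sum(qty * watts for qty, watts in loads_w.values())
amps = total_w / 120                       # assuming 120 V circuits
usable_per_circuit = 15 * 0.8              # 15 A breaker derated to 80%
circuits = -(-amps // usable_per_circuit)  # ceiling division

print(f"{total_w} W ~= {amps:.1f} A -> {int(circuits)} dedicated 15 A circuit(s)")
```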

David Locke
0

Not a great deal to add to the excellent answers above, but if possible I'd add a second server room, for backup/redundancy.

Provided it's far enough from, or protected enough against, whatever might take out the other server room, you gain a great degree of protection against the less serious disasters that may occur - a server fire causing a whole-rack failure, for example.

Naturally, this does not replace a full disaster recovery plan or adequate backups - but it does provide a cheap second layer of defence if required (a minimal replication sketch follows).
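
As one concrete, hedged illustration of that second layer: a nightly mirror of a backup directory over to the second room - here just rsync over SSH, with made-up hostnames and paths:

```python
#!/usr/bin/env python3
"""Mirror local backups to a machine in the second server room.
Sketch only - the path and host are placeholders, and as the answer
says, this complements rather than replaces real backups."""
import subprocess

subprocess.run(
    ["rsync", "-a", "--delete",
     "/srv/backups/",                        # trailing slash: copy directory contents
     "room2.example.com:/srv/backup-mirror/"],
    check=True,
)
```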

Mike Pountney