65

When I was working in our server room, I noticed that it was very cold.

I know that the server room has to be cold to offset the heat of the servers, but perhaps it is TOO cold.

What is an appropriate temperature to keep our server room at?

Fahad Sadah
  • 1,496
  • 11
  • 21
freddiefujiwra
  • 1,627
  • 5
  • 25
  • 32
  • 1
    It depends. Many modern servers are perfectly happy with a 45 degree Celsius operating temperature. UPSes have to go out, though - batteries do not like that. But for servers... that may be OK. Old machines choke on that temperature. – TomTom Oct 18 '12 at 12:53

16 Answers

54

Recommendations on server room temperature vary greatly.

This guide says that:

General recommendations suggest that you should not go below 10°C (50°F) or above 28°C (82°F). Although this seems a wide range these are the extremes and it is far more common to keep the ambient temperature around 20-21°C (68-71°F). For a variety of reasons this can sometimes be a tall order.

This discussion on Slashdot has a variety of answers but most of them within the range quoted above.

Update: As others have commented below, Google recommends 26.7°C (80°F) for data centres.

Also, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has recently updated its recommended temperature range to 18°C-27°C (64.4°F-80.6°F).

However, this article again highlights that there is still no consensus on the subject. From the article, I would highlight that:

...nudging the thermostat higher may also leave less time to recover from a cooling failure, and is only appropriate for companies with a strong understanding of the cooling conditions in their facility.

IMO most companies would not have such a strong understanding of their cooling conditions, and thus it would be safer in a small-business environment to run the rooms a little cooler.

NB: There are a lot more factors to consider in a server/data room than just temperature; air flow and humidity, for example, are also important concerns.
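To make the quoted numbers concrete, here is a minimal sketch of the range check they imply. The thresholds come straight from this answer (10-28°C extremes, 18-27°C ASHRAE recommended); the function name and how you obtain the ambient reading are purely illustrative assumptions.

    # Sketch: classify an ambient server-room temperature against the ranges
    # quoted above. How the reading is obtained (thermostat, sensor feed,
    # monitoring system) is left out and will vary per site.
    ASHRAE_RECOMMENDED = (18.0, 27.0)   # degrees C, ASHRAE recommended envelope
    QUOTED_EXTREMES = (10.0, 28.0)      # degrees C, extremes quoted from the guide

    def classify_ambient(temp_c):
        """Return a rough verdict for an ambient temperature in degrees C."""
        if temp_c < QUOTED_EXTREMES[0] or temp_c > QUOTED_EXTREMES[1]:
            return "ALARM: outside the quoted extremes"
        if ASHRAE_RECOMMENDED[0] <= temp_c <= ASHRAE_RECOMMENDED[1]:
            return "OK: within the ASHRAE recommended range"
        return "WARN: within the extremes but outside the recommended range"

    if __name__ == "__main__":
        for reading in (16.0, 21.0, 27.5, 29.0):
            print("%.1f C -> %s" % (reading, classify_ambient(reading)))

In practice you would alarm well inside the extremes, for the cooling-failure reaction time discussed above.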

Phil Hollenback
  • 14,647
  • 4
  • 34
  • 51
mundeep
  • 916
  • 8
  • 10
  • 1
    Google is actually recommending looking into even bumping it up to 80°F for energy savings. There's a good article here: http://www.datacenterknowledge.com/archives/2008/10/14/google-raise-your-data-center-temperature/ – gharper May 22 '09 at 20:34
  • 1
    Thanks, I'll update the answer to mention that; my original answer was more geared at everyday small-office server rooms - where IMO more leeway in case of cooling failure may be required (or a greater focus on airflow, etc.). – mundeep May 23 '09 at 05:37
  • 36
    Something important to keep in mind is that Google operates on an expected failure model - instead of expensive, fault tolerant servers like most companies have, Google instead uses highly customized fault tolerant software. Google fully expects its servers to die on a regular basis (but for the software to route around the failure). So in Google's costing model it may make sense to run their server rooms at 80 degrees, because the increase in (very cheap) hardware failures is easily offset by the energy savings. Are a few dead $6,000 servers worth some savings on electricity in your company? – David May 25 '09 at 11:48
  • What's wrong with <50°F? I'm curious, because I know a couple of datacenters in 'extreme' environments that just vent to the outside for cooling... and does anyone remember Kryotech? The first company to get 1GHz consumer PCs, by super-cooling the Athlon 600MHz processors to -40°. – warren Oct 07 '09 at 04:50
10

Google's datacenter best practices recommend 80°F:

Raising the cold aisle temperature will reduce facility energy use. Don't try to run your cold aisle at 70F; set the temperature at 80F or higher -- virtually all equipment manufacturers allow this.

We run at 72°F, but then again I don't exactly trust that our own room was designed with airflow in mind.

Franck Dernoncourt
  • 940
  • 1
  • 12
  • 28
jldugger
  • 14,122
  • 19
  • 73
  • 129
8

As others have said, somewhere in the low 70s (°F) is good. However, it's even more critical to make sure the rack and the hardware in it can "breathe". If hot air is trapped in the rack - or in a server chassis - the low ambient temperature won't really do any good.

4

My server room is set to 69 degrees. We have one air conditioning unit that services that room and it runs 24/7 to keep the room at 69 degrees year round.

GregD
  • 8,713
  • 1
  • 23
  • 35
4

All server rooms I've seen are usually between 20°C and 25°C, but from experience I've noticed hardware is more sensitive to variations than to any given temperature. I've often seen hardware fail after a bump of, say, 4-5°C, even if it is from 20 to 25.

So stability is key, as well as air flow, of course.

anto1ne
  • 41
  • 1
4

18°C (65°F). It's a bit colder than it has to be, but if something fails it gives us a few precious extra minutes to react before it gets uncomfortably hot.

Commander Keen
  • 1,253
  • 7
  • 11
4

https://en.wikipedia.org/wiki/Data_center_environmental_control contains an interesting overview of vendor datacenter temperature recommendations.

[Table of vendor recommendations omitted; see the linked article.]

Franck Dernoncourt
  • 940
  • 1
  • 12
  • 28
2

We like ours to be between 19°C and 23°C, with alarms at 28°C - we're a 98% HP blade shop, so if ours gets to that alarm level certain servers/enclosures 'turn their wick down' to lower overall power draw (it's called Thermal Logic).

Chopper3
  • 100,240
  • 9
  • 106
  • 238
2

Remember that Google's advice of 80°F assumes virtually no cost to shutting down a datacenter when it overheats due to unexpected load or air-conditioning failure. It also assumes greater control of airflow over critical components.

carlito
  • 2,489
  • 18
  • 12
2

I'm in the Navy. On our ship we kept our rooms at less than 65 degrees Fahrenheit. We'd be in the middle of the Red Sea and it would be 120 degrees outside. We'd walk out of the space (at 62 degrees) with a sweater, gloves, a coat and a peacoat, and everyone would look at us like we were crazy. Everyone else was stripped down to their T-shirts and sweating like crazy.

The problem is, our cooling system does not remove humidity very well. So, if it's humid outside and the temp goes up to 70 degrees or more, it starts to get sticky in the space.

Justin
  • 21
  • 1
  • 6
    Hi Justin, welcome to ServerFault. What you've written here is a fun anecdote, but it's not really a great answer to the question. An answer would have included things to consider and/or best practice recommendations. You might want to have a look at this: http://serverfault.com/questions/how-to-answer – quux Sep 12 '11 at 23:53
  • Poor Justin, a decade later, a deserved upvote. Re what @quux said, there is no such admonition for another answer essentially similar to yours by @Chopper3 nearby, which likewise just describes their situation without "things to consider and/or best practice recommendations" and minus any interesting anecdotes (and useful points such as your cooling system not removing humidity very well). Welcome to stackexchange dystopia :v – user2297550 Jul 14 '21 at 13:42
  • Also better than @gregd's answer nearby, which likewise didn't earn @quux's criticism. – user2297550 Jul 14 '21 at 13:47
1

In my experience, it depends on the equipment. I knew one company that had three server rooms. One was kept at about 80°F; another, with fewer servers, was at about 72°F; and the third, with only two servers, was at about 65°F. They never had a problem in the time I was there, but, like others have said, 75°F or slightly below is probably best, since it gives a little elbow room if the AC goes out.

Joshua Nurczyk
  • 738
  • 6
  • 17
0

Anything over 30°C at air inlets, we've found, causes temperature warnings (which we monitor - see the sketch below); in general that means keeping the ambient room temperature below 25°C.

In Australia it seems Emerson (Liebert) decided that 21°C would be the standard, and every room I've been in here has been set the same.

If your room is big enough, look at hot-aisle/cold-aisle containment, blanking panels, and similar; they can really improve cooling efficiency.
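A minimal sketch of that inlet check, assuming ipmitool is available and the host's BMC exposes an inlet/ambient temperature sensor; the sensor-name matching and the output parsing below are assumptions, since field layouts and names vary by vendor.

    # Sketch: warn when any inlet/ambient IPMI temperature sensor reads over 30 C.
    # Assumes ipmitool is installed and a local BMC is reachable; sensor names
    # and columns vary by vendor, so the parsing here is illustrative only.
    import subprocess

    THRESHOLD_C = 30.0  # warning threshold quoted in this answer

    def inlet_temperatures():
        out = subprocess.run(
            ["ipmitool", "sdr", "type", "Temperature"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            fields = [f.strip() for f in line.split("|")]
            # Typical row: "Inlet Temp | 04h | ok | 7.1 | 23 degrees C"
            if len(fields) == 5 and "degrees C" in fields[4] and \
                    any(k in fields[0].lower() for k in ("inlet", "ambient")):
                yield fields[0], float(fields[4].split()[0])

    if __name__ == "__main__":
        for name, temp in inlet_temperatures():
            status = "WARN" if temp > THRESHOLD_C else "ok"
            print("%s: %s = %.1f C" % (status, name, temp))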

LapTop006
  • 6,466
  • 19
  • 26
0

There is no hard and fast answer to this question. It depends a lot on the type of computers and the design of the datacenter.

One of the key points made in that Slashdot discussion is that the real difficulty is dealing with hotspots and airflow.

A cool ambient temperature in the room is an easy way to cool down the thousands of hotspots in the room. But it's inefficient: the datacenter cools the whole room when all that's really needed is to cool the hot parts of the servers (CPUs, memory, storage, power supplies, etc.). Many colocation facilities have customers with disparate types of hardware (115V or 208V, good designs, bad designs).

A datacenter also needs to maintain enough cool air to provide a buffer in the event of a power outage. A buffer of cool air means that the UPS and generators work less. Imagine that a datacenter loses 100% of power (all nodes go down, cooling also goes down) all at once. The nodes may not have power, but the hardware takes time to cool and keeps radiating heat for a while, and this can make a datacenter very hot.

Organizations with warm server rooms often use closed cabinets and fancy ductwork to direct cool air directly over the hot nodes. A specialized power infrastructure can help to reduce the heat from power conversion. This keeps the nodes cool enough, even though the ambient temperature in the server room is warm. These organizations usually have tight control over most aspects of their hardware, and can custom design the entire datacenter for their needs. They can make investments and reap the benefits.

Stefan Lasiewski
  • 22,949
  • 38
  • 129
  • 184
0

First: where are you measuring the room temperature? ;-) What is the rack cold-side temp? What is the rack hot-side temp? What is the A/C intake temp?

It really depends on the devices (feel the heat that comes out of an Xbox). Thermal management is a science - look, Honda (and many others) are starting to water-cool exhausts ;-) ...

I will tell you this about the datacenters I've been in:

Equinix keeps their DCs cold (makes for a good 'experience' when execs go on tour).

RW keeps them warm/hot, 80°F or so, with pretty chilly air coming out of the floor (better room/vault design and higher-quality system design overall).

CoreSite (at least where I have been) is closer to RW, but it depends on the CR and the customers. Some CRs are at 80, some at 70 (cold aisles always cold).

sirmonkey
  • 76
  • 1
  • 1
  • 6
-1

The temp you're really looking to monitor is the internal server temperature - not the external ambient temperature. Watch your server temperature (all servers have internal thermometers that report through the server interface) and adjust your ambient temperature accordingly. This will account for air flow, etc. Having a back-up cooling plan is a good idea; even portable A/C units can be lifesavers if there is a primary cooling-system failure.
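As a rough illustration of watching those internal thermometers (a minimal sketch, not the poster's setup): on Linux, the psutil library exposes the kernel's hardware temperature sensors. The "margin below the sensor's own high threshold" logic is an assumption, and IPMI/BMC or vendor tools are the more usual route for out-of-band monitoring.

    # Sketch: read the host's internal temperature sensors (Linux/FreeBSD via
    # psutil) and flag any that are close to their reported "high" threshold.
    # Sensor availability and labels depend on the hardware and kernel drivers.
    import psutil

    MARGIN_C = 10.0  # illustrative safety margin below the sensor's high value

    def check_internal_temps():
        readings = psutil.sensors_temperatures()  # may be empty; Linux/FreeBSD only
        if not readings:
            print("No temperature sensors exposed on this platform.")
            return
        for chip, sensors in readings.items():
            for s in sensors:
                label = s.label or chip
                if s.high is not None and s.current >= s.high - MARGIN_C:
                    print("WARN: %s at %.1f C (high=%.1f)" % (label, s.current, s.high))
                else:
                    print("ok:   %s at %.1f C" % (label, s.current))

    if __name__ == "__main__":
        check_internal_temps()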

Jason
  • 11
  • 7
    That's pretty bad advice actually. Good thing the OP probably won't see it. But ignoring ambient temperature is a great way to turn all your servers into expensive door stops when you cool the room below the dew point and get water condensing on all the metal surfaces. I highly recommend it for sabotage operations, as it's devastating and virtually impossible to prove intent. – HopelessN00b Aug 25 '12 at 05:51
-4

I have found 19-21°C a good temperature. Google runs its server rooms at a slightly higher temp than you would expect, as they found the slightly higher failure rate preferable to paying the extra for cooling.

Peter
  • 11