Free Cooling: the Server Side of the Story
by Johan De Gelas on February 11, 2014 7:00 AM EST
Posted in: Cloud Computing, IT Computing, Ivy Bridge EP
Hurdles for Free Cooling
It is indeed a lot easier for Facebook, Google and Microsoft to operate data centers with "free cooling". After all, the servers inside those data centers are basically "expendable"; there is no need to make sure that an individual server does not fail. The applications running on top of those servers can handle an occasional server failure easily. That is in sharp contrast with a data center that hosts servers of hundreds of different customers, where the availability of a small server cluster is of the utmost importance and regulated by an SLA (Service Level Agreement). The internet giants also have full control over both facilities and IT equipment.
There are other concerns, and humidity is one of the most important. With too much humidity, your equipment is threatened by condensation; conversely, if the data center air is too dry, electrostatic discharge can wreak havoc.
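The condensation side of that humidity concern comes down to dew point: any surface colder than the dew point of the surrounding air will collect condensate. A minimal sketch using the Magnus approximation (an illustrative formula choice on my part, not anything a particular data center operator is documented to use):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (degrees C) via the Magnus formula.

    a and b are the common Magnus parameters, valid roughly from
    -45 C to 60 C. This is a back-of-the-envelope approximation,
    not a data-center-grade psychrometric model.
    """
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float,
                      rel_humidity_pct: float) -> bool:
    """A surface at or below the air's dew point will collect condensate."""
    return surface_temp_c <= dew_point_c(air_temp_c, rel_humidity_pct)
```

For example, air at 30°C and 50% relative humidity has a dew point around 18°C, so a 15°C chilled-water pipe in that room would sweat while a 25°C surface would stay dry.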
Still, the humidity of the outside air does not have to rule out free cooling, as many data centers can be outfitted with a water-side economizer. Cold water replaces the refrigerant, and pumps and a closed circuit replace the compressor. The hot return water passes through the outdoor pipes of the heat exchangers; if the outdoor air is cold enough, the water-side system can cool the water back down to the desired temperature.
Google's data center in Belgium uses water-side cooling so well that it does not need any additional cooling. (source: Google)
Most "free cooling" systems are really "assisting" cooling systems: in many situations they cannot guarantee, all year round, the typical 20-25°C (68-77°F) inlet temperature that CRACs (Computer Room Air Conditioners) can offer.
All you need is ... a mild climate
But do we really need to guarantee a rather low 20-25°C inlet temperature for our IT equipment all year round? It is an important question, because large parts of the world could rely on free cooling if the server inlet temperature did not need to be that low.
The Green Grid, a non-profit organization, uses data from the Weatherbank to calculate how many hours per year a data center can use air-side "free cooling" while keeping the inlet temperature below 35°C. To make this more tangible, they publish the data as color-coded maps. Dark blue means that air-side economizers can be efficient for 8500 hours per year, which is basically year round. Here is the map of North America:
About 75% of North America can use free cooling if the maximum inlet temperature is raised to 35°C (95°F). In Europe, the situation is even better:
Although I have my doubts about the accuracy of the map (the south of Spain and Greece see a lot more hot days than the south of Ireland), it looks like 99% of Europe can make use of free cooling. So how do our current servers cope with an inlet temperature of up to 35°C?
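The hour counts behind such maps boil down to a simple threshold test over a year of hourly weather data. A minimal sketch of that idea (my own illustration, not The Green Grid's actual methodology; the `approach_c` parameter, modeling the temperature rise across the economizer's heat exchanger, is a hypothetical simplification):

```python
def free_cooling_hours(hourly_temps_c, max_inlet_c=35.0, approach_c=0.0):
    """Count the hours in a year of hourly outdoor temperatures for
    which outside air, plus an optional heat-exchanger approach
    temperature, stays at or below the allowed server inlet
    temperature. Simplified: humidity limits are ignored.
    """
    return sum(1 for t in hourly_temps_c if t + approach_c <= max_inlet_c)

# Toy example: a year (8760 hours) that is 20 C except for 260 hot
# 40 C hours would qualify for free cooling 8500 hours per year,
# matching the "dark blue" band on the map.
year = [20.0] * 8500 + [40.0] * 260
hours = free_cooling_hours(year)
```

With real data you would feed in 8760 hourly readings from a weather station; raising `max_inlet_c` from 25°C to 35°C is exactly what expands the qualifying regions on the maps above.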
Comments
extide - Tuesday, February 11, 2014
Yeah, there is a lot of movement in this area these days, but the hard part is that at the low voltages used in servers (<=24 V) you need a massive amount of current to feed several racks of servers, so you need massive power bars, and of course you can lose a lot of efficiency on that side as well.
drexnx - Tuesday, February 11, 2014
AFAIK, the Delta DC stuff is all 48 V, so a lot of the old telecom CO equipment is already tailor-made for use there.
but yes, you get to see some pretty amazing buswork as a result!
Ikefu - Tuesday, February 11, 2014
Microsoft is building a massive data center in my home state just outside Cheyenne, WY. I wonder why more companies haven't done this yet? It's very dry, and days above 90°F are few and far between in the summer. Seems like an easy cooling solution versus all the data centers in places like Dallas.
rrinker - Tuesday, February 11, 2014
Building in the cooler climes is great, but you also need the networking infrastructure to support said big data center. Heck, for free cooling, build the data centers in the far frozen reaches of Northern Canada, or in Antarctica. Only, how will you get the data to the data center?
Ikefu - Tuesday, February 11, 2014
It's actually right along the I-80 corridor that connects Chicago and San Francisco. Several major backbones run along that route, which is why many mega data centers in Iowa are also built along I-80. Microsoft and the NCAR Yellowstone supercomputer are there, so the large pipe is definitely accessible.
darking - Tuesday, February 11, 2014
We've used free cooling in our small datacenter since 2007. It's very effective from September to April here in Denmark.
beginner99 - Tuesday, February 11, 2014
That map of Europe is certainly plain wrong. Spain especially, but also Greece and Italy, easily have some days above 35°C. It even happens a couple of days per year where I live, a lot farther north than any of those.
ShieTar - Thursday, February 13, 2014
Do you really get 35°C, in the shade, outside, for more than 260 hours a year? I'm sure it happens for a few hours a day in the two hottest months, but the map does cap out at 8500 of 8760 hours.
juhatus - Tuesday, February 11, 2014
What about wear and tear from running the equipment at hotter temperatures? I remember seeing a chart where higher temperature = shorter life span. I would imagine the OEMs have engineered a bit of margin over this, and warranties aside, it should be basic physics?
zodiacfml - Wednesday, February 12, 2014
You just need a constant temperature and equipment that works at that temperature. Wear and tear happens mostly when the temperature changes.