How to keep your datacentre cool next summer

There are some simple tools and best practices that IT departments can employ to keep datacentre temperatures — and budgets — under control

With summer coming, it is natural for IT shops to be concerned about keeping their datacentres cool. But with limited budgets, keeping the datacentre operating efficiently without blowing the utility bill out of the water can be a challenge. It turns out there are some simple tools and best practices that IT departments can employ to keep datacentre temperatures -- and budgets -- under control.

Mind the gaps

First and foremost, say industry professionals, the journey to datacentre cooling efficiency starts with an assessment of the physical integrity of the facility. While it seems like an obvious fix, all too often datacentres have gaps throughout that allow air to escape and penetrate the facility.

"I've been in datacentres that have folding, garage-like doors with two centimetre gap under them," says Joe Capes, business development director for cooling at Schneider Electric, a global provider of energy management products and services.

In addition to leaky windows and doors, another common drain on cooling efficiency is ventilation or ceiling tiles that have been removed and not replaced. "One of the simplest things to do is properly seal the room," Capes adds, as a means of "preventing any type of penetration of outside air through doorways, windows, and so forth." Brush strips, blanking panels and even curtains can be deployed to provide what is essentially weather-stripping. Pay particular attention to cable cutouts and the space under CRAC units.

For datacentres in locations where humidity is of particular concern, it's critical to ensure that the vapor barrier -- the plastic or metal coating for the walls, ceiling and floors that keeps moisture at bay -- remains intact. Many datacentres were designed and built with vapor barriers; over time, as equipment is brought in and moved around, the holes that are punched throughout the facility to accommodate conduits compromise the vapor barrier, thereby allowing humidity in and out.

While a vapor barrier provides some protection against air flow leakage, this is not the primary benefit of keeping the barrier intact. "Depending on how precisely you want to maintain the humidity in your space and not waste a lot of energy to humidify or dehumidify it, the vapor barrier becomes critical," according to Dave Kelly, director of application engineering for precision cooling products at Liebert, a business unit of Emerson Network Power.

Monitor and measure

It is tough to make efficiency improvements in datacentre operations without knowing how equipment is performing and where heat is generated. "In many datacentres, there is a lot of legacy equipment that is doing nothing," asserts Don Beaty, president of DLB Associates, an engineering firm in Eatontown, N.J., that designs and builds datacentres. As new equipment is installed, Beaty says it's fairly typical for existing equipment to remain on board.

This problem has its roots in the IT/facilities disconnect: IT people tend to keep equipment running out of fear of disrupting mission-critical operations, while staff on the facility side focus on energy issues. By installing monitoring equipment on racks and PDUs to measure the load and power consumption, a datacentre can then identify any equipment that is running inefficiently in terms of capacity or even unnecessarily in terms of the applications that are on it.

"Let's say five racks that are measured all have utilization down under 50 percent," Beaty says. "This would increase the possibility of considering virtualizing those five racks, or consolidating in another way to reduce power consumption."

Optimize the supply air temperature

For years, datacentres have operated under the premise that the cooler, the better. Today that's not always the case, even in summer. Yet because the solar load rises in the summer months, datacentre operators still tend to lower the set points on their AC units.

"The reason many do this is that they want to have a supply air temperature of 21 degrees, so as the temperature drifts up towards 23 degrees, they turn on the AC higher," Capes says. The most recent guidelines from ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) recommend a supply air temperature of up to 27 degrees, yet many datacentres continue to operate at much cooler temperatures.

"All too often, engineers or facilities operators run their datacentres based on the return air temperature of the CRAC unit," says Capes. And while there might be plausible reasons to do that, "all server manufacturers and ASHRAE care about is the supply air temperature to the inlet of the server."

In addition to raising the temperature on your CRAC units, Capes says it may be possible, and certainly simpler, to turn some of them off altogether. A datacentre that has 300 kW of cooling installed alongside a 400 kW UPS system running at 25 percent of capacity has three times the cooling it needs. "In some cases, you can put units on standby and have no effect on the environment at all," Capes says.
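
The arithmetic behind that example works out as follows, using the article's own figures:

```python
# Worked example of the oversizing arithmetic above, using the article's
# figures: 300 kW of installed cooling against a 400 kW UPS at 25 percent load.

installed_cooling_kw = 300.0
ups_capacity_kw = 400.0
ups_load_fraction = 0.25

it_load_kw = ups_capacity_kw * ups_load_fraction      # 100 kW of actual IT load
oversize_ratio = installed_cooling_kw / it_load_kw    # 3.0

print(f"IT load: {it_load_kw:.0f} kW")
print(f"Installed cooling is {oversize_ratio:.1f}x the current heat load")
```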

General housekeeping

Maximizing cooling efficiency also requires regular maintenance and perhaps a few interior changes.

If a datacentre has a raised floor, make sure the space under the floor is as clear as possible. Often when IT staff disconnect cables, they drop them below the floor, where they can inhibit air flow to the point that fans have to work extra hard. And to reduce the overall work an AC unit has to do, it makes sense to locate cooling as close to the workloads as possible, whether that means moving perimeter units to the end of a rack row or adding supplemental localized cooling.

Air conditioning units themselves also won't function efficiently when dirty, so make sure to clean the outdoor heat exchangers and the filters on indoor units. And if a datacentre has windows, drawing blinds or installing room-darkening film can reduce solar load. Lighting typically contributes about 4 percent of the total heat load of a datacentre, Capes says; adopting LED lighting is an easy way to achieve a quick payback.
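
For a rough sense of that payback, the sketch below applies the 4 percent lighting share to an assumed overall load; every cost and load figure in it is an illustrative assumption, and it ignores the additional cooling energy saved, so a real estimate needs site-specific numbers.

```python
# Rough LED retrofit payback sketch using the roughly 4 percent lighting share
# cited in the article. All load, cost and saving figures are illustrative
# assumptions, and the extra cooling energy saved is not counted.

total_heat_load_kw = 100.0        # assumed total datacentre heat load
lighting_share = 0.04             # lighting at roughly 4 percent of the heat load
led_saving_fraction = 0.5         # assume LEDs halve lighting power
electricity_cost_per_kwh = 0.15   # assumed tariff, dollars per kWh
retrofit_cost = 5000.0            # assumed cost of the LED retrofit

lighting_kw = total_heat_load_kw * lighting_share
saved_kw = lighting_kw * led_saving_fraction
annual_saving = saved_kw * 24 * 365 * electricity_cost_per_kwh

print(f"Lighting load: {lighting_kw:.1f} kW, of which {saved_kw:.1f} kW saved")
print(f"Annual saving: ${annual_saving:,.0f}; simple payback "
      f"{retrofit_cost / annual_saving:.1f} years")
```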

"It is astonishing how many datacentres I go into that have not done some really simple things and best practices to improve cooling efficiency," Capes says. By following just a few tips now, datacentres can achieve noticeable cooling improvements before summer's end.

* Megan Santosus is a business and technology writer based in Natick, Massachusetts.
