Case Study: Power struggle: How IT managers cope with data centre power demands (part 2)

CIOs say power and cooling are their biggest data centre problems

This is part two of a two-part article. The first half was published yesterday.

The other limiting factor is cooling. At both ILM and Trinity, the equipment with the highest power density is the blade servers. Trinity uses eight-foot-tall racks. "They're like furnaces. They produce 120-degree heat at the very top," Roberts says. Such racks can easily top 20kW today, and densities could exceed 30kW in the next few years.

What's more, for every watt of power used by IT equipment in data centres today, another watt or more is typically expended to remove waste heat. A 20kW rack requires more than 40kW of power, says Brian Donabedian, an environmental consultant at HP. In systems with dual power supplies, additional power capacity must be provisioned, boosting the power budget even higher. But power distribution problems are much easier to fix than cooling issues, Donabedian says, and at power densities above 100W per square foot, the solutions aren't intuitive.
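To make that arithmetic concrete, here is a minimal sketch assuming one watt of cooling per watt of IT load and a second, equally sized feed reserved for dual-corded equipment. The function and figures are illustrative only, not Donabedian's model:

```python
# Rough, illustrative sketch of the provisioning arithmetic described above.
# The specific numbers are assumptions for illustration, not measured data.

def provisioned_power_kw(it_load_kw, cooling_overhead=1.0, redundant_feeds=True):
    """Estimate the power capacity a rack ties up.

    cooling_overhead: watts of cooling per watt of IT load (at least ~1.0
                      in the article's example).
    redundant_feeds:  dual-corded/dual-supply systems reserve a second feed
                      sized to carry the full IT load on its own (a
                      simplifying assumption here).
    """
    capacity = it_load_kw * (1 + cooling_overhead)
    if redundant_feeds:
        capacity += it_load_kw  # second feed reserved but not normally drawn on
    return capacity

# A 20 kW blade rack with one watt of cooling per watt of IT load:
print(provisioned_power_kw(20, redundant_feeds=False))  # 40.0 kW, matching the figure in the text
print(provisioned_power_kw(20))                         # 60.0 kW once a redundant feed is reserved
```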

For example, a common mistake data centre managers make is to place exhaust fans above the racks. But unless the ceiling is very high, those fans can make the racks run hotter by interfering with the operation of the room's air conditioning system. "Having all of those produces an air curtain from the top of the rack to the ceiling that stops the horizontal airflow back to the AC units," Roberts says.

Trinity addressed the problem by using targeted cooling. "We put in return air ducts for every system, and we can point them to a specific hot aisle in our data centre," he says.

ILM spreads the heat load by spacing the blade server racks in each row. That leaves four empty cabinets per row, but Bermender says he has the room to do that right now. He also thinks an alternative way to distribute the load -- partially filling each rack -- is inefficient. "If I do half a rack, I'm losing power efficiency. The denser the rack, the greater the power savings overall because you have fewer fans," which use a lot of power, he says.
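Bermender's fan argument comes down to a fixed chassis overhead being shared by more blades. A small illustrative sketch, using an assumed chassis capacity and fan draw rather than ILM's actual figures:

```python
# Illustrative-only sketch of the point above: shared chassis fans are a fixed
# overhead, so packing blades densely spreads that overhead over more servers.
# All figures below are assumptions, not ILM's numbers.
import math

BLADES_PER_CHASSIS = 16        # assumed chassis capacity
FAN_POWER_PER_CHASSIS_W = 400  # assumed shared fan draw per powered chassis

def fan_overhead_w(total_blades, fill_fraction):
    """Total fan power when each chassis is filled to `fill_fraction` of capacity."""
    chassis_needed = math.ceil(total_blades / (BLADES_PER_CHASSIS * fill_fraction))
    return chassis_needed * FAN_POWER_PER_CHASSIS_W

print(fan_overhead_w(128, 1.0))  # 3200 W with fully packed chassis
print(fan_overhead_w(128, 0.5))  # 6400 W when each chassis is only half filled
```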

Bermender would also prefer not to use spot cooling systems like IBM's Cool Blue, because they take up floor space and result in extra cooling systems to maintain. "Unified cooling makes a big difference in power," he says. Ironically, many data centres have more cooling than they need but still can't cool their equipment, says Donabedian. He estimates that by improving the effectiveness of air-distribution systems, data centres can save as much as 35 per cent on power costs.

Before ILM moved, the air conditioning units, which were positioned opposite each other in the room, created dead-air zones under the 12-inch raised floor. Seven years of moves and changes had left a subterranean tangle of live and abandoned power and network cabling that was blocking airflow. At one point, the staff powered down the entire data centre over a holiday weekend, moved out the equipment, pulled up the floor and spent three days removing the unused cabling and reorganising the rest. "Some areas went from 10 [cubic feet per minute] to 100 cfm just by getting rid of the old cable under the floor," Bermender says.

Even those radical steps provided only temporary relief, because the room was so overloaded with equipment. Had ILM not moved, Bermender says, it would have been forced to move the data centre to a co-location facility. Managers of older data centres can expect to run into similar problems, he says.

That suits Marvin Wheeler just fine. The chief operations officer at Terremark Worldwide manages a 600,000-square-foot co-location facility designed to support 100 watts per square foot.

"There are two issues. One is power consumption, and the other is the ability to get all of that heat out. The cooling issues are the ones that generally become the limiting factor," he says.

With 24-inch floors and 20-foot-high ceilings, Wheeler has plenty of space to manage airflows. Terremark breaks floor space into zones, and airflows are increased or decreased as needed. The company's service-level agreements cover both power and environmental conditions such as temperature and humidity, and it is working to offer customers Web-based access to that information in real time.
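As a rough idea of what a per-zone power and environmental check against such an SLA might involve, here is a hypothetical sketch; the record layout, thresholds and field names are assumptions for illustration, not Terremark's actual system:

```python
# Hypothetical sketch of a per-zone SLA check of the kind described above.
# Thresholds, names and structure are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class ZoneReading:
    zone: str
    power_kw: float
    temp_c: float
    humidity_pct: float

# Assumed per-zone SLA limits.
SLA = {"max_power_kw": 200.0, "max_temp_c": 27.0, "humidity_range": (40.0, 60.0)}

def sla_violations(reading: ZoneReading) -> list[str]:
    """Return the SLA conditions this reading breaches."""
    problems = []
    if reading.power_kw > SLA["max_power_kw"]:
        problems.append("power draw above committed capacity")
    if reading.temp_c > SLA["max_temp_c"]:
        problems.append("temperature above SLA limit")
    low, high = SLA["humidity_range"]
    if not (low <= reading.humidity_pct <= high):
        problems.append("relative humidity outside SLA band")
    return problems

print(sla_violations(ZoneReading("zone-07", power_kw=215.0, temp_c=26.1, humidity_pct=38.0)))
```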

Terremark's data centre consumes about 6MW of power, but a good portion of that goes to support dual-corded servers. Thanks to redundant power designs, "we have tied up twice as much power capacity for every server," Wheeler says.

Terremark hosts some 200 customers, and the equipment is distributed based on load. "We spread out everything. We use power and load as the determining factors," he says.

But Wheeler is also feeling the heat. Customers are moving to 10- and 12-foot-high racks, in some cases increasing the power density by a factor of three. Right now, Terremark bills based on square footage, but he says co-location companies need a new model to keep up. "Pricing is going to be based more on power consumption than square footage," Wheeler says.
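The shift Wheeler describes is easy to see in a toy comparison of the two billing models; the rates and rack figures below are assumptions, not Terremark's pricing:

```python
# Illustrative comparison of per-square-foot versus per-kW billing.
# The rates and loads are made-up assumptions, not Terremark's pricing.

SQ_FT_RATE = 30.0    # assumed monthly charge per square foot
PER_KW_RATE = 150.0  # assumed monthly charge per kW of provisioned power

def monthly_bill(footprint_sq_ft, provisioned_kw, by_power=False):
    return provisioned_kw * PER_KW_RATE if by_power else footprint_sq_ft * SQ_FT_RATE

# Two customers with the same footprint but very different power densities:
for kw in (5, 25):
    print(kw, monthly_bill(100, kw), monthly_bill(100, kw, by_power=True))
# Under per-square-foot billing both pay the same; per-kW billing tracks the
# capacity each customer actually ties up.
```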

According to EYP's Gross, the average power consumption per server rack has doubled in the past three years. But there's no need to panic—yet, says Donabedian.

"Everyone gets hung up on the dramatic increases in the power requirements for a particular server," he says. But they forget that the overall impact on the data centre is much more gradual, because most data centres only replace one-third of their equipment over a two- or three-year period.

Nonetheless, the long-term trend is toward even higher power densities, says Gross. He points out that 10 years ago, mainframes ran so hot that the systems moved to water cooling before a change from bipolar to more efficient CMOS technology bailed them out.

"Now we're going through another ascending growth curve in terms of power," he says. But this time, Gross adds, "there is nothing on the horizon that will drop that power."

