The Environmental Protection Agency released its long-awaited Energy Star label for data centers on Monday, a move likely to spark a new round of data-center energy-efficiency improvements.

The initiative should be well received by IT managers, who have long awaited a standard benchmark for energy consumption. "The industry will be happy with some forward movement," acknowledges Cory Stine, director of data-center services at the engineering firm Bluestone Energy Services of Norwell, MA.

However, the decision will focus attention on the failings of the selected benchmark, known as the Power Usage Effectiveness metric, and will trigger a debate over how best to calculate efficiency.

The EPA decision has been expected for almost a year, and some data center improvements have languished in anticipation. "The industry needed a little bit of a catalyst," concedes Mark Harris, vice president of product marketing at San Francisco data-center software supplier Modius.

With the Energy Star label now a corporate goal, centers will likely begin to compete with one another. The EPA's plan is to award the label only to the top quarter of centers in a particular industry. The agency has also put safeguards against gaming into place: PUE calculations must be verified by licensed professionals and examined by the agency before being accepted.

Data center experts say the EPA will need six to nine months to develop industry-by-industry benchmarks. The EPA didn't immediately respond with a schedule.

The trouble is that the PUE is not the only, or necessarily the best, method for gauging data-center energy efficiency. The metric measures how much of a data center's power reaches the computing equipment itself, rather than being lost to overhead such as cooling and power distribution. It is expressed as the ratio of total facility power to computing power.
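
The arithmetic behind the ratio is straightforward; a minimal sketch (the function and variable names here are illustrative, not part of any standard):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the ideal (every watt reaches computing equipment); higher
    values mean more power is lost to cooling, distribution, and lighting.
    """
    return total_facility_kw / it_equipment_kw

# A center drawing 1,210 kW overall to power 1,000 kW of IT gear:
print(pue(1210, 1000))  # 1.21
```

The same 1.21 ratio is the figure Google publicizes for its own centers, as noted later in this article.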

While it is relatively simple and easily understood, the metric falls short in at least one key area: it doesn't ensure that the energy accomplishes useful work, such as processing transaction volume. A server might be idling, not handling a request, and its energy would still be counted as effective power.

A more targeted benchmark under development -- the DCeP, or data center energy productivity -- compares useful work to total power consumption, but has proved a difficult formula to derive. It is likely several years away.

The PUE "is a good first step," says Harris. "The issue is that five years ago we had nothing." Now there is something.

But the debate is sure to continue. And it will spread publicly, instead of remaining confined to the Green Grid, which proposed the use of the PUE. Now that companies have an incentive, they will take an interest in how the calculation is performed. The EPA also may find itself under pressure to come up with a set of best practices.

Without detailed guidelines, the PUE can be gamed, asserts Stine. Every center needs to report total power in the same way, for instance.
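
A toy comparison (all figures hypothetical) shows why consistent reporting matters: two identical centers end up with very different PUEs if one quietly leaves its cooling load out of "total power."

```python
IT_LOAD_KW = 1000.0      # power reaching servers, storage, network gear
COOLING_KW = 400.0       # chillers and CRAC units
DISTRIBUTION_KW = 100.0  # UPS and transformer losses

honest_total = IT_LOAD_KW + COOLING_KW + DISTRIBUTION_KW
gamed_total = IT_LOAD_KW + DISTRIBUTION_KW  # cooling excluded

print(honest_total / IT_LOAD_KW)  # 1.5
print(gamed_total / IT_LOAD_KW)   # 1.1 -- looks far better on paper
```

Nothing about the facility changed between the two calculations; only the reporting boundary did, which is exactly the loophole Stine wants guidelines to close.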

Clearly, the new Energy Star label has broad consequences. Data centers are significant consumers of energy, together accounting for 1.5 percent of U.S. electricity use. The industry's energy bill, already $4.5 billion, is expected to nearly double over the next five years.

Up until now, data centers have been addressing energy efficiency in a piecemeal manner. Blade servers might replace dedicated boxes, old routers could be swapped out for new ones, and IT managers might slow the fan speeds in their chillers. But many centers never calculated their PUE rating, despite the attention Google brought to the measure by publicizing its own. (Google claims its PUE is 1.21 -- a rating of 1 is perfectly efficient.)

Now there is an incentive to view data-center energy management more holistically. It comes not a moment too soon.
