Sure, Facebook, Google, Yahoo and other internet giants can build their own super-efficient data centers, but what are the rest of us to do? The vast majority of data center operators don’t even know how much power they’re using per server, let alone how to optimize it.

Enter startups like Power Assure, which has announced a project with application delivery optimization company A10 Networks that illustrates the step-by-step work needed to squeeze more capacity and efficiency out of the data center environments most of us work with today.

Santa Clara, Calif.-based Power Assure is tackling a wide set of data center power challenges, including automating data center power management, establishing standards for measuring server power use across multiple use cases and balancing data center loads across entire regions.

But its project with A10, a maker of software and hardware that helps customers optimize application delivery, tackles a more prosaic challenge: helping a fast-growing company manage a homegrown data center that has outstripped the building’s ability to power it.

A10 doesn’t run massive data centers delivering huge internet applications to the world, Todd Kleppe, senior director of worldwide operations, told me in an interview last week. Rather, it runs a lab full of servers that are constantly testing its software’s ability to automate the web-serving capabilities of its customers, he said.

Those labs are modular, constantly switching servers in and out in a variety of test configurations, which makes it hard to settle on a standard efficiency approach, he said. At the same time, they’re running “insane amounts” of computing load to replicate customer environments, he said. “We need to generate an enormous volume of traffic just to make our devices sweat,” Kleppe added.

As a fast-growing startup, A10 didn’t have data center power management anywhere near the top of its priority list, he noted. So by the time the company realized it had a problem on its hands, its facility was already tapping out the building’s ability to deliver enough power to keep the lab running, he said.

The company turned to Power Assure, first, to “give us visibility into what was going on -- we were three blind mice,” he said. “Second, they had an enormous amount of expertise that, frankly, we did not have in-house.”

“They have helped us get a handle on our current data center, both in terms of power utilization and consumption, and what’s going on in it at any point in time,” he said. That, along with help in laying out the existing data center with hot-cold aisle containment and other typical efficiency upgrades, has allowed A10 to keep running its IT lab without having to buy or build new data center space, and to cut power use roughly in half, Kleppe said.

Brad Wurtz, president and CEO of Power Assure, called the A10 project a “very good example of typical problems in data centers and how to deal with them” -- even if it doesn’t use the entire range of his company’s capabilities, such as automating server functions to optimize energy use.

Let’s face it, most data center operators care far less about their power bills than about keeping their critical processes running. In fact, for most data centers, power consumption usually becomes a concern only when you can’t get any more of it, he said.

“The overwhelming goal is to avoid building that big new data center, because that’s where the big costs are,” he said -- about $1,000 to $2,000 per square foot in today’s market. Avoiding those capital costs makes a good business case for investing first in finding out how a data center uses power, and then in reducing it, he said.
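To put that business case in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The per-square-foot build cost is Wurtz’s figure above; the floor space avoided and the cost of the efficiency work are hypothetical placeholders, not A10 or Power Assure numbers.

```python
# Back-of-the-envelope business case: avoided build-out vs. efficiency spend.
# The $1,000-$2,000 per sq ft build cost comes from Wurtz; the other numbers
# are hypothetical placeholders for illustration only.

BUILD_COST_PER_SQFT = (1_000, 2_000)  # today's market range, per the article
SQFT_AVOIDED = 5_000                  # hypothetical: new space you no longer need to build
EFFICIENCY_PROJECT_COST = 750_000     # hypothetical: metering, containment and layout work

low = BUILD_COST_PER_SQFT[0] * SQFT_AVOIDED
high = BUILD_COST_PER_SQFT[1] * SQFT_AVOIDED

print(f"Avoided capex: ${low:,} to ${high:,}")
print(f"Efficiency spend: ${EFFICIENCY_PROJECT_COST:,}")
print(f"Net avoided cost: ${low - EFFICIENCY_PROJECT_COST:,} to ${high - EFFICIENCY_PROJECT_COST:,}")
```

Under those assumptions, even the low end of the avoided construction range comfortably outweighs the efficiency spend.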

Of course, Power Assure is not unique in seeking to help data center operators measure and understand how computing and power use are related. The market is being tackled with technology platforms from startups and server giants alike.

San Francisco-based startup Sentilla is building software to help customers predict and plan their data center build-out needs across hosted, virtualized and cloud computing environments. Viridity, the startup that Schneider Electric bought in December, makes software that gathers and maps out data center heating and cooling data to relate it to server use. On the IT front, server giants like IBM, HP, Cisco and Oracle are pledging to incorporate energy awareness into their management platforms.

As for data center cooling and power delivery systems, they are the target of companies like Vigilent (formerly Federspiel Controls) and SynapSense, as well as giants like Trane, General Electric, Schneider and ABB -- the latter an investor in and partner of Power Assure.

But just how deep do data centers want to get into their power use -- or more importantly, how much are they willing to spend on it, based upon expected returns on that investment? Beyond managing capacity bottlenecks, justifying the investment simply on power bill savings is a hard sell today, Wurtz noted.

Still, Power Assure’s automation capabilities can deliver useful secondary benefits beyond shaving power costs. Take customer NASA, which uses the company’s technology in part to ramp down server activity so that a power overdraw doesn’t trip a mission-critical data center offline.
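To give a flavor of what that kind of automation involves, here is a minimal, hypothetical sketch of a threshold-based power cap: when measured draw creeps toward the facility limit, low-priority load gets shed before the overdraw can trip the room. The function names and numbers are illustrative stand-ins, not Power Assure’s product or API.

```python
# Hypothetical sketch of threshold-based power capping, in the spirit of the
# NASA example. All names and numbers below are illustrative placeholders.

POWER_LIMIT_KW = 400.0  # hypothetical facility power limit
HEADROOM_KW = 40.0      # start shedding load this far below the limit


def read_facility_power_kw() -> float:
    # Placeholder: a real deployment would poll PDU or BMS metering here.
    return 385.0


def idle_low_priority_servers(kw_to_shed: float) -> None:
    # Placeholder: a real deployment would pause batch jobs or park idle servers.
    print(f"Shedding roughly {kw_to_shed:.1f} kW of low-priority load")


def control_step() -> None:
    draw = read_facility_power_kw()
    threshold = POWER_LIMIT_KW - HEADROOM_KW
    if draw > threshold:
        # Ramp down just enough work to get back under the headroom line.
        idle_low_priority_servers(draw - threshold)


if __name__ == "__main__":
    control_step()
```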

Features like that could play a role in the new data center A10 is planning to build, Kleppe noted. While the two companies haven’t yet announced how they might cooperate on that project, “We see the opportunity to much more effectively manage our use of power going forward, through detection of what’s in use and what’s not -- which can be fairly complex in our space -- and bringing things up and down only when they’re actually needed,” he said.
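A rough sketch of that “in use and what’s not” idea might look like the following: flag servers whose recent utilization sits below a floor as candidates to power down until a test actually needs them. The hostnames, utilization figures and threshold are invented for illustration and don’t reflect A10’s lab or Power Assure’s software.

```python
# Hypothetical idle-detection sketch: everything below is invented sample data.

IDLE_CPU_PCT = 5.0  # below this average utilization, treat a server as idle

# Rolling averages a monitoring agent might report (hypothetical values).
recent_cpu_pct = {
    "lab-node-01": 62.4,
    "lab-node-02": 1.8,
    "lab-node-03": 0.4,
}

power_down_candidates = [
    host for host, cpu in recent_cpu_pct.items() if cpu < IDLE_CPU_PCT
]
print("Candidates to power down until needed:", power_down_candidates)
```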

For startups like Power Assure, getting customers to move past analysis and planning and into ongoing power management will be critical. It’s hard to see any startup competing with the established giants in the field when it comes to helping customers plan out more efficient data centers. Providing ongoing, software-as-a-service-based control and optimization, on the other hand, could help push data centers across the board -- not just showcase projects from the likes of Facebook -- into a new realm of efficiency.