We’ve seen a lot of hype -- as well as some professional hype management -- on the matter of cloud computing for the smart grid over the past few years.

Sure, lots of today’s smart grid systems run on an IT backbone of servers that are owned and operated by vendors, not utilities. That’s kind of like a private cloud service, if by cloud you mean “someone else’s servers.”

But that’s a restrictive conception of cloud computing, which is really more about turning over ever-more complex computational and analytical tasks to tons and tons of servers all over the place. Utilities are understandably nervous about relying on the results of that kind of process to manage real-time, mission-critical functions like customer billing, AMI, demand response, or grid operations.

That doesn’t mean that cloud computing isn’t being used by utilities, however. One big area of interest is in the realm of data analytics -- managing and merging the flood of smart grid data with other forms of data to deliver information that can help a utility save money or improve operations, from grid controls to customer relations. And some hardcore smart grid projects are complex enough to require the kind of computational power that only the cloud can provide.

That’s the news from a Thursday panel at The Networked Grid 2013 conference dedicated to big data, analytics and cloud computing’s impact on the smart grid. So far, that impact has been relatively small, according to GTM Research utility surveys. But as Thursday’s panelists pointed out, it’s growing quickly, driven both by the economic advantages the cloud provides and by the growing trust that proof-of-concept deployments are building.

“We think there are significant opportunities for cloud-based analytics” in the smart grid, Brad Williams, Oracle’s vice president of industry strategy for its utility business, said. Oracle did a survey last year that revealed that nearly half of utilities that have deployed smart meters haven’t yet installed a meter data management (MDM) system to manage them, and that only a quarter or so had gone on to deploy functions like outage management, power quality measurement or enterprise business operations integration.

“One question we asked was what are the biggest impediments to leveraging the data,” he said. “The number-one response was, we don’t have the people who know how to do big data analytics, as well as know the business. […] You could hire data scientists, but they don’t understand the utility business context.”

“That told us there was an opportunity for services to come in and fill that role,” he said. Building on its big market share in meter data management and back-office utility software, Oracle has begun to layer on new features enabled by its cloud-based analytics, such as transformer health monitoring, outage detection and restoration, and data quality improvement. (We’ve seen similar analytics features being rolled out by utility back-office competitor SAP; by system integrators like Infosys, IBM, Accenture, Capgemini and Logica; and by big grid companies like General Electric with its Grid IQ Insight analytics platform, to name a few.)

Tasks like these, Williams said, “require access to multiple sources of data, where utilities don't tend to have their own organizational platform to do these types of analytics. Doing something in the cloud makes a lot more sense.” Oracle bought utility data analytics startup DataRaker last year to boost its offerings in this field.
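To make the idea concrete, here’s a minimal sketch of the kind of meter-to-transformer analytic described above: roll smart meter interval loads up to their service transformers and flag sustained overloads. The data layout, field names and 120 percent threshold are invented for illustration and don’t represent Oracle’s actual implementation.

```python
# Minimal sketch of a transformer-health analytic: aggregate smart meter
# interval loads up to their transformer and flag overloads. All data
# layouts, field names, and the 1.2x threshold are illustrative assumptions.
from collections import defaultdict

# Hypothetical inputs: a meter-to-transformer mapping, nameplate ratings,
# and interval readings as (meter_id, timestamp, kW).
meter_to_xfmr = {"m1": "t1", "m2": "t1", "m3": "t2"}
xfmr_rating_kw = {"t1": 25.0, "t2": 50.0}
readings = [
    ("m1", "2013-04-04T14:00", 14.2),
    ("m2", "2013-04-04T14:00", 16.9),
    ("m3", "2013-04-04T14:00", 21.0),
]

def overloaded_transformers(readings, threshold=1.2):
    """Return transformers whose coincident load exceeds threshold * rating."""
    load = defaultdict(float)
    for meter_id, _ts, kw in readings:
        load[meter_to_xfmr[meter_id]] += kw
    return {x: kw for x, kw in load.items()
            if kw > threshold * xfmr_rating_kw[x]}

print(overloaded_transformers(readings))  # flags t1: ~31.1 kW vs. a 30 kW limit
```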

Utilities aren’t the fastest movers when it comes to new technologies, and they’ve also traditionally been locked into proprietary, single-vendor solutions for their grid needs. One big opportunity for cloud-based data analytics is in integrating all these disparate legacy systems with the new smart grid technology being deployed out there, said Andrew Tang, senior vice president of business development and strategy for smart grid big-data startup AutoGrid Systems.

“Smart meters are just one value component,” he said. Combining that with data streams from throughout the utility, as AutoGrid is doing for such partners as Austin Energy, Palo Alto, Calif.’s municipal utility and Silver Spring Networks, is “where we can really unlock the analytics value of the grid.”

Money is another critical data point for utilities to stir into the big data mix, of course. AutoGrid has found that utilities are very interested in using data to back up such measures as return on investment for smart grid projects. We’re not just talking about calculating ROI for one project, of course, but rather assessing its worth against a number of competing alternatives for how to spend that money -- a job that could get really big, really fast.
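To see why the job balloons, here’s a toy sketch of the basic arithmetic behind that kind of comparison: rank competing projects by net present value. The project names, cash flows and discount rate are all invented for illustration.

```python
# Toy sketch of comparing competing smart grid investments by NPV.
# Every number here is made up for illustration only.

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the upfront cost at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

candidates = {
    "AMI rollout":       [-10_000_000] + [2_500_000] * 8,  # cost, then 8 years of savings
    "Volt/VAR controls": [-4_000_000] + [900_000] * 8,
    "DR program":        [-1_500_000] + [400_000] * 8,
}

# Rank the alternatives at a 7% discount rate -- one row per way to spend the money.
for name in sorted(candidates, key=lambda k: npv(0.07, candidates[k]), reverse=True):
    print(f"{name:18s} NPV = ${npv(0.07, candidates[name]):,.0f}")
```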

AutoGrid says its platform can perform super-complex calculations like these in a matter of minutes, rather than the days they can take using utilities’ current data management methods, via an analytics engine that underpins a set of applications tailored to utility needs like demand response program management or customer engagement. For utilities that want this kind of complex, real-time data integration and analysis, “I don’t think utilities can ignore the cost economics” of the cloud, he said.
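A rough sketch of why parallelism changes the math: if each scenario can be evaluated independently, the work fans out across as many workers as you can rent. This stands in for cloud-scale parallelism generally and implies nothing about AutoGrid’s actual engine.

```python
# Sketch of why the cloud compresses days into minutes for this class of
# job: independent scenarios fan out across workers instead of running
# serially. A stand-in for cloud-scale parallelism, not AutoGrid's engine.
from concurrent.futures import ProcessPoolExecutor

def evaluate_scenario(params):
    """Stand-in for one expensive simulation run (e.g., one dispatch plan)."""
    rate, years = params
    return sum(1.0 / (1 + rate) ** t for t in range(1, years + 1))

# 1,000 hypothetical scenarios sweeping a single parameter.
scenarios = [(0.05 + 0.001 * i, 20) for i in range(1000)]

if __name__ == "__main__":
    # Serially this is 1,000 runs back to back; with enough workers the
    # wall-clock time approaches the cost of a single run.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(evaluate_scenario, scenarios))
    best = max(range(len(results)), key=results.__getitem__)
    print(f"best of {len(results)} scenarios: #{best}")
```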

There are some key challenges to broader adoption of cloud computing by utilities, of course. One that Williams mentioned is the investor-owned utility (IOU) preference for capital expenditures (like buying their own servers), on which they’re allowed to earn a fixed rate of return, over operating expenses (like buying a cloud-hosted service), which are always under budget pressure, he said.
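The incentive gap is easy to see in rough numbers. In this sketch every figure is invented; the point is only that the option that costs less can still look worse to an IOU that earns a return on capital it owns.

```python
# Rough illustration of the capex-vs-opex incentive Williams describes:
# a regulated utility earns a return on capital it rate-bases, but a cloud
# subscription is a pass-through operating expense. All numbers invented.
server_capex = 1_000_000       # buy and rate-base our own servers
allowed_return = 0.10          # regulator-approved rate of return on capital
depreciation_years = 5

earned_on_capex = server_capex * allowed_return        # $100,000/yr for shareholders
annual_capex_cost = server_capex / depreciation_years  # $200,000/yr straight-line

cloud_opex = 150_000           # annual subscription; earns no return

print(f"owned servers: ${annual_capex_cost:,.0f}/yr, earning ${earned_on_capex:,.0f}/yr")
print(f"cloud service: ${cloud_opex:,.0f}/yr, earning nothing for shareholders")
```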

Still, “I think that’s beginning to change, particularly when you start to look at the value propositions,” he said. Municipal and cooperative utilities, which have to deal with voters or members rather than passive ratepayers and state utility regulators in getting their capital budgets passed, are another potential growth market for cloud services, he noted.

Kevin Meagher, CTO of Power Analytics, described a whole new set of cloud-enabled functions on the cutting edge of real-time power management. Power Analytics, formerly known as EDSA, has its roots in building microgrid systems for the U.S. Navy, and is working on a number of microgrid projects around the world.

But it’s also worked on devising mathematical models for the Department of Energy’s SunShot solar program, providing operations and planning management for data centers, and tying the home energy control platform from San Antonio, Texas-based startup Consert (bought by Toshiba-owned smart meter giant Landis+Gyr last month) to the cloud platform used to manage Texas’s power grid markets -- all work that involves cloud computing.

Wholesale migration of utility IT responsibilities to the cloud is a much more difficult challenge, however. Beyond any technical and business challenges, there’s the regulatory environment to consider, Meagher noted. How, for example, does a utility interact with a campus microgrid that’s tied into its grid -- as a customer, as a generator, as a demand response resource, or as a grid stability problem?

“In the end, [a microgrid] does have some attractive components” for utilities, he said. “You can call it a virtual power plant, and hide all the complexity behind it […], but that requires a lot of knowledge and capability on the user side that typically doesn’t exist.”

That makes for some interesting cloud opportunities as well. Concepts for integrating distributed energy resources into the grid, such as “transactive energy” markets in which smart devices communicate at high speed to share energy and price data and adjust their usage patterns accordingly, would certainly require some heavy computational lifting to get off the ground. Those seem like just the kind of tasks the cloud is made to handle. That, of course, raises the question: whose cloud will be handling them?
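For a flavor of the pattern involved, here’s a toy sketch of one device in such a market: it receives a broadcast price and curtails flexible load above a threshold. The prices, threshold and device model are invented for illustration, not drawn from any actual transactive energy standard.

```python
# Toy sketch of the "transactive energy" idea: a smart device receives a
# price signal and sheds flexible load accordingly. Prices, thresholds,
# and the device model are all invented for illustration.
def respond_to_price(price_per_kwh, baseline_kw,
                     trigger_price=0.20, comfort_floor_kw=0.5):
    """Curtail flexible load when the broadcast price crosses a threshold."""
    if price_per_kwh >= trigger_price:
        return max(comfort_floor_kw, baseline_kw * 0.5)  # shed half the load
    return baseline_kw

hourly_prices = [0.08, 0.11, 0.24, 0.31, 0.18]  # $/kWh, hypothetical
for hour, price in enumerate(hourly_prices):
    print(f"hour {hour}: ${price:.2f}/kWh -> {respond_to_price(price, 3.0):.1f} kW")
```

Multiply that exchange across millions of devices settling against each other in near-real time, and the scale of the computing problem -- and the question of whose servers run it -- becomes clear.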

The Soft Grid: Big Data, Analytics, Cloud Computing and the Grid (recorded at The Networked Grid 2013)