The road to riches in the smart grid will be paved with pennies.
So says Andres Carvallo, a man who has worked on the public (Austin Energy) and private (Grid Net and Proximetry Networks) sides of the smart grid. That means he knows how stodgy utilities can be with their money, and how demanding they can be with their suppliers.
Carvallo was on stage Tuesday at The Soft Grid 2012 conference in San Francisco to talk about cloud computing and the utility “big data” challenge. GTM Research projects that U.S. utilities will spend $8.2 billion on enterprise IT from 2011 to 2015 as they struggle to manage the flood of data that’s coming from smart grid deployments and integrate it into legacy IT systems.
But just because utilities desperately need help in achieving their enterprise IT goals doesn’t mean they’re going to spend willy-nilly on them, Carvallo said. With stimulus money fading fast and the first generation of smart meters under attack from various quarters, utilities are refocusing on smart grid technology they can deploy at a cost they can justify to investors, regulators and the public alike, he said.
That means that utilities will adopt cloud computing platforms and tools only to the extent that they offer specific solutions to specific problems, and not as a wholesale change to the way they do business. “This industry will definitely push vendors to world-class innovation, one penny at a time,” he said.
Of course, it’s important to remember that lots of the smart grid already runs on the cloud, he noted. Silver Spring Networks uses a private cloud to manage its millions of smart meters for utility customers, and Austin Energy has hired Microsoft and Oracle to do its heavy IT lifting, he said.
But utilities also have their own reasons to keep as much of their IT work in-house as possible. Investor-owned utilities in particular are under pressure to spend on IT capital expenses, which they can recover in rate cases, rather than on operational expenses like cloud services, which offer no such guaranteed payback, said Linda Jackman, group vice president of product strategy and management for Oracle’s utility sector.
Beyond that, utilities want to develop their own IT talent and tools to manage their very specialized IT challenges. Those range in complexity from crunching terabytes of smart meter data for monthly billing and move-in/move-out management, to real-time power flow analysis for energy management and distribution grid management software.
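The billing side of that workload is conceptually simple even at terabyte scale: roll up interval meter reads into per-meter monthly totals. Here is a minimal sketch of that aggregation; the record layout and meter IDs are illustrative assumptions, not any vendor’s actual schema:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical 15-minute interval reads: (meter_id, interval_end, kWh).
# Real deployments ingest millions of these per day; the rollup logic
# is the same.
reads = [
    ("meter-001", "2012-08-01T00:15", 0.42),
    ("meter-001", "2012-08-01T00:30", 0.38),
    ("meter-002", "2012-08-01T00:15", 1.10),
]

# Aggregate kWh per (meter, month) for monthly billing.
monthly_kwh = defaultdict(float)
for meter_id, interval_end, kwh in reads:
    month = datetime.fromisoformat(interval_end).strftime("%Y-%m")
    monthly_kwh[(meter_id, month)] += kwh

print(dict(monthly_kwh))
```

Move-in/move-out management adds proration across ownership changes, but the core pattern, a keyed aggregation over time-stamped interval data, is the same one banking and telecom IT have handled for decades, which is part of Carvallo’s point below.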
Only some of these problems can be handled by today’s technology, Carvallo noted. Take real-time power control systems like SCADA networks, which have to “talk” at the speed of the grid itself: 60 hertz, or roughly one cycle every 16.7 milliseconds. The only other industries with such stringent latency requirements are Wall Street and aerospace, he said.
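The arithmetic behind that latency budget is worth spelling out, since it is what rules out round trips to a distant data center for per-cycle control decisions:

```python
# A 60 Hz grid completes one AC cycle every 1/60 of a second, so a
# control system that must react within a single cycle has a budget
# of roughly 16.7 milliseconds, before accounting for network hops.
GRID_FREQUENCY_HZ = 60
cycle_ms = 1000 / GRID_FREQUENCY_HZ
print(f"One grid cycle: {cycle_ms:.1f} ms")
```

A single cross-country network round trip can consume most of that budget on its own, which is why the next point about distributed intelligence follows.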
That means that distributed intelligence will be key to getting more complex computing tasks done on the grid itself, according to Kevin Meagher, CTO of microgrid analytics firm Power Analytics (formerly EDSA). Most power-control intelligence will have to live at the point of use rather than in the cloud, for example.
Still, the cloud can host plenty of analytics tools that can offer faster time to deployment and much greater scalability, he noted. Beyond that, there are plenty of basic tasks like customer management and billing that are crying out for modernization, Carvallo said. Likewise, utilities can take tested technology from industries like banking and telecommunications to help with transactional data management, data warehousing and deep analytics tasks.
Utilities may want to spend and hire their way to competence in such big-data fields as NoSQL object-oriented data management, service-oriented architecture middleware, enterprise information management and cybersecurity. Or they can farm out the work to trusted partners, who can capture the economies of scale that come from running a single IT platform for multiple customers, rather than each utility running its own.
“What doesn’t change about cloud or software as a service is the ability to have a common application,” said Ed Abbo, CTO of C3. The stealthy startup founded by Siebel Systems billionaire Tom Siebel is using its private cloud to run software that analyzes, manages and tracks energy efficiency across portfolios of buildings, with a customer list that includes Pacific Gas & Electric, GE Energy, SAIC, Hewlett Packard, Constellation New Energy and Masdar City.
In May, C3 bought home energy efficiency IT startup Efficiency 2.0, which gives it reach into several million homes. But it can host the big commercial-industrial stuff and the home energy contest stuff on the same platform, saving utilities integration and duplication costs, he said.
Meanwhile, the very definition of a utility is shifting in ways that make terms like “outsourced” versus “in-sourced” hard to define, Carvallo said. For example, lots of power authorities or municipal utilities “rent” their distribution and transmission assets from other parties, in a form of outsourced engineering, he said.
Evolving models of generating and delivering power, like campus microgrids or net-zero energy homes, will continue to offer room for innovation on the IT side. C3’s Abbo noted that GE Energy is looking at how it can use C3’s building-side energy analysis platform to inform the way it plans and designs systems that work on the grid itself, for example.
In the meantime, GE has launched a smart-grid-as-a-service business that uses GE’s data centers to manage lots of smaller utilities’ smart meter, demand response and distribution grid management systems for them. Lots of other smart grid vendors are working on similar service-style business models.
At some point, whether these systems run on the utility’s own servers or someone else’s becomes less important than whether they do what they’re supposed to do, and at the right price.