Greentech Media’s annual smart grid showcase, The Networked Grid 2013, kicks off today with a big focus on proving out the smart grid industry’s worth to investors, utilities and power users. In the parlance of the conference, smart grid-enabled utilities are moving from “infrastructure to insight” in managing and making the most of their smart grid investments.

Data analytics is key to this transformation. Utilities are already swimming in data flowing from the millions of smart meters, tens of thousands of grid sensors and control devices, and myriad other smart grid devices they’ve put in the field over the years. At the same time, they’re struggling to capture and understand all the data that can help them guide future smart grid investments, as well as to make the most of what they’ve already deployed.

A number of companies are offering big data solutions to these kinds of problems, from startups like AutoGrid, Power Analytics, Stem and GELI to grid giants like Alstom, Siemens and General Electric, to name a few. Utilities are just starting to test the capabilities of these new big data tools -- including using them to gauge just what the tools themselves are worth.

That’s how Giri Iyer, product line leader for General Electric’s Grid IQ Insight data analytics platform, described some of the projects that GE is undertaking with utilities around the country. In a recent interview, Iyer said that GE has been working with several unnamed utilities, offering its Grid IQ Insight platform first as a “low-risk, high-reward consulting service”: a 90-day rapid deployment that tests a utility’s smart grid plans against all the available data, to see if they make sense in the real world.

Iyer wouldn’t name the customers, though he did describe what a few of them were doing with GE -- and some of the unexpected insights gleaned from the process:

- One GE utility customer that has deployed smart meters throughout its service territory used Grid IQ Insight to detect the signature patterns of electric vehicles charging on the grid. That involved collecting and analyzing demographic data, AMI data, sensor data and grid models, all to inform the utility’s business planning on how to prepare for EV charging’s effect on grid operations.

But the analysis yielded some unexpected conclusions, Iyer said. First, it found that EVs weren’t going to be as popular with the utility’s customers as had been presumed. Second, it found that the utility’s one-hour, kilowatt-hour-resolution smart meter data would completely miss the sub-kilowatt car charging that’s likely to make up the majority of EV charging setups at people’s homes (the first code sketch after these examples illustrates the resolution gap).

In other words, it discovered a gap in the available data, Iyer said. GE told the utility that it would need fifteen-minute interval data at watt-hour resolution to figure out how plug-in vehicle charging was affecting the grid in anything close to real time. Whether or not the utility will invest in that is another matter, he noted -- but at least it knows the costs of doing so versus not doing so.

- A similar data gap turned up for another 90-day fast-deploy customer, one that wanted more insight into its vegetation management (i.e., tree-trimming) spending. GE’s Grid IQ Insight team built an algorithm to predict outage causes with over 80 percent confidence, based on weather data, satellite and aerial imagery, vegetation growth data and the like, he said (the second sketch below shows the general shape of such a classifier). But the business case analysis found that weather sensors in the utility’s territory were too few and far between to yield the richness of data required to predict the specific locations of outages from current weather patterns -- weather being the most frequent cause of outages.

- The third utility Iyer discussed is using Grid IQ Insight to predict solar “hot spots,” or areas where lots of solar panels are zeroing out customers’ power use or even feeding power back onto the grid. GE’s data-crunching for that customer yielded a next-day solar irradiance forecasting system that hit the mark with 90 percent confidence, he said, along with visualization tools for grid operators (a toy version of such a forecast follows as the third sketch below).

While the utility hasn’t yet decided what to do with those capabilities, “we think 90 percent certainty gets us close to the realm where a utility can say, 'I do or don’t want [to buy] that two megawatts over the next fifteen minutes,'” in terms of providing distributed solar generation data that can feed into utility grid and market management systems, he said.
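To make the meter-resolution point from the first example concrete, here is a minimal sketch -- not GE’s method, just an illustration -- assuming a hypothetical household load profile and a hypothetical 900-watt trickle charger. It shows how hourly reads rounded to whole kilowatt-hours can hide a sub-kilowatt charge that fifteen-minute, watt-hour-resolution interval data makes obvious:

```python
# Illustrative sketch only (not GE's algorithm): hourly, whole-kWh meter
# reads can hide a sub-kilowatt EV charge that fifteen-minute, watt-hour-
# resolution interval data reveals. All load values are hypothetical.

# Household baseline load in watts, one value per 15-minute interval (2 hours).
baseline_w = [450, 500, 480, 520, 510, 490, 470, 500]

# Add a hypothetical 900 W trickle charge during the second hour only.
with_ev_w = baseline_w[:4] + [w + 900 for w in baseline_w[4:]]

def hourly_kwh(watts_15min):
    """Roll 15-minute watt readings up into hourly reads, rounded to
    whole kWh -- mimicking a meter that reports at kWh resolution."""
    hours = [watts_15min[i:i + 4] for i in range(0, len(watts_15min), 4)]
    return [round(sum(w * 0.25 for w in hour) / 1000) for hour in hours]

def interval_wh(watts_15min):
    """Report each 15-minute interval in watt-hours, with no kWh rounding."""
    return [round(w * 0.25) for w in watts_15min]

print(hourly_kwh(baseline_w))  # [0, 0] -- nothing to see
print(hourly_kwh(with_ev_w))   # [0, 1] -- at best an ambiguous one-count blip
print(interval_wh(with_ev_w))  # the ~225 Wh step in hour two stands out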
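The second example, predicting outage causes from weather and vegetation data, can be sketched in the same hedged spirit. GE hasn’t published its model, so the following is only a stand-in for the general approach, assuming scikit-learn and entirely made-up features, data and labeling rules:

```python
# Sketch of the general approach, not GE's actual model: classify outages
# by likely cause from weather and vegetation features. The feature names,
# data and labeling rule below are all hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-outage features: [wind_gust_mph, rainfall_in,
# days_since_last_trim, canopy_density]. In practice these would come from
# weather stations, satellite/aerial imagery and trim-cycle records.
X = rng.random((500, 4)) * np.array([80.0, 4.0, 1500.0, 1.0])

# Hypothetical labels: strong wind through dense canopy tends to mean a
# vegetation-caused outage; other outages are assigned causes at random.
causes = np.array(["vegetation", "equipment", "lightning", "animal"])
y = causes[np.where(X[:, 0] * X[:, 3] > 25, 0, rng.integers(1, 4, 500))]

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Score a new outage; a utility might act only on predictions whose top
# class probability clears a threshold like the 80 percent cited above.
for cause, p in zip(model.classes_, model.predict_proba([[65, 1.2, 900, 0.8]])[0]):
    print(f"{cause}: {p:.0%}")
```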
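And for the third example, here is a toy next-day irradiance forecast, just to show the shape of the inputs and outputs that would feed grid operators’ visualization tools. The real system GE built is surely far more sophisticated; the history, clear-sky curve and blend weight below are all invented:

```python
# Toy next-day irradiance forecast (far simpler than the system described):
# blend yesterday's hourly profile with a trailing seven-day hourly mean.
# The history, clear-sky curve and blend weight are all hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical history: 30 days of hourly global irradiance (W/m^2),
# modeled as a clear-sky curve scaled by a random daily cloud factor.
clear_sky = 900 * np.clip(np.sin(np.linspace(0, np.pi, 24)), 0, None)
history = clear_sky * rng.uniform(0.4, 1.0, (30, 1))

def forecast_next_day(history, w_persist=0.6):
    """Weighted blend of yesterday's profile and the 7-day hourly mean."""
    return w_persist * history[-1] + (1 - w_persist) * history[-7:].mean(axis=0)

tomorrow = forecast_next_day(history)
print(tomorrow.round())  # hourly W/m^2 estimates for tomorrow
```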

All in all, findings like these are valuable for utilities that are under increasing pressure to prove to regulators that their smart grid investment plans will deliver the benefits they’ve promised. Smart grid integrators such as IBM, Accenture, Capgemini and Logica have been doing this kind of work for big utility clients for years, of course. But there are plenty of smaller utilities that can’t afford to hire top-tier integration and consulting talent to help plan their smart grid budgets and deployment schedules.

At the same time, U.S. utilities, which have been buoyed by billions of dollars in smart grid stimulus grants over the past few years, can no longer turn to federal matching funds to justify their favorite smart grid projects. Instead, they’re being asked to justify new projects with more stringent cost-benefit analyses, as well as to prove they’re leveraging the infrastructure they’ve already invested in to maximum effect.

All of these factors seem to justify the enthusiasm in the smart grid industry for data analytics tools that can help solve these problems -- as long as they can do so at the right (i.e., low) cost, and deliver results that can be proven out in real-world experience. Keep an eye on this space for more details to come -- including, perhaps, a report from GE on which of its unnamed Grid IQ Insight customers choose to move from pre-deployment analysis to full-scale deployment of these tools in their ongoing operations.
