One of the panels I will be moderating at next week's Networked Grid will focus on smart grid data and analytics. Titled "Beyond Meter-to-Cash: Improving Business Processes with AMI Data and Analytics," the panel brings together senior executives from several companies actively involved in meter data management and analysis: Aclara, eMeter, Energent and Elster. Turning raw smart meter data into actionable business intelligence is an area where the soft grid is very immature -- a key finding of a report on meter data management that we issued last year. To put things in perspective, data analysis has been evolving rapidly across a wide swath of industries for roughly 20 years, and we expect the utility industry to follow a similar evolutionary path -- albeit within a condensed timeframe.

The advent of relational database technologies in the mid-1980s was quickly followed by a succession of data management approaches, each with its own buzz phrase.

First, pioneers in 'data warehousing' established a complex portfolio of data management technologies to extract, transform and load (ETL) data. Unfortunately, many early large-scale data warehouse projects suffered from over-specification of the nebulous. In other words, they were exercises in boiling oceans of data. The aquatic metaphor is a handy one for understanding the current state of the smart grid, as indicated by the popular and somewhat clichéd use of the term 'data tsunami.' An apt conclusion is that the utility industry -- and the vendors that innovate and supply it -- is in the early stages of identifying the types of data available from the smart grid and their potential uses: a circa 1990-1992 challenge for most other industries.
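For readers who have not lived through a warehouse project, a minimal sketch of the ETL pattern applied to hypothetical interval-meter readings appears below. The file name, column names and target table are illustrative assumptions, not references to any vendor's product.

```python
# Minimal ETL sketch: extract raw 15-minute interval readings from a CSV export,
# transform them into hourly totals per meter, and load them into a reporting
# table. The schema and file name are invented for illustration.
import csv
import sqlite3
from collections import defaultdict
from datetime import datetime

def extract(path):
    """Extract: read raw interval readings from a CSV export."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["meter_id"], datetime.fromisoformat(row["read_at"]), float(row["kwh"])

def transform(readings):
    """Transform: roll 15-minute intervals up to hourly kWh per meter."""
    hourly = defaultdict(float)
    for meter_id, read_at, kwh in readings:
        hour = read_at.replace(minute=0, second=0, microsecond=0)
        hourly[(meter_id, hour)] += kwh
    return hourly

def load(hourly, db_path="warehouse.db"):
    """Load: write the aggregated rows into a simple reporting table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS hourly_usage (meter_id TEXT, hour TEXT, kwh REAL)")
    con.executemany(
        "INSERT INTO hourly_usage VALUES (?, ?, ?)",
        [(m, h.isoformat(), kwh) for (m, h), kwh in hourly.items()],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("meter_reads.csv")))
```

The hard part in practice is rarely the plumbing itself; it is deciding, up front, which of the many possible transformations and aggregations the business will actually use -- which is exactly where the early ocean-boiling projects went wrong.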

Formal data warehousing was quickly superseded by lighter-weight, faster-moving "data marts."  However, vendors hip to this "let your database administrator hair down" approach quickly learned that industry doesn't want 'marts' (unless it is interested in selling salty snack foods at a gas station). The end game is actually "business intelligence" -- in other words, turning data into usable information by adding context. What ensued was wave after wave of user-friendly tools for creating ad hoc queries, reports and so forth. Later, in somewhat recursive fashion, structured programming re-entered the game, this time to create dynamic, interactive data routines that match expressed areas of user interest with related data objects. This capability, known as 'collaborative filtering,' was initially popularized by Amazon with its product recommendations and has since been refined and propagated across the internet for use in advertising, shopping, social media and so forth. In many ways, collaborative filtering is the "killer data management app" of the internet.
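To make the idea concrete, here is a minimal sketch of item-item collaborative filtering: given a table of which users have interacted with which objects, it recommends objects similar to those a user has already touched. The toy data and function names are assumptions for illustration only, not a description of Amazon's system or of any panelist's product.

```python
# Minimal item-item collaborative filtering sketch.
# Ratings are a dict of {user: {item: rating}}; the data is invented for illustration.
from math import sqrt

ratings = {
    "alice": {"meter_dashboard": 5, "outage_report": 3, "billing_export": 4},
    "bob":   {"meter_dashboard": 4, "outage_report": 5},
    "carol": {"billing_export": 5, "outage_report": 2, "load_forecast": 4},
}

def item_vectors(ratings):
    """Pivot user->item ratings into item->user vectors."""
    items = {}
    for user, prefs in ratings.items():
        for item, score in prefs.items():
            items.setdefault(item, {})[user] = score
    return items

def cosine(a, b):
    """Cosine similarity between two sparse item vectors."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[u] * b[u] for u in shared)
    return dot / (sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values())))

def recommend(user, ratings, top_n=2):
    """Score unseen items by their similarity to the items the user already rated."""
    items = item_vectors(ratings)
    seen = ratings[user]
    scores = {}
    for candidate in items:
        if candidate in seen:
            continue
        scores[candidate] = sum(cosine(items[candidate], items[liked]) * score
                                for liked, score in seen.items())
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(recommend("bob", ratings))  # suggests objects bob has not yet used
```

Production systems use far larger, sparser matrices and more sophisticated models, but the core move is the same one described above: matching expressed areas of user interest with related data objects.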

Next week, I will be asking the panelists for their ideas on what constitutes the killer smart grid data management app.