Utilities are in a “death spiral,” trapped in a “vicious cycle,” and are exploring “profound transitions” to fight for their continued survival. These are the kinds of headlines that have dominated the discussion of how utilities, facing sagging demand for energy amidst the rise of green-powered, energy-independent customers, are struggling to adapt to a new business paradigm.
Amidst all this talk of transformation, however, it’s important to remember that utilities -- at least, the U.S. electricity distribution utilities that have been largely left out of the deregulation that’s swept over the energy industry over the past few decades -- are mostly stuck doing things the same way they’ve always done them.
It’s called cost-of-service regulation, and it leaves little room for innovation or risk-taking. But innovation, and the risks and rewards that come with it, are just what is called for in this new energy era, argues a report released last week by GE Digital Energy and The Analysis Group.
“The overarching approach in the past is to focus on the least-cost solution,” David Malkin, a co-author of the report and GE Digital Energy’s deputy director of government affairs and policy, said in an interview this week. “We ought to flip that equation on its head and ask, 'What is the approach that delivers the greatest long-term value to customers?'”
Under the regulatory models that have evolved over the past century of mass electrification, utilities get to charge just enough to build and maintain the infrastructure to deliver safe and reliable power to all comers, and receive a specified rate of return on those investments via what they charge their customers.
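The cost-of-service arithmetic described above is simple in outline: recover operating costs, plus an authorized return on the capital invested. A minimal sketch of that revenue-requirement formula, with entirely hypothetical figures (none drawn from the report):

```python
# Illustrative sketch of the traditional cost-of-service "revenue requirement."
# All dollar figures and the 9% authorized return are hypothetical examples.

def revenue_requirement(rate_base, allowed_return, operating_costs,
                        depreciation, taxes):
    """Revenue a utility is allowed to collect: recovery of its costs,
    plus an authorized rate of return on capital invested (the rate base)."""
    return operating_costs + depreciation + taxes + rate_base * allowed_return

# Hypothetical utility: $2B rate base, 9% authorized return, $500M operating
# costs, $150M depreciation, $100M taxes.
rr = revenue_requirement(2_000_000_000, 0.09, 500_000_000,
                         150_000_000, 100_000_000)
print(f"Allowed annual revenue: ${rr:,.0f}")  # Allowed annual revenue: $930,000,000
```

Note how the structure rewards capital spending (the rate base earns a return) while operating-cost savings simply reduce what the utility may collect -- the incentive problem the report takes aim at.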
Any new spending plans that deviate from those past cost and profit figures can be challenged by multiple parties, from customer advocates fighting for low energy rates, to third-party power producers worried that distribution utility gains will lead directly to zero-sum losses on their side of the equation.
But that leaves utilities in a bind when it comes to justifying investments in smart grid technologies, which by definition involve doing things that haven’t been done before. Nor does it account well for investments to ensure reliability amid an unprecedented influx of intermittent renewable energy onto the grid, to mitigate the risks of cyberattacks, or to strengthen the grid’s resiliency against extreme weather events -- all new problems.
Greentech Media has been searching for answers to complicated new challenges like these, as laid out by our recent Grid Edge report and research focus. Our recent Q&A with Grid Edge Executive Council member Paul De Martini lays out a succinct, yet comprehensive, breakdown of the challenges involved.
The question, as always, is how to come up with new regulatory models that free up distribution utilities to take risks and reap the rewards that can come with them -- while also ensuring that their customers, who don’t get to choose whom they buy their power from, don’t end up paying an unfair share for any missteps and mistakes.
The Fundamentals of Results-Based Regulation
Just what this new “Results-Based Regulation” model, as the report’s title terms it, might look like is a broader question. But we are seeing some promising ideas emerge from regulators in states across the country, as well as from overseas examples like the U.K.’s RIIO (“Revenue set to deliver strong Incentives, Innovation and Outputs”) model, Malkin said.
“I think there are a couple of defining attributes,” he said. “One is a mechanism whereby utility revenues are set based on an assessment of future costs, rather than current costs. There are a couple of ways to do that. Probably the easiest, or the most tried and true way, is to set a revenue plan, that’s spread over a couple of years, and based on a regulator’s review” -- one that allows adjustments to rates based on changing business needs to occur on an incremental basis, rather than in the multi-year rate cases that now govern most utility spending plans.
“The second piece, which flows directly from that first piece, is a strong incentive for utilities to hold down their costs, and to pursue efficiency gains during that multi-year investment cycle,” Malkin continued. “The most immediate way to do that is to allow the utility to capture some portion of the savings that they might realize if their actual costs fall below their projected costs in their business plan -- and conversely, if their actual costs exceed their projected costs, then their shareholders would be on the hook to foot at least a portion of those overruns.”
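In code, the symmetric savings/overrun split Malkin describes might look like the following sketch. The 50/50 sharing factor is a hypothetical assumption for illustration, not a figure from the report:

```python
# Sketch of a shared-savings / shared-overrun settlement mechanism.
# The 50/50 sharing_factor is a hypothetical assumption.

def settle_costs(projected, actual, sharing_factor=0.5):
    """Return (utility_share, customer_share) of the cost variance.
    If actual costs come in below projection, the utility keeps
    sharing_factor of the savings and customers get the rest; if costs
    overrun, shareholders absorb sharing_factor of the overrun and
    customers cover the remainder (negative values = costs borne)."""
    delta = projected - actual  # positive = savings, negative = overrun
    utility_share = delta * sharing_factor
    customer_share = delta - utility_share
    return utility_share, customer_share

# Costs come in $40M under a $500M plan: utility keeps $20M, customers get $20M.
print(settle_costs(500e6, 460e6))
# Costs overrun by $30M: shareholders absorb $15M, customers cover $15M.
print(settle_costs(500e6, 530e6))
```

The point of the symmetry is that the utility's payoff now moves with actual efficiency in both directions, rather than resetting at the next rate case.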
That may sound like simple common sense -- but today’s regulatory frameworks actually discourage utilities from saving more money than expected, he said. “In general, when you have a rate case every couple of years, and a utility reduces its operating costs, that forms the cost basis for when they go in for the next rate review, so they have every incentive not to do that,” he said.
In that sense, traditional utility budgeting suffers from the classic dilemma faced by government departments that try their best to spend all the money remaining to them as they approach the end of their budget cycles -- not because it’s the most efficient thing to do, but to avoid being told in their next budget request that they didn’t need that money, and having future funds cut back.
So, how can new regulatory models free up utilities to spend and save to optimum efficiency, without giving them free rein to charge customers more money than they actually need? That requires some sort of “shared savings mechanism, whereby if the utility realizes savings from having costs fall short of projected costs, customers should be able to capture some percentage of those savings,” Malkin said. Finally, there needs to be a “very clear set of performance metrics that are not focused on inputs, but on outputs.”
Today’s Real-World, Case-by-Case Models
So, where do utility regulators and industry advocates look for working examples of models like these? Let’s start with smart grid deployments, which have been an early testing ground for alternative cost-recovery and performance-based regulation in the United States.
Paul Alvarez, president of consulting firm Wired Group, named a few examples of states where regulators have allowed utilities to create rate structures for specific smart grid projects that fall outside the typical multi-year ratemaking process. “It’s kind of like pre-approval -- you go spend the money, and we’ll let you increase rates as you go about getting cost recovery,” he said.
Smart grid investments do show a significant return on investment for utilities and customers alike, according to the Department of Energy's May report on stimulus-funded smart grid projects (PDF), and other reports, such as one the Wired Group released last month, that delve into specific customer benefits. Just how to impose regulatory discipline on utilities to make sure that customers get their fair share of those returns is another matter, however, Alvarez said.
“There are more aggressive things that states have done,” he said. For example, Oklahoma and Ohio regulators have allowed utilities to add riders to customers’ bills to pay for specific smart grid projects. In exchange, however, those regulators have required utilities to subtract the expected cost reductions to be gained from the projects from those increases, as an incentive to actually achieve those savings, he said. (Paul Centolella, co-author of this week’s report, is familiar with this regulatory model, having served on the Public Utilities Commission of Ohio at the time that it instituted the model for Duke Energy and AEP Ohio.)
Other states, including Maryland, California and Illinois, have imposed performance metrics on utilities, requiring them to measure and verify the cost reductions and other benefits they’re claiming for those projects. But “in most states, those are information only -- they’re not tied to any compensation,” he said. “You don’t get any extra if you do well; you don’t get any less if you do poorly.”
The exception in this case is Illinois, where state lawmakers passed a law requiring utilities Commonwealth Edison and Ameren to meet certain benchmarks for their smart grid deployments, or face a reduction in their return on investment at a later date. The problem here, Alvarez said, is that the potential penalties at the end of that proving-out process don’t necessarily push the utilities to strive for the best outcomes, but only to do whatever will keep them from getting dinged during the assessment period.
“What ComEd and Ameren would argue -- and I think it’s fair -- is they have penalties for poor performance, but no upside for good performance,” he said. “The reason these negative adjustments for poor performance are so small is that there’s no opportunity for positive upside.”
Still, none of these project-by-project regulatory adjustments gets at the core problem of how to align an entire utility’s operations with the incentives and risks of free-market competition, Malkin noted.
“It’s a step in the right direction,” he said. “But it doesn’t address the range of challenges that utilities face, and I don’t think it goes all the way toward promoting the right outcomes.”
Creating a Model for Off-Bill Costs of Future Grid Challenges
There’s another looming challenge facing this reconfiguration of risks and rewards for utility regulators, and that’s how to measure the benefits that come from utility investments that don’t show up directly on customer bills. Most critically, there’s a huge amount of uncertainty over how to quantify improvements in grid reliability and efficiency -- the very area where utilities are facing unprecedented challenges in renewable energy integration, extreme weather resiliency and cybersecurity.
“It’s very tricky,” Malkin said, “but as an industry, we’re getting more sophisticated at figuring out what the value of reliability is. At the same time, regulators are saying, now that we know the value, how do we calculate it in the context of regulation?”
Traditionally, utilities have faced the stick of penalties for missing service quality measures like SAIDI and SAIFI, which track the duration and frequency of outages, respectively. Or, they’ve had to absorb the costs of improving system efficiency metrics such as power factor -- the ratio of real power delivered to loads to the apparent power flowing across distribution circuits, and thus a gauge of how efficiently electricity is being delivered.
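For reference, SAIDI and SAIFI have standard definitions (set out in IEEE Std 1366): total customer-minutes of interruption, and total customer interruptions, each divided by the number of customers served. A minimal sketch using hypothetical outage data:

```python
# SAIDI and SAIFI per their standard IEEE 1366 definitions, computed from
# a hypothetical year of outage records: (customers_affected, minutes).

def saidi_saifi(outages, customers_served):
    """SAIDI: average minutes of interruption per customer served.
       SAIFI: average number of interruptions per customer served."""
    customer_minutes = sum(n * minutes for n, minutes in outages)
    customer_interruptions = sum(n for n, _ in outages)
    return (customer_minutes / customers_served,
            customer_interruptions / customers_served)

# Hypothetical 100,000-customer utility with three outages in a year.
outages = [(5_000, 120), (20_000, 45), (1_000, 300)]
saidi, saifi = saidi_saifi(outages, 100_000)
print(f"SAIDI: {saidi:.1f} minutes/customer, SAIFI: {saifi:.2f}")
# SAIDI: 18.0 minutes/customer, SAIFI: 0.26
```

Penalty regimes today key off thresholds in indices like these; the report's argument is that improvements beyond those thresholds currently earn the utility nothing.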
But today’s regulatory models don’t really provide for a return on investments intended to improve these metrics, beyond keeping them from exceeding levels that call for penalties. Nor do today’s models allow returns on investments seeking to manage emerging problems, like how many interconnections utilities can be expected to allow for customer-owned solar systems or electric vehicle chargers, before they have to put a halt to them to avoid grid imbalances or overloaded circuits.
“The question becomes, what kind of grid does our community really want?” Alvarez said. “How much cybersecurity is enough? How much distributed generation do we want to be ready for? These are community-wide questions that need to be answered and discussed among stakeholders.”
Malkin noted that some states are moving ahead with these kinds of discussions. “The one that immediately comes to mind is Massachusetts,” he said. “When the regulator up there, the Department of Public Utilities, decided to open an investigation into grid modernization, it wasn’t to look at a particular technology like AMI, or to look at a smart grid roadmap.” Instead, “It was to pose the question we posed in this report,” he said -- what should the grid of the future look like, and how should utilities, commercial and residential customers, and all other stakeholders, be expected to share the costs and rewards of that new model?
That process (PDF) has yielded a proposed regulatory model called the Utility of the Future, which is now being considered by the Massachusetts state legislature, as well as piloted in the state’s Worcester smart-grid test bed. Proposals elsewhere, such as the “Utility 2.0” concept being developed in Maryland, are aiming at similar goals. And California regulators recently held a forum to hash out such issues in light of a recently passed law, AB 327, which lays out big changes for state energy policy in terms of energy ratemaking, solar net metering and grid resiliency planning.
Whatever regulatory models emerge, they’re going to have to manage a whole set of grid technologies and business models that are just beginning to come into existence today. Here are some of the “implications” laid out in the GE Digital Energy/Analysis Group report:
- “Service classifications that reflect the ability to deliver higher levels of reliability in portions of the grid and that would allocate the costs to customers who value and are willing to pay for more reliable or resilient service.”
- “A balanced treatment of customer-owned generation and standby distribution rates that reflects the system and reliability benefits of integrating distributed resources into the planning and operation of networked distribution systems, but also appropriately allocates fixed distribution costs to these customers.”
- “Dynamic competitive prices that are transparent and made available in a manner that facilitates efficient automated responses by devices in customers’ homes and businesses.” If organized in a way that prevented tampering or gaming by energy market participants, energy-smart devices like these “could improve asset utilization, reduce investment requirements and customer costs, enhance system resilience, and facilitate the integration of variable renewable resources.”
“We do not expect regulators to suddenly wake up and wholeheartedly embrace and adopt this model,” Malkin said. But certainly they ought to be asking themselves, “Are utilities properly incentivized to make the investments that we as ratepayers, we as a country, think we ought to provide, and what they ought to provide in the future?”