Recent Posts:

CIA’s In-Q-Tel invests in Tendril

Rob Day: September 29, 2005, 7:00 PM
Following on the heels of the Sensicast funding announcement earlier this week, Tendril Networks announced an investment and business relationship with In-Q-Tel, the CIA's venture finance organization. Financial details weren't released.

Sensing technology has been developing rapidly (better sensitivity, lower power requirements, improved automation, etc.), and at the same time so has machine-to-machine communications technology, in terms of hardware, software, standards development, etc. The Sensicast and Tendril announcements this week aptly illustrate the interest by funders who see opportunities with such startups to accelerate adoption of these technologies, for applications in environmental monitoring, manufacturing efficiency, energy and water distribution system automation, and other cleantech investment areas.

Sensicast, Nysa Membrane, and six months

Rob Day: September 26, 2005, 9:09 PM
  • Sensicast, which provides wireless communications for sensor networks for industrial and energy-system applications, announced a $13M round led by Global Environment Fund, along with existing investor Ardesta. Sensicast is in trials with GE and others, and is developing applications aimed specifically at boosting energy efficiency.
  • Burlington, Ontario's Nysa Membrane Technologies announced a C$2M Series A. The company promises "truly disruptive" performance from their filtration and separation technology. Initial applications appear to be largely targeted at the medical markets, but membrane technologies are often also applicable for cleantech uses, and the company is specifically targeting food and beverage separation, which can be very wasteful processes. The round was led by Golden Horseshoe Life Sciences and Emerging Technologies Venture Fund (GHFV) and the Technology Seed Investments Group of the Business Development Bank of Canada (BDC), with MedInnova Partners Inc. (MPI) also participating.
  • Finally, last week marked the six month anniversary of the start of this site. It's certainly been more widely-read than I would have ever expected. According to the (somewhat incomplete and overlapping) data the various trackers provide, there have been almost 30,000 hits on the site over the past six months, or an average of 160 per day. I certainly hope this site has been helpful for cleantech investors and those interested in cleantech venture investing -- please don't hesitate to contact me with your tips, feedback, and suggestions. Thanks for visiting!

The intelligent grid pt. 3: Transmission capacity

Rob Day: September 21, 2005, 11:28 PM

More answers to the questions posed by Matt Marshall, who wrote:

“Locally, [the intelligent grid] is significant because the California Independent System Operator, which oversees most of the state's electricity system, has just approved a $300 million transmission line that brings power into S.F. from Pittsburg under the S.F. Bay. That's a whopping-big line. Sounds like the opposite of the Intelligrid, but maybe we're missing something.”

Looking at the chart of electricity prices mentioned in the last post (again, you have to scroll down on the page to see the chart), it's striking how much prices vary across the country. Why is that? Because of lack of transmission capacity. Hydro power is (once the dam is built) really cheap. Coal power is relatively cheap. Natural gas-fired power is not nearly so cheap. So areas with access to cheap power (e.g., the Pacific Northwest's hydro, or the Midwest's coal-fired plants) tend to have the lowest cost of electricity.

But if our major transmission grid worked efficiently, we would simply move power to where it's needed, from where it's cheapest to produce. And then prices would self-adjust, right? There would still be some slight variation in pricing due to losses during transmission, but they would be minor, not the 2x or more variation seen in the chart.

There are two macro problems with the current layout of the transmission grid -- there's not enough capacity, and there's not enough connectedness.

With enough capacity, you could ship all of the cheap hydro power from the Northwest down to where it's most needed in California. Some of that does happen (and energy traders make good money off of facilitating that). But just look at the prices -- clearly, it's not done efficiently. Indeed, we don't really have one single American power grid. We have many different grids tied together at "interties" with very limited capacity. Anyone who wants to learn more about the power congestion situation should definitely check out this report from the DOE (pdf).
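To make the capacity point concrete, here's a toy Python sketch of two regions joined by a limited intertie. All the costs and capacities below are invented for illustration, but the sketch shows why prices fail to converge once the intertie is full -- the expensive local plant becomes the marginal, price-setting unit:

```python
# Toy model: two regions connected by an intertie with limited capacity.
# All numbers are invented for illustration only.

def regional_price(demand_gw, local_cost, import_cost, intertie_capacity_gw):
    """Price in the importing region under a limited intertie.

    If imports can serve all demand, the cheap remote generation sets
    the price; once the intertie is full, the expensive local plants
    must run, and they set the marginal price instead.
    """
    imports = min(demand_gw, intertie_capacity_gw)
    if imports >= demand_gw:
        return import_cost   # cheap imported power sets the marginal price
    return local_cost        # local (expensive) plant is marginal

HYDRO_COST = 0.03   # $/kWh, cheap Northwest hydro (illustrative)
GAS_COST = 0.09     # $/kWh, local gas-fired generation (illustrative)

for capacity in (12, 6, 2):   # GW of intertie capacity
    price = regional_price(demand_gw=10, local_cost=GAS_COST,
                           import_cost=HYDRO_COST,
                           intertie_capacity_gw=capacity)
    print(f"intertie {capacity:>2} GW -> price {price * 100:.0f} cents/kWh")
```

With a big enough intertie the cheap power sets the price everywhere; with a constrained one, the price spread persists no matter how much cheap hydro sits on the other side.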

Besides the lack of capacity, the grid is not as well meshed as would be hoped. In an effective mesh system, when one link goes down, the load is easily apportioned out across several other links, and power still gets to where it needs to be without significant strain on the system. But no one started out building a national grid that would resemble such a perfect mesh system, so we don't have one. Instead, there are a few critical links out there that, if they go down, take a lot of the system with them. And you get what you had in LA last week, where someone cuts the wrong cable and a major city loses power.
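For the technically inclined, here's a small pure-Python sketch of the mesh idea: it checks which lines in a toy network are "critical," meaning a single failure splits the grid. The four-node topologies are invented for illustration:

```python
from collections import deque

def is_connected(nodes, lines):
    """Breadth-first search: can power still reach every node?"""
    adjacency = {n: set() for n in nodes}
    for a, b in lines:
        adjacency[a].add(b)
        adjacency[b].add(a)
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in adjacency[queue.popleft()]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen == set(nodes)

def critical_lines(nodes, lines):
    """Lines whose single failure splits the network (no N-1 redundancy)."""
    return [line for line in lines
            if not is_connected(nodes, [l for l in lines if l != line])]

nodes = {"A", "B", "C", "D"}
sparse = [("A", "B"), ("B", "C"), ("C", "D")]            # a chain
meshed = sparse + [("A", "C"), ("B", "D"), ("A", "D")]   # extra ties

print("critical in sparse grid:", critical_lines(nodes, sparse))
print("critical in meshed grid:", critical_lines(nodes, meshed))
```

In the chain, every single line is critical; add the extra ties and none are. That's the redundancy argument in miniature -- and the LA incident shows what happens when a real network looks more like the chain.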

We also need the links in the grid to be smarter and more efficient, but that's to be discussed in the next and final post of this series, where we'll describe some of the technology areas where innovative startups are addressing the needs of the intelligent grid, and finding success.

Let's go back to the question implied by Matt from above: Should we spend $300M on transmission capacity, or should we spend it encouraging more distributed renewable generation?

To which the cleantech investor says, “Yes.”

But joking aside, I’ll simply point out that solar resources and wind resources, just like hydro power resources and coal-fired generation plants, are not evenly dispersed across the country. So even if you encourage distributed, renewable energy generation, again you have the situation where power can be generated in one place a lot cheaper than it can be produced elsewhere. And it so happens that the best places for solar (e.g., deserts) and wind (e.g., offshore, and the Great Plains) aren’t really close to consumers. Indeed, some pundits say that the biggest obstacle to even more adoption of wind generation (as if it wasn’t already growing incredibly rapidly) is lack of adequate transmission capacity in locations where wind farms can be best built.

So while there’s a strong case to be made for further building out renewable distributed energy generation capacity, there’s a strong case to be made as well for building out transmission capacity and interconnectedness.

The intelligent grid pt. 2: Distributed generation and the grid

Rob Day: September 21, 2005, 9:18 PM
This post follows up on the first part of my answer to the questions posed by Matt Marshall, in which we discussed why things change slowly at utilities and on the grid.

In his smart column, Matt describes a vision of an electricity transmission and distribution grid that is strengthened by more use of distributed generation sources (hopefully powered by renewables). As he writes:
"Instead of hundreds of power plants, such as giant 1,200 megawatt nuclear power plants, powering our needs, we should have millions of solar powered residences and workplaces -- or at least a power source closer to their destination -- so that a good fraction of the power doesn't get wasted in transportation. And that way, a single error at one plant, or cut in an electrical transmission line, doesn't shut down the power of 2 million people, as it did last month yet again in Southern California. Utilities would have to buy into idea, which arguably is against their interests -- they'd lose control."
So is the lack of an intelligent grid holding back such a vision? The answer is both yes and no.

What's holding back distributed generation?

Yes, there isn't enough use of distributed generation yet. And I would also throw demand reduction into the mix -- if the idea is to reduce our dependence on an outdated transmission and distribution grid, even better than generating power close to demand would be not needing it in the first place, or being able to reduce demand instantly in response to tight supply or problems with the grid.

Not all utilities have bought into the concept yet, it's true. Systems engineers are naturally reticent to deal with the complications that result when thousands of small generation sources start to interact with an already hard-to-control, finely-balanced grid. And some utilities have taken this view in the past and tried to make it difficult for end users to set up distributed generation systems.

It's important to realize, however, that much of the policy framework is already in place to enable more use of distributed generation. Across the U.S., federal law requires that anyone generating excess power be able to sell it at a minimum of wholesale prices (the price a utility pays). As this chart shows, in 37 states some form of "net metering" is allowed by law. Net metering allows anyone who generates more power than they themselves need to sell the excess back onto the grid at retail prices (the price you and I pay).
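To see why the retail-versus-wholesale distinction matters so much, here's a toy Python sketch; the rates and the monthly excess figure are invented for illustration:

```python
# Toy comparison of what excess home generation earns under wholesale
# compensation vs. retail-rate net metering. All numbers are invented.

RETAIL_RATE = 0.12      # $/kWh, what you pay the utility (illustrative)
WHOLESALE_RATE = 0.04   # $/kWh, what the utility pays generators (illustrative)

excess_kwh_per_month = 200   # power generated beyond the home's own use

wholesale_credit = excess_kwh_per_month * WHOLESALE_RATE
net_metering_credit = excess_kwh_per_month * RETAIL_RATE

print(f"wholesale credit:    ${wholesale_credit:.2f}/month")
print(f"net metering credit: ${net_metering_credit:.2f}/month")
# Net metering at retail roughly triples the value of excess power here,
# which is why it matters so much for distributed-generation economics.
```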

Now, in a lot of these states there are a lot of exclusions and limits, and some states don't allow net metering at all, so there's a lot of room for further change. But importantly, in major markets like California, solar and wind net metering is already allowed for all customers. Readers who want to track the policy details can consult the IssueAlerts from Utilipoint, among other sources. But the above should help illuminate that, while there's certainly room for some policy shifts to encourage the intelligent grid, that's not the whole story.

Costs and awareness

What really held back distributed generation and distributed "demand response" activities were two things: The cost of the energy produced, and lack of awareness.

Most distributed generation technologies are only now becoming price competitive with retail electricity rates, and only in the higher-priced areas. As the chart on this page shows (scroll down), retail prices vary widely across the U.S. Whatever region you're in, and even though selling back to the grid helps ease concerns about over-building a system and wasting capacity, you're only going to build significant distributed generation capacity if the price you'll receive makes it worth it. So, for instance, in California you need to be able to make the electricity for a net cost of around 12 cents per kilowatt hour or less (yes, this is an oversimplification, but please allow the bigger point...). As this site shows, renewables are only barely reaching this level:
  • Solar is 20-40 cents per kWh
  • Microturbines and some fuel cells are 10-15 cents per kWh
  • Wind is 5-10 cents per kWh, but with large-scale turbines (that few would park in their back yard)
For much of the country, these costs are still higher than retail electricity prices.
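To make the break-even point concrete, here's a quick Python sketch. It's a toy, using only the rough cost ranges listed above and the oversimplified 12 cents/kWh California benchmark from the text, to check which technologies clear the retail-rate bar:

```python
# Compare the post's rough generation-cost ranges against a retail rate.
# Ranges are the ones quoted above; the 12 cents/kWh benchmark is the
# oversimplified California figure from the text.

RETAIL_RATE = 12.0   # cents/kWh

cost_ranges = {                       # cents/kWh, (low, high) estimates
    "solar":                    (20, 40),
    "microturbines/fuel cells": (10, 15),
    "wind (large turbines)":    (5, 10),
}

for tech, (low, high) in cost_ranges.items():
    if high <= RETAIL_RATE:
        verdict = "competitive"
    elif low <= RETAIL_RATE:
        verdict = "marginal (competitive only at the low end)"
    else:
        verdict = "not yet competitive"
    print(f"{tech:27s} {low:>2}-{high:<2} cents/kWh -> {verdict}")
```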

Furthermore, while net metering policies often allow such distributed generation to be sold back onto the grid at retail rates, if everyone starts doing it, utilities are going to clamor for some relief -- they do, after all, maintain the transmission and distribution grid for everyone else, and if they buy electricity at the retail rate and sell it at the retail rate, it gets pretty tough to pay expenses. This then bogs down in a policy and regulatory debate best left to other venues than this site. But suffice it to say, the price distributed generators can get for their excess power is likely to go down over time if distributed generation rises in usage.

But of course, the cost of power from emerging distributed generation technologies is going down rapidly, too, and in many places policies are in place to subsidize them further. Which is why the market for solar, wind, fuel cells, etc. is growing so quickly.

The other real issue is awareness. Companies don't realize that there are vendors out there now who have developed systems to help them significantly reduce their energy usage at peak-demand times, without them even being able to tell that a change took place. Homeowners simply don't realize that, by taking advantage of subsidies and net metering, they can often save money in the long run by filling their roof with solar panels (e.g., 10 year payback or better, compared to a 20 year system lifetime), or by putting in another type of grid-tied generation system. Or they may feel that it is an aesthetically unappealing proposition. But this lack of awareness, too, is changing rapidly.
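Here's the payback arithmetic mentioned above in a toy Python sketch -- every figure is an invented example, not a quote of actual system prices or subsidy levels, but it illustrates the "10-year payback on a 20-year system" logic:

```python
# Rough payback arithmetic for a grid-tied solar system, illustrating
# the "10-year payback, 20-year lifetime" point. Every figure below is
# an invented example, not a quote of real prices or subsidy levels.

system_cost = 8_000.0         # installed cost after subsidies, $
annual_output_kwh = 6_500     # what the panels produce in a year
retail_rate = 0.12            # $/kWh avoided (via net metering)
lifetime_years = 20

annual_savings = annual_output_kwh * retail_rate
payback_years = system_cost / annual_savings
lifetime_gain = annual_savings * lifetime_years - system_cost

print(f"annual savings:    ${annual_savings:,.0f}")
print(f"payback period:    {payback_years:.1f} years")
print(f"net lifetime gain: ${lifetime_gain:,.0f}")
```

With these (invented) inputs the system pays for itself in about ten years and then keeps saving money for another ten -- which is the pitch most homeowners simply haven't heard yet.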

So with policies already somewhat in place, attitudes and awareness changing, and costs coming down rapidly, it's easy to understand why emerging distributed energy technologies are getting a lot of interest from the venture investing community. And why the intelligent grid is coming, albeit slowly.

A few quick items

Rob Day: September 20, 2005, 12:36 PM
  • Plasco Energy, which has developed plasma arc technology for waste-to-energy and coal treatment applications, announced they raised a C$7.3M round (including C$4.5M from lead funder Killick Capital) to fund facilities in Ottawa and Barcelona. A pdf of the announcement is available here.
  • Today's fun read: This article from the Montreal Gazette about attempts to create add-ons to the internal combustion engine to make it burn more efficiently. Certainly, the internal combustion engine isn't going away anytime soon.
Free registrations required for many of the above links...

The intelligent grid is alive and well… but maturing very slowly

Rob Day: September 19, 2005, 8:33 PM
Matt Marshall, of the San Jose Mercury News and SiliconBeat, asks the question (directly of us, it must be pointed out), "is [the intelligent grid] really just politically dead in the water?"

In this post and a few to follow over the next couple of days, I hope to share with readers my viewpoint that the intelligent grid is indeed alive, and that there are innovative cleantech VC-backed companies poised to make a big push forward as part of the grid's evolution, with big market growth potential -- but we should also expect it to develop very slowly, and this has significant implications for venture investors as well.

The background: Why things move slowly

In this post, let's just discuss a little background. First of all, what is the intelligent grid? EPRI's IntelliGrid effort provides a good scenario-based overview, and Matt's post discusses a vision for the intelligent grid that places more emphasis on distributed generation. Essentially, when we talk about an intelligent grid, we're talking about an electricity transmission and distribution (and to an extent, generation) system that is "smart" enough to recognize potential problems, communicate such conditions to a central decision-maker (ie: computer), and automatically correct for the problems. In the EPRI scenario, a major transmission line outage is detected, corrected for by a number of means, and adapted to until a permanent fix to the transmission line can be arranged.
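For a flavor of what "sense, communicate, and correct" means in practice, here's a minimal Python sketch of that control loop. The Line class, loads, and thresholds are all hypothetical -- this illustrates the pattern the EPRI scenario describes, not any real utility system:

```python
# A minimal sketch of the "sense, communicate, correct" loop behind the
# intelligent-grid idea. The Line class, readings, and thresholds are
# all hypothetical -- this illustrates the control pattern only.

from dataclasses import dataclass

@dataclass
class Line:
    name: str
    load_mw: float
    capacity_mw: float

def detect_problems(lines):
    """Sense: flag lines running too close to their limit."""
    return [l for l in lines if l.load_mw > 0.9 * l.capacity_mw]

def correct(line, lines):
    """Correct: shift load off the stressed line onto lines with headroom."""
    spare = [l for l in lines if l is not line and l.load_mw < l.capacity_mw]
    excess = line.load_mw - 0.8 * line.capacity_mw
    for other in spare:
        shift = min(excess / len(spare), other.capacity_mw - other.load_mw)
        other.load_mw += shift
        line.load_mw -= shift

grid = [Line("north", 95, 100), Line("east", 40, 100), Line("west", 50, 100)]

for stressed in detect_problems(grid):                             # sense
    print(f"alert: {stressed.name} at {stressed.load_mw:.0f} MW")  # communicate
    correct(stressed, grid)                                        # correct

for l in grid:
    print(f"{l.name}: {l.load_mw:.0f}/{l.capacity_mw:.0f} MW")
```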

Many readers may assume that this is the way our electricity grid already works. After all, we expect that when we flick a switch, our lights will come on, and (with rare but sometimes spectacular exceptions) they generally do. We know it's a complex network; surely it must be controlled by a smart, centralized, automated decision-making system. And computers are everywhere these days, right? But the simple fact is that our electric system wasn't designed for this level of complexity and interconnectedness, it is not very well automated, and it's frankly amazing (and a credit to the transmission engineers out there) that this somewhat archaic network continues to work while being held together by bubble gum and baling wire (metaphorically speaking, of course... well, kind of).

Just to give a small tangible sense of the magnitude of the problem: In many neighborhoods around the U.S., do you know how a utility learns you have a power outage on your block? They don't get a signal back at "home base", much less an automated "fix it" message to work crews. Instead, they depend on getting a telephone call from you, complaining that your lights are out. Only then do they know to begin looking for a problem.

Oh, and then there's the fact that the system is outdated and starting to fall apart. It's worth repeating Nancy Floyd's quote from Red Herring, also quoted by Matt:
The big infrastructure problem is the aging grid, and the whole automation area. The average age of transformers is 38 years, and their design life is 40. Almost every week transformers explode, causing outages, costing money, even killing people. And how do they find out if a transformer is going to fail? They send someone out to take a sample of oil from the transformer and send it to a lab to get results a week later.
So you can see why many have called for the development of an intelligent electrical grid. As we'll discuss in a later post, today's grid isn't set up for tomorrow's power supply. And it's just a matter of time before another bad blackout happens. And then another after that. A smart grid could help address this.

Matt posits that utilities seem to be resistant to the smart grid concept. But here's the first important thing to know about our electricity network:

Utilities are profit-maximizing, but they don't make money in the way you would expect

How does a for-profit electric utility make money? First of all, you have to understand that they are regulated by a state's public utility commission (PUC). The PUC is either directly elected by the populace, or its members are appointed by politicians -- so the PUC is highly sensitive to political forces. The PUC usually grants the utility a rate of return on assets. Then, they negotiate with the utility to see how many assets there should be. Then, they negotiate with the utility to see what the expenses should be. Finally, they determine a rate "schedule" (ie: a set of different rates for different types of customers) that allows the utility, with an established set of assets and an agreed-upon estimate of costs, to achieve their rate of return.
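Here's the rate-setting arithmetic in a toy Python sketch, with invented figures, just to show how allowed revenues and rates fall out of the negotiated rate base and expenses:

```python
# Back-of-the-envelope sketch of rate-of-return regulation as described
# above. All figures are invented for illustration.

rate_base = 1_000_000_000       # negotiated asset base, $
allowed_return = 0.10           # rate of return the PUC grants
expenses = 400_000_000          # negotiated annual operating expenses, $
expected_sales_kwh = 6_000_000_000

revenue_requirement = expenses + allowed_return * rate_base
average_rate = revenue_requirement / expected_sales_kwh

print(f"revenue requirement: ${revenue_requirement / 1e6:,.0f}M/year")
print(f"average rate:        {average_rate * 100:.1f} cents/kWh")
# Note the incentive: earnings scale with the rate base (assets), not
# with cost savings -- and the next rate case competes savings away anyway.
```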

As you might imagine, both the PUC and the utility have a lot of various factors to consider. The PUC's job is to a) make sure that the lights stay on; but b) keep rates low. This sets up the strong incentive to keep assets and expenses as low as possible, and to make sure that there is very little "risky" activity going on that could lead to a problem. I.e.: "We don't want the utility to experiment with new technology."

At the same time, the utility's profit motive to push costs down as much as possible is somewhat undermined. Once a "rate case" has been established with the PUC, it's usually a couple of years before the next one. So if the utility can lower their expenses during that time, they might be able to sneak some extra profits in the short term. But the next time around, the PUC will just adjust their expectations of expenses down, and lower the rates accordingly, so any profit to the utility from lowering expenses is short-lived. And those lowered expenses usually mean layoffs -- for instance, if you put in an automated meter reading system so that you don't need people physically walking from meter to meter to read each household's electricity usage each quarter, that could mean big savings... because you get rid of the meter readers. Such is the price of progress in a normal industry, but in an industry that is so politically managed, it's not a very popular move. So why invest in a technology to improve reliability or otherwise reduce costs, if it's not going to gain much in terms of long-term earnings, and it's going to make you some enemies? Thus, the utility becomes reticent to try new technology, even when it would save some money.

So as you can see, both the utility and the PUC have an incentive to save costs, but mostly only in the very short-term, and certainly not to the extent that most commodity industries (hey, electrons are electrons) would experience. At the same time, they're reticent to invest in new capacity and backup plans unless and until absolutely necessary. There are very real reasons for such a system to be in place, based around the difficult economic concept of the natural monopoly, and this site is not going to address the public, political debate around the design of the rate-setting system. But from the above, you can see that it is an important factor in how things work, and how fast things change.

At the level of the transmission grid (so: larger power lines, not the neighborhood-level distribution network), it's even more complex. There are Regional Transmission Organizations (RTOs), quasi-governmental organizations running their own transmission networks (e.g., the Bonneville Power Administration), energy traders, utilities both for-profit and not-for-profit, Independent System Operators (ISOs), etc., all making investment and usage decisions. You have spot markets for power, but only at critical junctions, and then you have to pay additional rates for the right to transport that power to wherever it's actually needed. And on top of all that, the lines have to maintain a fairly precise balance at all times -- you can't let the load get too high on any single link of the transmission network, or it could go out. And then you get a Northeast Blackout scenario. To get just a sense of the complexity, see how the Western Area Power Administration describes their role... Essentially, our transmission grid is an inherently chaotic, finely-balanced system, which a disparate set of decision-makers are managing (barely) to keep under control.

Which brings us to the second important piece of context:

Engineers at utilities are very slow to adopt new technology

And for good reason. Utilities are supposed to provide power when and where it's required. That's their mandate, and the reason they are granted the ability to operate as local monopolies. Therefore, the best way to screw up a perfectly good 40+ year career as a dedicated utility engineer is to install a system, or otherwise allow a change to the system, that causes the lights to go out.

So put yourself in the shoes of an electricity transmission engineer, and imagine a small startup approaches you about putting an automated decision-making unit out in the field, controlling a critical relay point, in response to market signals or some such. Your desktop PC crashes once a week (utilities also don't invest much in IT), the startup doesn't have any other big customers yet, and yet they're asking you to hand over a part of your network to a computer that will be doing things out of your control. No thank you. Similarly, in a topic we'll discuss more in a later post, these engineers are going to be naturally reticent to enable a bunch of random small generators (ie: solar panels on houses, fuel cells at Cisco, etc.) to feed power into the grid in ways outside of their centralized control, no matter how much sense it makes at a high level.

When considered in light of the larger economic incentive problems described above, it's easy to see why a) selling a new technology to utilities is a chicken and egg problem -- they all want to be the 3rd utility to use a product, definitely not the first; and b) even when you do get a utility interested, they're going to perform a pilot project, and then another pilot project, and then another pilot project, and then a limited roll-out, etc. And thus, the sales cycle can easily take years.

Which brings us to the final point to address in this already too-long post:

Everyone admits the system has to change, but no one agrees on exactly how

The obstacle to fixing all of this is not necessarily political feasibility. Yes, PUCs don't want to have to raise rates. But they also recognize that the system is getting to be unsustainable. Governmental groups from the Western Governors' Association to the Federal Energy Regulatory Commission (FERC) to state legislators to the U.S. Congress have all recognized that we need to upgrade our network. There are a lot of competing ideas about how to do this. People are committed to making it happen. There have been some really promising leadership efforts by some ISOs and utilities.

But there is also a lot of regulatory uncertainty. Attempts were made to deregulate the network, in the hope that that would provide better incentives for utilities and operators to make investments. Ask any Californian how that worked out. So now such deregulation efforts have been scaled back. Laws passed by Congress have created new types of quasi-governmental organizations, which are still being formed and developed. In such an uncertain regulatory and political environment, I've worked with utilities that didn't want to make investments in transmission infrastructure simply because they weren't even sure whether the state was going to take it from them. Will utilities be forced to sell their transmission lines to the RTOs? At what price? Again, not a topic for this site to discuss, but readers can get a sense of the somewhat paralyzing uncertainties.

At the same time, there's also a lot of technology uncertainty. Granted, it's easy to say that any improvement is a worthwhile improvement. But think back on Nancy Floyd's powerful quote from above -- these utilities are making 40 year investments in as-yet unproven technologies. And even if the technology seems proven, will another technology that comes along a year later blow it away? For example, it's relatively easy to design an electricity meter that can communicate with the utility, so that you can get automated meter reading (AMR). But what communication link are you going to use? Cellular technology? The existing phone line to the home? Broadband over powerline? If you choose the wrong one, there could be trouble both with your customers and with the PUC. And then you have to integrate your meter reading systems with your IT infrastructure. And also, you have to make sure that the system you use to roll out AMR to the first third of your territory today will work with the system you choose when you roll out meter reading technology to the last third in five years.

In such an environment of uncertainty, of course utilities are going to move very slowly. But the political will may be there to force faster changes. For example, in Ontario (the US and Canadian power grids are interconnected and similarly managed, so it's a relevant example) they are requiring smart metering across the entire province by 2010, with smart grid applications in mind. California is moving similarly. And new incentives are being put in place to encourage more investment in transmission capacity.

It's just going to take a long time.

I hope this first post on the topic provides some useful context. In following posts over the next couple of days/weeks (things are very busy, sorry), I'll try to more directly address the topics of distributed generation, cleantech and the smart grid, and what VC-backed companies are already doing to move the idea of the smart grid forward. But given the question that Matt asked about the political feasibility of the smart grid, I hope the above provides some context for thinking about the question and its implications for investors. This post may appear a bit discouraging -- but there are a lot of exciting things happening in smart grid technology that cleantech investors should be aware of, and we'll discuss some of that. It's not a question of "if", just "when" and "how".

[Update: Here are links to parts two and three of this series of posts]