Is 100% renewable energy realistic? Here’s what we know.

Reasons for skepticism, reasons for optimism, and some tentative conclusions.


The world has agreed to a set of shared targets on climate change. Those targets require deep (80 to 100 percent) decarbonization, relatively quickly.

What’s the best way to get fully decarbonized? In my previous post, I summarized a raging debate on that subject. Let’s quickly review.

We know that deep decarbonization is going to involve an enormous amount of electrification. As we push carbon out of the electricity sector, we pull other energy services like transportation and heating into it. (My slogan for this: electrify everything.) This means lots more demand for electricity, even as electricity decarbonizes.

The sources of carbon-free electricity with the most potential, sun and wind, are variable. They come and go on their own schedule. They are not “dispatchable,” i.e., grid operators can’t turn them on and off as needed. To balance out variations in sun and wind (both short-term and long-term), grid operators need dispatchable carbon-free resources.

Deep decarbonization of the electricity sector, then, is a dual challenge: rapidly ramping up the amount of variable renewable energy (VRE) on the system, while also ramping up carbon-free dispatchable resources that can balance out that VRE and ensure reliability.
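If you think in code, here's a toy version of that dual challenge. It's a minimal Python sketch with made-up hourly numbers (not real grid data), showing the "residual load" that dispatchable resources have to cover once wind and solar output is netted out of demand:

```python
# Toy residual-load calculation: how much dispatchable power a grid
# needs once variable renewables are netted out of demand.
# All numbers are illustrative, not real grid data.

demand_gw = [60, 55, 52, 58, 70, 85, 90, 88, 80, 72, 65, 62]  # hourly demand
wind_gw   = [30, 35, 40, 25, 10,  5,  8, 12, 20, 28, 33, 30]  # variable wind output
solar_gw  = [ 0,  0,  0,  5, 20, 35, 40, 35, 15,  2,  0,  0]  # variable solar output

# Residual load: whatever demand remains after VRE, clipped at zero.
# This is the gap dispatchable resources (storage, hydro, gas, nuclear...)
# must fill every hour, whatever the weather does.
residual = [max(d - w - s, 0) for d, w, s in zip(demand_gw, wind_gw, solar_gw)]

# Surplus: hours when VRE exceeds demand and must be stored or curtailed.
surplus = [max(w + s - d, 0) for d, w, s in zip(demand_gw, wind_gw, solar_gw)]

print("peak residual load (GW):", max(residual))  # sizes the dispatchable fleet
print("total surplus (GWh):", sum(surplus))       # sizes storage/curtailment
```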

Two potentially large sources of dispatchable carbon-free power are nuclear and fossil fuels with carbon capture and sequestration (CCS). Suffice it to say, a variety of people oppose one or both of those sources, for a variety of reasons.

So then the question becomes, can we balance out VRE in a deeply decarbonized grid without them? Do our other dispatchable balancing options add up to something sufficient?

That is the core of the dispute over 100 percent renewable energy: whether it is possible (or advisable) to decarbonize the grid without nuclear and CCS.

In this post I’m going to discuss three papers that examine the subject, try to draw a few tentative conclusions, and issue a plea for open minds and flexibility. It’ll be fun!

——

Two papers circulated widely among energy nerds in 2017 cast a skeptical eye on the goal of 100 percent renewables.

One was a literature review on the subject, self-published by the Energy Innovation Reform Project (EIRP), authored by Jesse Jenkins and Samuel Thernstrom. It looked at a range of studies on deep decarbonization in the electricity sector and tried to extract some lessons.

The other was a paper in the journal Renewable and Sustainable Energy Reviews that boasted “a comprehensive review of the feasibility of 100% renewable-electricity systems.” It was by B.P. Heard, B.W. Brook, T.M.L. Wigley, and C.J.A. Bradshaw, who, it should be noted, are advocates for nuclear power.

We’ll take them one at a time.

Most current models find that deep decarbonization is cheaper with dispatchable power plants

Jenkins and Thernstrom rounded up 30 studies on deep decarbonization, all published since 2014, when the most recent comprehensive report was released by the Intergovernmental Panel on Climate Change (IPCC). The studies focused on decarbonizing different areas of different sizes, from regional to global, and used different methods, so there is not an easy apples-to-apples comparison across them, but there were some common themes.

To cut to the chase: The models that optimize for the lowest-cost path to zero-carbon electricity — and do not rule out nuclear and CCS a priori — generally find that it is cheaper to get there with them than without them.

Today’s models, at least, appear to agree that “a diversified mix of low-CO2 generation resources” adds up to a more cost-effective path to deep decarbonization than 100 percent renewables. This is particularly true past 60 or 80 percent decarbonization, when the costs of the renewables-only option rise sharply.

Again, it’s all about balancing out VRE. The easiest way to do that is with fast, flexible natural gas plants, but you can’t get past around 60 percent decarbonization with a large fleet of gas plants running. Getting to 80 percent or beyond means closing or idling lots of those plants. So you need other balancing options.

One is to expand the grid with new transmission lines, which connects VRE over a larger geographical area and reduces its variability. (The wind is always blowing somewhere.) Several deep decarbonization studies assume a continental high-voltage super-grid in the US, with all regions linked up. (Needless to say, such a thing does not exist and would be quite expensive.)

[Figure: One conceptual example of a US-wide supergrid. (AWEA, via Wikipedia)]
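Why does a bigger grid smooth things out? Because sites that aren't perfectly correlated partially cancel each other's swings when pooled. Here's a minimal simulation sketch, using randomly generated wind profiles rather than real data:

```python
import random
import statistics

random.seed(0)

# Simulate hourly capacity factors for several wind sites. Each site is
# noisy on its own; sites are only loosely correlated (a shared "weather"
# term plus independent local noise). Purely illustrative numbers.
hours = 1000
weather = [random.gauss(0.35, 0.10) for _ in range(hours)]  # shared regional signal

def site_output(shared):
    # Each site sees the regional weather plus its own local variation,
    # clamped to a 0-1 capacity factor.
    return [min(max(w + random.gauss(0, 0.15), 0), 1) for w in shared]

sites = [site_output(weather) for _ in range(10)]
pooled = [sum(col) / len(col) for col in zip(*sites)]  # interconnected average

print("single-site std dev:", round(statistics.stdev(sites[0]), 3))
print("pooled 10-site std dev:", round(statistics.stdev(pooled), 3))
# The pooled output is noticeably smoother, though the shared weather
# term never averages away. That's why transmission alone can't fully
# de-risk a continental wind fleet.
```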

The other way to balance VRE is to maximize carbon-free dispatchable resources, which include dispatchable supply (power plants), dispatchable demand (“demand management,” which can shift energy demand to particular parts of the day or week), and energy storage, which acts as both supply (a source of energy) and demand (a way to absorb it).

Energy storage and demand management are both getting better at balancing out short-term (minute-by-minute, hourly, or daily) variations in VRE.

But there are also monthly, seasonal, and even decadal variations in weather. The system needs to be prepared for worst-case scenarios: long, concurrent stretches of high cloud cover and low wind. That adds up to a lot of backup.

We do not yet have energy storage at anything approaching that scale. Consider pumped hydro, currently the biggest and best-developed form of long-term energy storage. The EIRP paper notes that the top 10 pumped-hydro storage facilities in the US combined could “supply average US electricity needs for just 43 minutes.”
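For a sense of scale, here's the back-of-envelope math behind that "43 minutes" figure. (The roughly 450 GW average US load is my assumption for illustration, not a number from the EIRP paper.)

```python
# Back-of-envelope: what "43 minutes of average US demand" implies about
# pumped-hydro energy capacity. The ~450 GW average-load figure is an
# assumption for illustration, not a number from the EIRP paper.
avg_us_load_gw = 450
minutes = 43

implied_storage_gwh = avg_us_load_gw * minutes / 60
print(f"implied top-10 pumped-hydro capacity: ~{implied_storage_gwh:.0f} GWh")
# ~322 GWh, a rounding error next to the ~10,000+ GWh the grid consumes
# per day, let alone weeks of backup.
```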

Currently, the only low-carbon sources capable of supplying anything like that scale are hydro, nuclear, and (potentially) CCS.

So if you take nuclear and CCS off the table, you’re cutting out a big chunk of dispatchable capacity. That means other dispatchable resources have to dramatically scale up to compensate — we’d need a lot of new transmission, a lot of new storage, a lot of demand management, and a lot of new hydro, biogas, geothermal, and whatever else we can think of.

Even with tons of new transmission, we’ll still need a metric shit-ton of new storage. Here’s a graph for comparison:

[Chart: Energy storage needs for deep decarbonization. (EIRP)]

The US currently has energy storage capacity for around an hour of average electricity consumption. Only 15 weeks, six days, and 23 hours to go!
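That quip is just arithmetic: the chart implies something like 16 weeks of storage, and we have about an hour. A quick sanity check:

```python
# Sanity-check the joke: if deep decarbonization without big dispatchable
# plants implies roughly 16 weeks of storage (per the EIRP chart) and we
# have about one hour today, how much is left to build?
target_hours = 16 * 7 * 24   # 16 weeks, in hours
have_hours = 1               # rough current US storage, in hours of average load

remaining = target_hours - have_hours
weeks, rem = divmod(remaining, 7 * 24)
days, hours = divmod(rem, 24)
print(f"{weeks} weeks, {days} days, {hours} hours to go")  # 15 weeks, 6 days, 23 hours
```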

Suffice to say, that would mean building a truly extraordinary amount of energy storage by mid-century.

It gets expensive, progressively more so as decarbonization reaches 80 percent and above. Trying to squeeze out that last bit of carbon without recourse to big dispatchable power plants is extremely challenging, at least for today’s models.

Thus, models that optimize for the lowest-cost pathway to deep decarbonization almost always include lots of dispatchable power plants, including nuclear and CCS.

“It is notable,” the review says, “that of the 30 papers surveyed here, the only deep decarbonization scenarios that do not include a significant contribution from nuclear, biomass, hydropower, and/or CCS exclude those resources from consideration a priori.”

To summarize: Most of today’s models place high value on large dispatchable power sources for deep decarbonization, and it’s difficult to muster enough large dispatchable power sources without nuclear and CCS.

100 percent renewables hasn’t been 100 percent proven feasible

The second review takes a somewhat narrower and more stringent approach. It examines 24 scenarios for 100 percent renewable energy with enough detail to be credible. It then judges them against four criteria for feasibility:

(1) consistency with mainstream energy-demand forecasts; (2) simulating supply to meet demand reliably at hourly, half-hourly, and five-minute timescales, with resilience to extreme climate events; (3) identifying necessary transmission and distribution requirements; and (4) maintaining the provision of essential ancillary services.

(“Ancillary services” are things like frequency regulation and voltage control, which keep the grid stable and have typically been supplied by fossil fuel power plants.)

Long story short, none of the studies passed these feasibility tests. The highest score was four points out of a possible seven.

The authors conclude that “in all individual cases and across the aggregated evidence, the case for feasibility [of 100 percent renewable energy] is inadequate for the formation of responsible policy directed at responding to climate change.”

That is the peer-reviewed version of a sick burn.

Note, though, that these are pretty tough criteria: Researchers must model a full electricity system, responsive to both short-term and long-term weather variations, meeting demand that is not appreciably different from mainstream projections, providing all needed services reliably, using only technologies already demonstrated at scale.

That’s not easy! It’s reasonable to ask whether we need that much confidence to begin planning for long-term decarbonization. If any new system must demonstrate in advance that it is fully prepared to substitute for today’s system, it’s going to be difficult to change the system at all.

(Renewables advocates might say that nuclear advocates have a vested interest in keeping feasibility criteria as strict and tied to current systems as possible.)

For more in this vein, see “A critical review of global decarbonization scenarios: what do they tell us about feasibility?” from 2014.

The question is how much our current decision-making should be constrained by what today’s models tell us is possible in the distant future.

Energy experts are more optimistic than their models

A third paper worth mentioning is 2017’s Renewables Global Futures Report (GFR) from global renewable-energy group REN21. In it, they interviewed “114 renowned energy experts from around the world, on the feasibility and challenges of achieving a 100% renewable energy future.”

There’s a ton of interesting stuff in the report, but this jumps out:

[Chart: Energy experts on the feasibility of 100% renewable energy. (REN21)]

That’s 71 percent who agree that 100 percent renewables is “reasonable and realistic.” Yet the models seem to agree that 100 percent renewables is unrealistic. What gives?

Models are only models

It pays to be careful with literature reviews. They are generally more reliable than single studies, but they are exercises in interpretation, colored by the assumptions of their authors. And there’s always a danger that they are simply compiling common biases and limitations in current models — reifying conventional wisdom.

There are plenty of criticisms of current models of how climate change and human politics and economics interact. Let’s touch on a few briefly, and then I’ll get to a few takeaways.

1) Cost-benefit analysis is incomplete.

Models that “minimize cost” rarely minimize all costs. They leave out many environmental impacts, along with more intangible social benefits like community control, security, or independence.

UC Berkeley’s Mark Delucchi, occasional co-author with Stanford’s Mark Jacobson of work on 100 percent WWS (wind, water, and sun — see more about that at the Solutions Project), says that the ideal analysis of deep decarbonization would involve a full cost-benefit analysis, taking all effects, “the full range of climate impacts (not just CO2), air-quality benefits, water-quality benefits, habitat destruction, energy security — everything you can think of,” into account. No one, he said, has done that for getting above, say, 90 percent WWS.

“My own view,” he told me, “which is informed but not demonstrated by my work on 100% WWS, is that the very large environmental benefits of WWS probably make it worth paying for close to — but not quite — a 100% WWS system. The ‘not quite’ is important, because it does look to me that balancing supply and demand when you get above 90-95% WWS (for the whole system) starts to get pretty expensive.”

In other words, full cost-benefit analysis is likely to offset higher renewables costs more than most models show.

2) Most models are based on current markets, which will change.

“Our traditional energy models are pretty clearly biased against a 100% renewable outcome,” Noah Kaufman told me. He worked on the “US Midcentury Strategy for Deep Decarbonization,” which the US government submitted to the UNFCCC in November 2016 as a demonstration of its long-term commitment to the Paris climate process. “Models like to depict the system largely as it exists today, so of course they prefer baseload replacing baseload.”

(Kaufman cautions that while current models may underestimate renewables, he doesn’t believe we know that with enough certainty “to mandate those [100% renewable] scenarios.”)

Price analyses based on current wholesale energy markets will not tell us much about markets in 20 or 30 years. VRE is already screwing up wholesale markets, even at relatively low penetrations, because the incremental cost of another MWh of wind when the wind is blowing is $0, which undercuts all competitors.
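To see why zero-marginal-cost wind scrambles those markets, consider a toy merit-order auction. The plants and numbers below are invented for illustration; real markets are vastly more complicated:

```python
# Toy merit-order market: plants bid their marginal cost, the cheapest
# run first, and the last plant needed sets the clearing price everyone
# receives. Plants and numbers are invented for illustration.
plants = [
    ("wind",    0.0,  40),   # (name, marginal cost $/MWh, capacity MW)
    ("nuclear", 10.0, 20),
    ("gas",     35.0, 50),
    ("peaker",  80.0, 30),
]

def clearing_price(demand_mw):
    served = 0.0
    for name, cost, cap in sorted(plants, key=lambda p: p[1]):
        served += cap
        if served >= demand_mw:
            return cost  # the marginal plant sets the price
    raise ValueError("not enough capacity")

print("price with little wind, demand 100 MW:", clearing_price(100))  # gas sets price
# Double the wind fleet: the same demand is met further down the merit
# order, and the clearing price everyone receives falls.
plants[0] = ("wind", 0.0, 80)
print("price with lots of wind, demand 100 MW:", clearing_price(100))
```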

Wholesale power markets will not survive in their current form. Markets will evolve to more accurately value a wider range of grid services — power, capacity, frequency response, rapid ramping, etc. — allowing VRE and its complements to creep into more and more market niches.

Financing will evolve as well. As it gets cheaper, VRE and storage start looking more like infrastructure than typical power plant investments. Almost all the costs are upfront, in the financing, planning, and building. After that, “fuel” is free and maintenance costs are low. It pays off over time and then just keeps paying off. Financing mechanisms will adapt to reflect that.
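A simple levelized-cost sketch shows why financing terms matter so much for capital-heavy resources. All the inputs below are illustrative assumptions, not real project numbers:

```python
# Levelized cost of energy for a capital-heavy plant: almost all cost is
# upfront, so the discount rate (i.e., financing terms) dominates.
# Inputs are illustrative assumptions, not real project numbers.

def lcoe(capex, annual_output_mwh, annual_opex, rate, years):
    # Discount both costs and output back to present value.
    pv_costs = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
    pv_output = sum(annual_output_mwh / (1 + rate) ** t for t in range(1, years + 1))
    return pv_costs / pv_output  # $/MWh

capex = 120_000_000  # upfront cost of a hypothetical 100 MW wind farm
output = 300_000     # MWh/year (~34% capacity factor)
opex = 3_000_000     # $/year, low because "fuel" is free

for rate in (0.03, 0.07, 0.10):
    print(f"discount rate {rate:.0%}: LCOE ~${lcoe(capex, output, opex, rate, 25):.0f}/MWh")
# Cheaper, infrastructure-style financing cuts the levelized cost sharply,
# which is the point: financing mechanisms shape VRE economics.
```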

3) Most models do not, and cannot, model emerging solutions or current costs.

Most energy models today do not account for the full complement of existing strategies to manage and expand VRE — all the different varieties of storage, the growing list of demand-management tools, new business models and regulations — so they neither are, nor claim to be, definitive.

“I don’t want to overstate or improperly extract conclusions from my work,” cautions NREL’s Bethany Frew, who co-authored one of the key studies in the EIRP review. “I didn’t look at an exhaustive set of resources.”

Models today cannot capture the effects of technologies and techniques that have not yet been developed. But this stuff is the subject of intense research, experimentation, and innovation right now.

It is viewed as irresponsible to include speculative new developments in models, but at the same time, it’s a safe bet that the energy world will see dramatic changes in the next few decades. Far more balancing options will be available to future modelers.

In a similar vein, as energy modeler Christopher Clack (formerly of NOAA) told me, it can take two or three years to do a rigorous bit of modeling. And that begins with cost estimates taken from peer-reviewed literature, which themselves took years to publish.

The result is that models almost inevitably use outdated cost estimates, and when costs are changing rapidly, as they are today, that matters.
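Experience curves show how fast those vintage estimates go stale. Here's a minimal Wright's-law sketch; the 20 percent learning rate is a commonly cited ballpark for solar PV, used here purely as an assumption:

```python
import math

# Wright's law: each doubling of cumulative deployment cuts unit cost by
# a fixed "learning rate." The 20% rate is a commonly cited ballpark for
# solar PV, used here purely as an assumption.
learning_rate = 0.20
b = -math.log2(1 - learning_rate)  # experience-curve exponent

def cost(initial_cost, initial_capacity, capacity):
    return initial_cost * (capacity / initial_capacity) ** -b

# If a model locked in costs when cumulative capacity was 100 GW, and
# deployment has since doubled twice to 400 GW:
c0 = cost(1.0, 100, 100)
c1 = cost(1.0, 100, 400)
print(f"cost relative to the model's vintage estimate: {c1 / c0:.0%}")  # ~64%
# A model built on years-old peer-reviewed costs can easily be a third
# too pessimistic about a fast-moving technology.
```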

Speaking of which…

4) Models have always underestimated distributed energy technology.

As I described in detail in this post, energy models have consistently and woefully underestimated the falling costs and rapid growth of renewable energy.

The professional energy community used to be quite convinced that wind and solar could play no serious role in the power system because of their variability. Then, for a long time, conventional wisdom was that they could provide no more than 20 percent of power before the grid started falling apart.

That number has kept creeping up. Now CW has it around 60 percent. Which direction do you suppose it will go in the next few decades?

It’s a similar story with batteries and EVs. They keep outpacing forecasts, getting cheaper and better, finding new applications. Is there any reason to think that won’t continue?

Which brings us to…

5) Pretending we can predict the far future is silly.

Predicting the near future is difficult. Predicting the distant future is impossible. Nothing about fancy modeling makes it any less impossible.

Modelers will be the first to tell you this. (Much more in this old post from 2014.) They are not in the business of prediction; they aren’t psychics. All they do is construct elaborate if-then statements. If natural gas prices do this, solar and wind prices do that, demand does this, storage does that, and everything else more or less stays the same … then this will happen. They are a way of examining the consequences of a set of assumptions.
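In code terms, a model run is just a function of its assumptions. Here's a deliberately silly sketch of that if-then structure, with invented numbers and an invented scoring rule:

```python
from itertools import product

# A model is an elaborate if-then statement: fix the assumptions, get an
# outcome. Sweep the assumptions and the "answer" swings widely.
# The scoring rule and all numbers below are invented for illustration.

def least_cost_share(gas_price, solar_capex, storage_cost):
    # Toy scoring: expensive gas and cheap solar/storage favor VRE.
    vre_score = (gas_price / solar_capex) / storage_cost
    return vre_score / (1 + vre_score)  # "optimal" VRE share of the grid

for gas, solar, storage in product([2, 6], [0.5, 1.0], [0.5, 1.5]):
    share = least_cost_share(gas, solar, storage)
    print(f"gas=${gas}, solar={solar}, storage={storage} -> VRE share {share:.0%}")
# Same model, different assumptions, very different futures. None of
# these runs is a prediction; each is the consequence of its inputs.
```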

Are the assumptions correct? Will all those variables actually unfold that way in the next 20, 30, 40 years? Ask any responsible modeler and they will tell you: “Eff if I know.”

Long-term energy modeling was more tractable when the energy world was mostly composed of very large technologies and projects, with a small set of accredited builders and slow innovation cycles. But as energy and its associated technologies and business models have gotten more and more distributed, innovation has become all the more difficult to even track, much less predict.

Because distributed energy technologies are smaller than big power plants, they iterate faster. They are more prone to complex interactions and emergent effects. Development is distributed as well, across hundreds of companies and research labs.

Energy is going to bend, twist, and accelerate in unpredictable ways even in the next few years, much less the next few decades. We really have no friggin’ idea what’s going to happen.

The lessons to take from all this

Okay, we’ve looked at some of the literature on 100 percent renewables, which is generally pretty skeptical. And we’ve covered some reasons to take the results of current modeling with a grain of salt. What should we take away from all this? Here are a few tentative conclusions.

1) Take variability seriously.

One reason everyone’s so giddy about renewable energy is that it’s been pretty easy to integrate it into grids so far — much easier than naysayers predicted.

But one thing models and modelers agree on is that variability is a serious challenge, especially at high VRE penetrations. As VRE increases, it will begin to run into technical and economic problems. (Read here and here for more.) California is already grappling with some of these issues.

Getting deep decarbonization right means thinking, planning, and innovating toward a rich ecosystem of dispatchable resources that can balance VRE at high penetrations. That needs to become as much a priority as VRE deployment itself.

2) Full steam ahead on renewable energy.

We have a solid understanding of how to push VRE up to around 60 percent of grid power. Right now, wind and solar combined generate just over 5 percent of US electricity. (Nuclear generates 20 percent.)

The fight to get 5 percent up to 60 is going to be epic. Political and social barriers will do more to slow that growth than any technical limitation, especially in the short- to mid-term.

This is likely why the energy experts interviewed by REN21, though they believe 100 percent renewables is “reasonable and realistic,” don’t actually expect it to happen by mid-century.

[Chart: Expert expectations for achieving 100% renewable energy by mid-century. (REN21)]

It will be an immense struggle just to deploy the amount of VRE we already know is possible. If we put our shoulder to that wheel for 10 years or so, then we can come up for air, reassess, and recalibrate. The landscape of costs and choices will look very different then. We’ll have a better sense of what’s possible and what’s lacking.

Until then, none of these potential future limitations are any reason to let up on the push for VRE. (Though there should also be a push for storage and other carbon-free balancing options.)

3) Beware natural gas lock-in.

The easy, default path for the next several years will be to continue to lean on natural gas to drive down emissions and balance VRE. And sure enough, there’s a ton of natural gas “in the queue.”

[Chart: Natural gas capacity “in the queue.” (Christopher Clack, using EIA data)]

But leaning too hard on natural gas will leave us with a ton of fossil fuel capacity that we end up having to shut down (or leave mostly idle) before the end of its useful life. That will be an economically unfortunate and politically difficult situation.

We need to start thinking about alternatives to natural gas, today.

4) Keep nuclear power plants open as long as possible.

Clack told me something intriguing. He said that there is enough nuclear capacity in the US today to serve as the necessary dispatchable generation in an 80 percent decarbonized grid. We wouldn’t need any big new nuclear or CCS power plants.

It would just mean a) changing market and regulatory rules to make nuclear more flexible (it largely has the technical capacity), and b) keeping the plants open forever.

Obviously those plants are not going to stay open forever, and the ones that are genuinely unsafe should be shut down. And Clack’s models are only models too, not gospel.

But what’s clear is that, from a decarbonization perspective, allowing a nuclear power plant to close (before, say, literally any coal plant) is a self-inflicted wound. It makes the challenges described above all that much more difficult. Every MW of dispatchable, carbon-free power capacity that is operating safely should be zealously guarded.

5) Do relentless RD&D on carbon-free dispatchable resources, including nuclear.

We know we will need a lot of dispatchable carbon-free resources to balance out a large share of VRE.

Storage and demand management can play that role, and in any scenario, we will need lots of both, so they should be researched, developed, and deployed as quickly as possible.

But large-scale, carbon-free dispatchable generation will help as well. That can be hydro, wave, tidal, geothermal, gas from waste, renewable gas, or biomass. It can also be nuclear or CCS.

I personally think fossil fuel with CCS will never pass any reasonable cost-benefit analysis. It’s an environmental nightmare in every way other than carbon emissions, to say nothing of its wretched economics and dodgy politics.

But we’re going to need CCS regardless, so we might as well figure it out.

New nuclear plants have proven uneconomic just about everywhere they’ve been attempted lately (except, oddly, South Korea), and there is no obvious reason to favor them in their market battle with renewables.

But it is certainly worth researching new nuclear generation technologies — the various smaller, more efficient, more meltdown-proof technologies that seem perpetually on the horizon. If they can make good on their promise, with reasonable economics, it would be a blessing. (See Brad Plumer’s piece on radical nuclear innovation.)

Basically, research everything. Test, experiment, deploy, refine.

6) Stay woke.

Above all, the haziness of the long-term view argues for humility on all sides. There’s much we do not yet know and cannot possibly anticipate, so it’s probably best for everyone to keep an open mind, support a range of bet-hedging experiments and initiatives, and maintain a healthy allergy to dogma.

We’ve barely begun this journey. We don’t know what the final few steps will look like, but we know what direction to travel, so we might as well keep moving.