Renewables Aren’t Making Much Headway

sunshine hours

Figure: World consumption of primary energy, 2017

And don’t forget: Wood and wood products accounted for almost half (45%) of the EU’s gross inland energy consumption of renewables in 2016.


Germany’s Green Transition has Hit a Brick Wall

Sierra Foothill Commentary

Guest Blogger at Watts Up With That

Even worse, its growing problems with wind and solar spell trouble all over the globe.

Editor: As the Progressive Democrats force California to depend on Green Power, their mistaken environmental dreams will hit the same wall described in this post.

Oddvar Lundseng, Hans Johnsen and Stein Bergsmark

More people are finally beginning to realize that supplying the world with sufficient, stable energy solely from sun and wind power will be impossible.

Germany took on that challenge, to show the world how to build a society based entirely on “green, renewable” energy. It has now hit a brick wall. Despite huge investments in wind, solar and biofuel energy production capacity, Germany has not reduced CO2 emissions over the last ten years. However, during the same period, its electricity prices have risen dramatically, significantly impacting factories, employment and poor families.

Germany has installed…


Powering the Tesla Gigafactory

Reblogged from Euan Mearns’ Energy Matters:

Tesla has repeatedly claimed in publications, articles and tweets from Elon Musk that its Reno, Nevada Gigafactory will be powered 100% by renewables.  Specifics on exactly how Tesla plans to do this are sparse, but the data that are available suggest that Tesla’s 70MW rooftop solar array won’t come close to supplying the Gigafactory’s needs and that the other options that Tesla is now or has been considering (more solar, possibly wind, battery storage) will not bridge the gap. As a result the Gigafactory will probably end up obtaining most of its electricity from the Nevada grid, 75% of which is presently generated by fossil fuels.

Lest there be any doubt about Tesla’s claim that the Gigafactory will be powered with 100% renewables, here are some tweets from Mr Musk:

July 27, 2016: Should mention that Gigafactory will be fully powered by clean energy when complete

June 8, 2018: Gigafactory should be on 100% renewable energy (primarily solar with some wind) by next year. Rollout of solar has already begun

August 25, 2018: Tesla’s Gigafactory will be 100% renewable powered (by Tesla Solar) by end of next year

Plus this excerpt from the January 2016 Gigafactory tour handout:

(The Gigafactory) is an all-electric factory with no fossil fuels (natural gas or petroleum) directly consumed. We will be using 100% sustainable energy through a combination of a 70 MW solar rooftop array and solar ground installations.

Plus this one from Tesla’s “press kit”:

The Gigafactory is designed to be a net-zero energy factory upon completion. It will not consume any fossil fuels – there is no natural gas piped to the site nor are there permanent diesel generators being used to provide power … The entire roof of the Gigafactory will be covered in solar array, and installation is already underway. Power not consumed during the day will be stored via Tesla Powerpacks for use when needed.

According to Mr. Musk’s latest (August 25th) tweet the Gigafactory will be 100% renewable-powered by the end of next year, and it’s likely that the only renewable energy Tesla will be generating by the end of next year will come from its 70 MW rooftop solar array, which is currently a work in progress (Figure 1). So we will look at the rooftop array first.

Figure 1: Gigafactory as of November 30, 2018, with solar panels covering approximately 10% of the roof

To obtain an estimate of what the output from the 70 MW rooftop array might be I went again to Sunny Portal and downloaded the data from four operating solar arrays in Reno. The results are summarized in Figure 2. Capacity factors range from ~8% in January to ~27% in July with an annual mean of 18.6%. The summer/winter range is larger than might be expected for the latitude (39N) because Reno is more cloudy in the winter than in the summer:

Figure 2: Capacity factors of four solar PV arrays in Reno. The black line is the mean capacity factor when all four stations were operating

With an annual capacity factor of 18.6% Tesla’s 70MW rooftop array would generate 113 GWh/year at an average power output of 13MW. But what is Tesla’s capacity factor likely to be? This is an interesting question. As shown in Figure 3 Tesla’s panels are angled in opposing directions, with one line facing east and the next west. This will tend to flatten out daily generation but will also result in less total generation than would be achieved if all the panels were pointed in the optimum direction (about 30 degrees from the horizontal facing south at this latitude):

Figure 3: View of Tesla’s solar panels facing (I believe) east

On the other hand, mechanical devices attached to the panels suggest that they may be single-axis trackers (Figure 4). If so this will significantly increase the capacity factor:

Figure 4: Blowup of part of Figure 3

The panels will also presumably be the last word in efficiency and will, one hopes, be properly maintained.

Making allowance for these factors, and feeling generous, I have assumed a capacity factor of 25%. At this level Tesla’s 70MW of panels will generate 153 GWh/year at an average power output of 17.5 MW.
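The capacity-factor arithmetic above is easy to check. A quick sketch, using the 25% capacity factor assumed in the text:

```python
# Annual energy and average power from nameplate capacity and capacity factor.
HOURS_PER_YEAR = 8760

def annual_output(nameplate_mw: float, capacity_factor: float):
    """Return (average MW, GWh/year) for a plant of the given size."""
    avg_mw = nameplate_mw * capacity_factor
    gwh_per_year = avg_mw * HOURS_PER_YEAR / 1000
    return avg_mw, gwh_per_year

avg_mw, gwh = annual_output(70, 0.25)   # Tesla's 70 MW rooftop array
print(f"{avg_mw:.1f} MW average, {gwh:.0f} GWh/year")  # ~17.5 MW, ~153 GWh
```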

The question now becomes, how much energy will the Gigafactory consume? I have found two estimates:

A battery factory of that size is estimated to consume 100 megawatts (MW) of power at peak capacity or 2,400MWh per day, according to Navigant Research.

If the Gigafactory produces 105 GWh of cells and 150 GWh of packs per year*, then the factory would consume between 38 and 109 MW, or between 333 and 958 GWh per year.

* This is Tesla’s planned total capacity as of August 2018.

The 109 MW high and 38 MW low estimates reflect uncertainties regarding how energy-efficient Tesla’s manufacturing processes will be, but for the purposes of this review I am assuming that the final number will be somewhere between these two widely-separated estimates.

Another consideration is that Tesla has already recognized that its 70MW rooftop array will not generate enough electricity to power the Gigafactory. As noted above, the average annual power delivery from the rooftop array will be 17.5 MW, less than half the 38MW minimum requirement. Tesla’s Chief Technical Officer, JB Straubel, acknowledged in a recent speech at the University of Nevada that the rooftop array was too small:

… the most visible thing we are doing is covering the entire site with solar power. The whole roof of the Gigafactory was designed from the beginning with solar in mind …. But that’s not enough solar, though. So we have also gone to the surrounding hillsides that we can’t use for other functions and we’re adding solar to those.

How much solar will Tesla have to add? To support 38 MW of power at a 25% capacity factor it will need approximately 150 MW of solar PV capacity (i.e. add 80 MW) and to support 109 MW approximately 440 MW (i.e. add 370 MW).
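The required nameplate capacity scales as demand divided by capacity factor; a sketch under the same 25% assumption:

```python
# Nameplate solar capacity needed to supply a constant average demand,
# given a capacity factor (0.25 assumed, per the text).

def nameplate_needed(avg_demand_mw: float, capacity_factor: float) -> float:
    return avg_demand_mw / capacity_factor

EXISTING_MW = 70  # Tesla's rooftop array

for demand in (38, 109):  # low and high Gigafactory demand estimates, MW
    needed = nameplate_needed(demand, 0.25)
    extra = needed - EXISTING_MW
    print(f"{demand} MW demand -> {needed:.0f} MW of solar ({extra:.0f} MW to add)")
```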

As always, however, the problem is that solar will not supply continuous power to the plant. Daytime solar generation will have to be stored for re-use at night, but this doesn’t require that much storage. What does is storing summer surplus generation for re-use in the winter. Figure 5 shows two plots that compare monthly solar generation with Gigafactory demand for the 38 MW and 109 MW cases. Solar output is a smoothed curve based on Figure 2 data and is adjusted so that total annual solar generation matches annual Gigafactory demand, which is assumed to be constant:

Figure 5: Solar surpluses and deficits, 38 MW and 109 MW cases

How much storage will be required to smooth out these surpluses and deficits so as to fill Gigafactory requirements year-round? I’m not going to attempt to estimate costs because I don’t know how much Tesla would charge itself for its own batteries, but expressed in terms of the time necessary to manufacture them this is what we get:

* To achieve constant delivery of 38 MW of power Tesla would need approximately 50 GWh of storage, or 4 months of Gigafactory storage battery production. (It’s not clear how much of Tesla’s production will be storage batteries, but I have assumed 150 GWh/year).

* To achieve constant delivery of 109 MW of power Tesla would need approximately 140 GWh of storage, or over eleven months of Gigafactory storage battery production.
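The storage figure can be roughed out from the monthly energy balance. A sketch in Python for the 38 MW case; the monthly capacity factors below are my approximate read-off from Figure 2, not the author's exact inputs, so the result is an order-of-magnitude estimate:

```python
# Rough seasonal-storage estimate for the 38 MW case. The monthly capacity
# factors are approximate values read off Figure 2 (assumed, not measured);
# solar output is scaled so annual generation exactly equals annual demand.

monthly_cf = [0.08, 0.12, 0.17, 0.21, 0.24, 0.26,   # Jan-Jun
              0.27, 0.26, 0.23, 0.17, 0.11, 0.08]   # Jul-Dec

demand_mw = 38
hours_per_month = 8760 / 12
demand_gwh = demand_mw * hours_per_month / 1000   # ~27.7 GWh per month
annual_demand = demand_gwh * 12

# Scale solar so it meets annual demand, then track the running
# surplus/deficit; required storage is the swing of that running balance.
scale = annual_demand / sum(monthly_cf)
balance, running = 0.0, []
for cf in monthly_cf:
    balance += cf * scale - demand_gwh
    running.append(balance)

storage_gwh = max(running) - min(running)
print(f"seasonal storage needed: {storage_gwh:.0f} GWh")  # ~56 GWh with these inputs
```

With these assumed inputs the estimate lands in the same range as the ~50 GWh quoted above; the exact number depends on how the monthly curve is smoothed.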

I somehow don’t see Tesla sacrificing months of battery pack sales to back up its solar arrays.

The remaining question is how much wind and geothermal might contribute. Nevada is not very windy, and as a result the state has only 150 MW of installed wind capacity (compared to 20,000 MW in the UK, which has about the same land area). Tesla publishes artistic renditions showing a wind farm off to the side of the Gigafactory (Figure 6) but has not announced any plans to build it. There’s also no guarantee that adding intermittent wind to solar would lower storage requirements. It might even increase them.

Figure 6: Tesla rendition of completed Gigafactory, wind turbines to the right

Which leaves geothermal. Nevada is comparatively rich in geothermal resources, although geothermal makes up only a small part of the state’s installed capacity (750MW according to the EIA). Yet steady geothermal baseload power is just what the Gigafactory needs, and it’s curious that Tesla is going for solar instead. What are the reasons for this?

One is that Tesla has apparently chosen not to use the approach used by other companies that claim to have gone 100% renewable, which is to purchase enough power from a distant renewable plant to cover their annual electricity consumption through Guarantees Of Origin or Renewable Energy Certificates and pretend that this makes them 100% renewable even though they continue to depend on fossil fuel power from the local grid. Apple and Google, discussed in this 2017 post, are examples. Instead, Tesla evidently plans to supply all of the Gigafactory’s electricity needs from its own dedicated plants in the near vicinity. And since there have been no reports of a geothermal resource at or near the Gigafactory I have to assume that there isn’t one, meaning that geothermal is out.

But why should Tesla adopt this approach? I venture to suggest that it’s because Elon Musk sincerely believes that his batteries can handle any problems posed by intermittent generation from the Gigafactory’s solar arrays. In fact, he seems to believe that his batteries can transition the world to renewables all by themselves. As he told Leonardo DiCaprio in the 2016 National Geographic documentary “Before the Flood”:

“We actually did the calculations to figure out what it would take to transition the whole world to sustainable energy… and you’d need 100 gigafactories”.

It would probably be a good idea for Mr. Musk to confirm that he can keep the lights on at Gigafactory #1 before he starts work on Gigafactory #2.

‘Sun in a box’ would store renewable energy for the grid

Tallbloke's Talkshop

This may or may not have its uses, but any idea that the whole world could get electricity mainly from the sun and the wind is not credible, with today’s technology at least.

MIT engineers have come up with a conceptual design for a system to store renewable energy, such as solar and wind power, and deliver that energy back into an electric grid on demand, says TechExplore.

The system may be designed to power a small city not just when the sun is up or the wind is high, but around the clock.

The new design stores heat generated by excess electricity from solar or wind power in large tanks of white-hot molten silicon, and then converts the light from the glowing metal back into electricity when it’s needed.

The researchers estimate that such a system would be vastly more affordable than lithium-ion batteries, which have been proposed…


Unplug Electric Vehicle Subsidies And Let Consumers Decide

PA Pundits - International

By Nicolas Loris ~

Frustrated with General Motors Co.’s recent announcement of plant closures and layoffs, President Donald Trump said the administration is now looking at cutting subsidies to the automaker, including those for electric vehicles.

Good. Families should be empowered to purchase the car they want without nudging from Washington and the financial help of their fellow taxpayers.


Electric vehicle handouts subsidize the wealthiest Americans and, despite their being advertised as a more “climate-friendly” option, they produce next to no climate benefit for the planet.

Trump does not quite have the power to cut GM’s current electric vehicle subsidies full stop. But he could play an important role in the future of the targeted tax subsidy.

Both federal and state governments…


Vehicle Electrification Common Sense

From Watts Up With That:

Guest Blogger /

By Rud Istvan,

This is the first of two loosely related technology posts that ctm suggested might be interesting to WUWT. In full disclosure, the details stem from my financial interests in energy storage materials and related topics, having spent much time and money since 2007 on fundamental, now globally issued, energy storage materials patents for supercapacitance (the Helmholtz double layer physics that creates lightning in thunderstorms). Some of the information cited below is slightly dated because I was too lazy to make everything current. Some of it was borrowed from my ebook The Arts of Truth and from a 2017 Climate Etc. post. All conclusions nevertheless remain valid.

This post’s message (the abstract, if this were a normal clisci peer-reviewed paper) is simple. Hybrid vehicles make economic and ‘climate’ sense. Plug ins may or may not, depending on their architecture. Full electric vehicles (EVs) make neither economic nor climate sense.


There are various levels of vehicle electrification, so some definitions are needed. Hybrids all involve some degree of electrification of an otherwise fossil fueled vehicle. There are three generally accepted levels:

1. Simple engine off at idle, aka start/stop. This is not as technically easy as it sounds, since hydraulic fluid coupled automatic transmissions must be fully redesigned and starter batteries beefed up. Depending on drive circumstances, idle off can improve fuel efficiency by about 5%.

2. Regenerative braking, where the vehicle’s kinetic energy is recaptured to electrical storage and then reused in some fashion rather than dissipated as heat. Depending on vehicle size/weight and drive circumstances, regen braking can improve fuel efficiency by about 7-9%. Combined with idle off it is commonly known as mild hybridization, and typically cited mild hybrid values are something less than 15% net fuel efficiency gain. (There aren’t a lot of milds out there to provide real data.)

3. Full hybridization, which includes idle off, regen braking, and electric acceleration assist (plus some degree of electric only slow speed short distance motoring). Full hybrid fuel efficiency gains can be as high as 35-45%. Prius is the best known. Full details follow.
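One reason the cited mild-hybrid gain is "something less than 15%" rather than a simple sum is that the individual savings compound on the remaining fuel use rather than add. A rough sketch, using illustrative midpoints of the ranges quoted above:

```python
# Rough sketch: hybrid fuel-efficiency gains compound rather than add.
# The individual percentages are illustrative midpoints of the ranges
# quoted in the text, not measured values.

idle_off_saving = 0.05   # ~5% from engine-off at idle
regen_saving = 0.08      # midpoint of the ~7-9% regenerative braking range

# Each measure multiplies remaining fuel use by (1 - saving), so the
# combined mild-hybrid saving is:
fuel_fraction = (1 - idle_off_saving) * (1 - regen_saving)
mild_hybrid_saving = 1 - fuel_fraction

print(f"combined mild hybrid saving: {mild_hybrid_saving:.1%}")  # ~12.6%
```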

Then there are Plug in Hybrids (misleadingly aka PHEV), which can motor for some significant distance under battery alone. These come in two basic architectures. One is an ordinary full hybrid with a different or bigger battery, like the Prius Prime. The other is actually a range extended electric vehicle (not a true hybrid), like the Chevy Volt. The idea is to remove EV range anxiety, since a gasoline engine kicks in only when the battery is nearly exhausted. Details follow.

Then there are true electric vehicles like the Chevy Bolt or Tesla models. These operate on battery electric power alone, must be recharged from the grid, and commonly present ‘range anxiety’ for some subset of ordinary car use.

This post develops common sense conclusions for the following practical economic and environmental categories/cases:

-Start/Stop may make sense for both cases, but Milds do not;

-Full Hybrids almost always make sense for both cases;

-Plug Ins do or don’t make sense depending on the architecture;

-EVs never make sense for either case.


Simple start/stop makes economic and environmental sense by itself when the automatic transmission technology is changed from hydraulic fluid coupling to electronic dual clutch mechanical transmissions (DCT). Ford has announced that by 2019 all Ford transmissions (including pickups) will be DCT (which can simulate manual). Even without start/stop, the DCT alone gains 5-8% fuel efficiency by eliminating hydraulic fluid coupling losses. With a beefed up starter battery enabling start/stop, the full fuel efficiency savings are 10-13% while the incremental cost is minimal, maybe $100 for a beefier starter battery.

Mild hybridization has been tried several times, but it has almost never worked economically. There are two problems: a battery capable of accepting regen charging energy must be fairly large to achieve acceptable vehicle life, and extra machinery is needed to use that electrical energy for whatever purpose. The only present commercial mild system is Valeo’s (a belt-driven, bigger combined starter/alternator for both regen and traction boost, plus a supercap-plus-PbA ‘hybrid’ storage system). Valeo’s system is only on a few of Peugeot’s Citroen diesels in Europe.

Full hybridization like the Toyota Prius or my 2007 Ford Hybrid Escape [i] works in several synergistic ways to improve fuel efficiency, and makes more economic sense in larger vehicles. (Note, in 2007, both hybrid technologies were identical, just scaled to different vehicles. Ford traded its European small diesel technology to Toyota in return for the Toyota Prius hybrid technology, no cash exchanged nor royalties owed.)

Full hybrid idle-off saves ~5% depending on traffic. Regenerative braking saves another ~7-9% depending on traffic. The additional power and torque of the electric motor enables two further major savings. First, the internal combustion engine (ICE) can be downsized, saving both weight and fuel. My AWD Escape hybrid uses a small 1.5L I4 engine yet is functionally comparable to the heavier AWD Escape V6. Second, the ICE can be converted from the Otto cycle to the Atkinson cycle. An Atkinson ICE improves fuel economy by about 20%, but at the expense of significant torque loss. (Typical Otto ICE vehicles are ~26-30% thermally efficient, the lower number from regular-gas compression ratios, the higher from premium-gas compression ratios. Higher octane rating enables higher compression ratios and more efficiency.) The newest Prius I4, the 5th generation 2018 Atkinson ICE, gets an incredible 37% thermal efficiency on regular! Atkinson ICE torque loss doesn’t matter in a full hybrid; the electric machine provides more than the lost torque. The 2018 Prius family gets a combined 52 MPG. It couples a 95 HP 1.8L Atkinson I4 with a 71 HP electric motor for a net system output of 121 HP in a mid-size sedan.

There are two 2018 Prius battery choices. All models except the Prime use NiMH, the same as my Escape and as the Prius from its 2000 launch. The Prius Prime is their Plug In. It is no different from the other 2018 models in any respect EXCEPT a lithium ion battery (LIB), onboard charging, and a different battery control software scheme. To get >10 year, >100,000 mile life, NiMH needs to be floated between about 45% and 55% state of charge (SoC). It is only possible to motor a couple of miles at speeds under 20MPH before the engine kicks in so the alternator can recharge the NiMH traction battery. LIB allows the Plug In Prius Prime to motor 25 miles at any speed before the ICE kicks in. Prime 240V recharge time is just 2 hours. Warranty is 10 years or 100,000 miles, same as the NiMH non-plug in versions. Toyota’s only real incremental Prime costs are the incremental LIB over NiMH and the associated onboard AC/DC charging electronics. Yet Toyota charges a $3,100 Prime premium (starting Prime 2018 MSRP $27,300). Makes sense for Toyota, and for enviro customers who want plug in cachet. Whether it makes climate sense is a question explored below using the Volt as the example.

Prius comfortably seats 5 along with 24.6 cubic feet (cf) of cargo space (or 65cf with the rear seat folded down). Range is 633 miles from ~52 mpg. 2018 price is ≥$24,200 depending on model and trim. Toyota unsurprisingly sold ~1,170,000 Prius from 2010 (year of Volt introduction) through yearend 2015.

Now compare the alternate architecture, a range extended EV like the Chevy Volt. The 2016 Volt is powered by two electric motors providing only 149 HP, fed from an 18.4 kWh LIB providing a marketed ~50 mile EV-only range, twice that of the 2018 Prius Prime. The original all-electric range was chosen because about 2/3 of US urban trips are under 40 miles. With a 240V charger, Volt recharging takes 4.5 hours (with 120V charging, it takes 13 hours). The battery is warrantied for only 8 years or 100,000 miles. The LIB battery weighs 405 lb (184 kg) and is a 5.5-foot-long T-shaped monster. The range extending gasoline engine is a 1.5 liter 101HP I4 driving an onboard 54 kW generator. With a full tank of gas and a fully charged battery, Volt range is ~408 miles. Seating is essentially only 4, and cargo capacity is only 10.6cf. For those middling vehicle values compared to Prius Prime the MSRP is ≥$33,170. Unsurprisingly, Chevy has sold only about 117,000 Volts from its 2010 launch through YE 2015 (the same time frame as the Prius sales above, so a fair comparison). The comparable sales data say the Volt does not make much economic sense.

Do plug ins make environmental sense? Let’s take the Volt, because it is more reliant on the generation grid.

EPA fuel economy ratings are required by law to be prominently placed on all new vehicles for sale in the US. This familiar sticker provides three numbers: city, highway, and combined (55/45) mpg.

Ambiguity arises from the changed plug in meaning of ‘miles per gallon’. Plug in range extended EVs like the Chevy Volt operate partly on a battery recharged from the grid, so no gallons for those miles. Volt gets a combined 37mpg in extended range mode using its gasoline engine to generate electricity. If a Volt never traveled more than about 40 miles before being recharged from the grid, its engine would never start and it would never use any gallons of gasoline. Its combined miles per gallon would be very ambiguous since division by zero is mathematically undefined.

To solve this very fundamental problem the EPA did two things. First, they calculated an energy-equivalent 93 MPGe for electric ‘no gallons’ mode. We shall see that this equivalence is based on faulty assumptions. Then they explicitly assumed the Volt travels about 45% of its miles on battery alone, giving a weighted average of 60 MPGe. Except in environmental reality the Volt cannot possibly get that ‘official’ EPA mileage.


One gallon of automotive gasoline contains about 132 megajoules of heat energy. Volt’s combined ‘extended range’ (using its engine/generator) 37 MPG rating is about (132/37) 3.6 megajoules/mile. One KWh is also 3.6 megajoules; the gasoline rating is equivalent to 1 KWh/mile. This of course includes the engine/generator’s thermal losses, which are proven by the Volt’s exhaust and radiator.

The EPA sticker also says the Volt gets 36 KWh per 100 miles when the battery is powering the Volt’s electric motors! That is only 0.36 KWh/mile, 2.8 times the efficiency from the same electric motors! This discrepancy proves that the EPA MPGe rating does not include the fact that grid electricity generation is on average about 45% efficient (the mix is now about half and half coal at 34% and CCGT at 61%), with up to 10% of that lost in transmission and another 10% or so in distribution. Power plants have smokestacks and cooling towers just like Volts have exhausts and radiators. Correcting for the laws of thermodynamics (which were only applied to Volt’s extended range mode), the Volt operates in battery mode at about (0.36/[0.45*0.8]) 1 KWh/mile in comparable net energy/emissions equivalents. Of course moving the car takes the same energy in either gas or battery mode; Volt’s electric motors don’t care about their source of electricity.

EPA’s battery MPGe should be reduced to account for the thermal losses in generating and distributing grid electricity, since these were included in the 37mpg gasoline rating. The true energy equivalent in battery mode is about (93*0.45*0.8) 33.5 MPGe. No surprise that this is even lower than the 37 MPG using gasoline. Charging and discharging the Volt battery is inefficient, causing additional energy losses; the Volt battery is liquid cooled and has its own radiator partition. We can even estimate that EPA’s measured Volt battery energy efficiency is about (33.5/37) 90%. Using the EPA’s assumption about all electric driving, the final overall rating should be about (33.5*0.45+37*0.55) 35 MPGe. The 60 MPGe EPA rating is just nonsense, and clearly the better environmental choice, by a factor of (52/35) almost 1.5x, is a less expensive Prius of some sort.
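The energy-equivalence chain above can be reproduced step by step. All inputs come from the text (132 MJ/gallon, 37 mpg extended-range, the 36 kWh/100 mile sticker, 45% grid generation efficiency, 80% delivered after transmission and distribution losses):

```python
# Reproduce the MPGe correction from the text.
MJ_PER_GALLON = 132
MJ_PER_KWH = 3.6

mpg_gas = 37
gas_kwh_per_mile = MJ_PER_GALLON / mpg_gas / MJ_PER_KWH   # ~1.0 kWh/mile

battery_kwh_per_mile = 36 / 100   # EPA sticker: 0.36 kWh/mile at the plug
grid_factor = 0.45 * 0.8          # generation efficiency * T&D delivery

epa_mpge = 93
corrected_mpge = epa_mpge * grid_factor                   # ~33.5 MPGe
weighted_mpge = 0.45 * corrected_mpge + 0.55 * mpg_gas    # ~35 MPGe overall

print(f"gasoline mode: {gas_kwh_per_mile:.2f} kWh/mile")
print(f"corrected battery MPGe: {corrected_mpge:.1f}")
print(f"weighted overall: {weighted_mpge:.1f} MPGe")
```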

A final observation. It follows without further analysis that the EV Chevy Bolt makes no sense either economically or environmentally. And by extension, neither do any other EVs. Economically the Bolt is horrible (and higher priced Teslas are worse). Range is only 238 miles. An hour of 240V recharging provides only 25 miles of range; to get 238 miles requires about 9-10 hours of charging. The Bolt essentially seats four, with only 16.9cf of cargo space. Yet the MSRP is ≥$37,500. On a correctly compared environmental ‘global warming’ basis, the Bolt has to be even worse than the Volt.

[i] Personal economic data from comparable vehicle functionality. My AWD 2007 Escape Hybrid (small true frame based SUV [not a crossover]) with a class 1 tow hitch is most comparable to the 2007 AWD Escape with a 3L V6 engine and class 2 tow hitch. The V6 was 240 HP; my hybrid has a combined 247 HP: 153 from the 1.5L I4 Atkinson ICE plus 94 from the electric motor. The 2007 MSRP hybrid premium over the V6 was ~$3400. BUT that year’s federal tax credit for this hybrid was $3500, so we were $100 better off on day one. Better, the AWD V6 EPA combined mileage was 23mpg, while my equivalent Hybrid is EPA combined 30mpg. That is 30% better mileage, saving gas over what is now 11 years and 85k miles. Best, the V6 used premium, while my hybrid uses regular. The price difference in our area is over $1/gallon. So not only less gas, also cheaper gas. The fuel savings work out to about $6700 so far. The NiMH traction battery is still going strong and the vehicle has been basically problem free.
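The footnote's fuel-savings figure is easy to sanity-check. The mileage numbers are from the footnote; the pump prices below are illustrative assumptions (the author reports only a premium/regular gap of over $1/gallon), so the exact total will vary with local prices:

```python
# Sanity check of the footnote's fuel savings. Mileage figures are from
# the text; the pump prices are assumed for illustration only.

miles = 85_000
v6_mpg, hybrid_mpg = 23, 30
premium_price, regular_price = 3.60, 2.60   # assumed $/gallon, >$1 gap

v6_cost = miles / v6_mpg * premium_price
hybrid_cost = miles / hybrid_mpg * regular_price
savings = v6_cost - hybrid_cost

print(f"fuel savings over {miles:,} miles: ${savings:,.0f}")  # ~$5,900 at these prices
```

At somewhat higher assumed pump prices the figure approaches the ~$6,700 the author reports.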

Electric vehicles send real-time data to Chinese government

Tallbloke's Talkshop

Chinese electric car
As one researcher said of the Chinese government: “Tracking vehicles is one of the main focuses of their mass surveillance.” People anywhere can already be tracked via mobile phones, but this takes it a bit further.

When Shan Junhua bought his white Tesla Model X, he knew it was a fast, beautiful car.

What he didn’t know is that Tesla constantly sends information about the precise location of his car to the Chinese government, reports TechXplore.

Tesla is not alone. China has called upon all electric vehicle manufacturers in China to make the same kind of reports—potentially adding to the rich kit of surveillance tools available to the Chinese government as President Xi Jinping steps up the use of technology to track Chinese citizens.


Vehicle Electrification, EV Batteries—A New Hope (Followup)

By Rud Istvan,

EV Batteries—and A New Hope

This is the second of two loosely related guest posts that ctm and I recently discussed, drawing on my subject matter expertise (SME) in energy storage materials and related matters. My SME status was hard won over several intense research years in support of my now globally issued energy storage materials patents for supercapacitors. This post is written for laymen (in the spirit of expert Andy May and his recent superb petroleum shale geophysics posts). It omits non-essential technical details (of which there are many) and focuses on electric vehicles (EVs), because that is most relevant to global warming concerns and WUWT skeptics. It intentionally contains a lot of ‘terminology’ that enables those interested to follow up with independent internet-based research. There are also some related ‘obiter dicta’ making separate points leading to an important ancillary WUWT conclusion. It is written in two parts: Current lithium ion improvements, and A New Hope (an intentional nod to Star Wars Episode IV, because that is what it is).

Current LiIon improvements

All true batteries store electric charge in some form of electrochemical reaction, either primary or secondary (aka reversible/rechargeable). All are descended from Alessandro Volta’s 1800 invention of his primary ‘pile’ using copper, zinc, paper separators, and brine as electrolyte. You can make a Volta pile in your kitchen: take a US penny (the outer surface is now plated copper) and a sanded US penny (the interior is now zinc), and stick both about ¾ of the way into two close-together but not touching (~2mm separation) slits in a lemon (lemon juice is the electrolyte, lemon pulp is the separator). That is good enough to light a small Christmas tree bulb for a while (until the intact penny’s copper plating is consumed) for your child’s middle school science fair project. Just touch the two bulb wires to the protruding penny edges. Either orientation works, since this is a DC pile and little Xmas bulbs only care about amps times volts, heating by resistance. For the historically inclined: Volta’s invention of the battery pile (prompted by Galvani’s twitching frog legs) got the volt, the unit of electrical ‘force’ and one of the three fundamental DC electricity parameters, named after him. The other two are the ohm (resistance, named after Georg Ohm for his 1827 paper) and the amp (current, named after Andre Ampère for his 1823 speculations about charge flow, although electromagnetism was not rigorously established until Maxwell’s famous equations of 1861). But I digress. History judged Volta’s pile discovery important, and our modern life, with or without global warming but definitely with electrics and electronics, confirms that judgment.

In the most familiar reversible (rechargeable) commercial battery type, vehicular lead acid (PbA), invented in 1859, the electrochemical reaction is simply lead to lead sulfate and back, using sulfuric acid in water as electrolyte, providing sulfate ions plus the dissociated hydrogen ions as electrical charge carriers.

Lithium ion batteries (LIB) are the most energy dense rechargeable electrochemistry presently known, essentially because lithium cells operate at two to three times the voltage of aqueous chemistries (PbA, NiMH) and lithium is very light. (Since lithium, like sodium, really doesn’t like water, the LIB electrolyte solvent must be a water-free organic [aprotic] solvent, hence their infamous flammability.) LIBs were initially developed in the 1980s. The conventional rechargeable LIB used in portable electronics and EVs is an electrochemical ‘rocking chair’ subtype, where on charging the lithium ions resident in the metallic cathode intercalate into the carbon (graphite) anode via an aprotic organic electrolyte (usually propylene carbonate [PC] or acetonitrile [AN]) containing a dissolved lithium salt such as LiPF6. This intercalation process electrochemically reverses on discharge, just like PbA.

There are several metallic LIB cathode materials, the two most common being lithium iron phosphate (LiFePO4, 'LFP') and lithium cobalt oxide (LiCoO2, 'LCO'), often with other added metals like nickel and manganese (LiNiMnCoO2, 'NMC'). The cobalt cathode types have the best energy density so are the most common, hence legitimate media concerns about future cobalt supply. The alternative "peak lithium" concern is mostly fake news, as lithium is roughly the 25th most abundant element in Earth's crust. The peak lithium question is cost, not abundance, since the inexpensive present supplies come from lithium-rich brines or spodumene-rich pegmatites. Most cobalt, by contrast, is a minor byproduct of copper ore production, a vastly greater mining proposition already much more depleted in global ore grade.

The individual LIB cells packaged into the battery come in two common form factors:

Cylindrical has the anode/separator/cathode assembly spiral wound and stuffed into a tube (like an AA battery). Tesla uses this form factor, as do Apple's AirPods.


Pouch has the assembly stacked flat like pancakes and sealed with electrolyte in an impermeable 'bag'. The Chevy Volt uses this form factor, as do iPhones and iPads.


These permutations lead to many tradeoffs among cost, energy density (both volumetric and gravimetric), power density, and cycle life, the last two as a function of heat dissipation and solid electrolyte interphase (SEI) buildup on the carbon anode. Virtually all LIB improvement initiatives focus on reducing cost, enhancing energy/power density, or extending cycle life. Despite much press hype (usually as part of some fundraising scheme), none of these labs/startups are anywhere near volume commercialization, and none solve the fundamental energy-density-related range anxiety issues for EVs.

The inescapable LIB range anxiety problem is basic electrochemistry. Although the figures vary somewhat with precise cathode composition, the practical cell-level limit for LIB LCO or NMC is ~280 Wh/kg, and the Tesla cell was already at 254 Wh/kg in 2018! Elon Musk cannot overcome that approaching ceiling with his Tesla Gigafactories. Nor can any LIB startup, no matter how innovative they claim to be.

Tesla says its Supercharger stations are an alternative range anxiety solution (20 minutes to 50% charge, 40 minutes to 80%, versus 5 minutes to gas up). BUT what they don't say is that such rapid charging kills battery life through heat buildup, thanks to the inescapable Nernst electrochemistry equation, which is derivable in two separate ways (fundamental thermodynamics and Boltzmann statistical mechanics), ensuring Nernst is 'real', like the Pythagorean theorem.
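For reference, the Nernst equation in its standard form (not quoted from this post) makes the temperature dependence explicit, which is why rapid-charge heat buildup matters:

```latex
% E° = standard potential, R = gas constant, T = absolute temperature,
% n = electrons transferred, F = Faraday constant, Q = reaction quotient
E = E^{\circ} - \frac{RT}{nF}\ln Q
```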

Two fascinating related sidebars:

(1) The Pythagorean theorem (in a right triangle, a² + b² = c², where c is the hypotenuse) has been proved thousands of ways, both geometric and algebraic. The original Greek proof is thought to have been geometric since, despite Diophantus's proto-algebra, algebra proper was 'invented' much later by the Arab mathematician al-Khwarizmi.

(2) Walther Nernst derived his famous equation in 1887 and went on to receive the 1920 Nobel Prize in Chemistry. Tesla hype is NOT ignoring some minor annoying detail.

A New Hope

The other basic form of direct electrical charge storage is capacitance, where no chemical reaction is involved, only basic Maxwell physics. The most familiar is the simple ceramic 'chipcap', where charge is stored electrostatically on metallized plates separated by a ceramic dielectric. These capacitors are the modern descendants of the Leyden jar invented in 1745, and are ubiquitous: trillions of tiny chipcaps worth $billions per year, used in all electrical and electronic devices. In the following image, all those little variously sized brownish components with white tips at both ends are chipcaps.


Passive filtering components for the A11 on an iPhone 8 Plus PCB

The most energy dense capacitor is the supercapacitor (aka ultracapacitor, aka EDLC), where the charge storage mechanism is the interface between two phases of matter, and the storage is in the Helmholtz 'electrolytic' double layer capacitance (DLC), first described by Helmholtz in the nineteenth century. This is the same electrostatic charge-separation physics that produces lightning in thunderstorms. (As an aside, the motto of NanoCarbons LLC, my company holding my materials patents, is "Lightning in a Bottle", for good reason.) Most supercaps use special, expensive, high purity activated carbons for both the anode and cathode, and a standard aprotic solvent with a lithium salt (or cheaper salt equivalents such as TEMA or TEA) as the electrolyte. Supercaps have between 10 and 100 times the power density of the best power-dense LIB, but only about 1/10th the energy density. Their main advantage is where power density and cycle life are paramount: supercaps have tested cycle lives >10⁶, compared to at best a few times 10³ for LIB when babied. They are a $1 billion plus market today, and about a $250 million plus carbon materials market (which suffices for NanoCarbons LLC).
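Because the energy stored in any capacitor scales with the square of voltage (basic physics, added here for reference), the supercap's 2.7 V ceiling is as limiting as its effective surface; a 3.8 V device of the same capacitance holds roughly twice the energy:

```latex
% Energy stored in a capacitor of capacitance C charged to voltage V
E = \tfrac{1}{2}\,C V^{2}
\qquad\Rightarrow\qquad
\frac{E_{3.8\,\mathrm{V}}}{E_{2.7\,\mathrm{V}}} = \left(\frac{3.8}{2.7}\right)^{2} \approx 2
```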

It turns out that it is possible to create a hybrid cell that is half LIB and half DLC. The details are complicated, but the basics are simple. Lithiate the carbon anode rather than the (also carbon) cathode of what would otherwise be a supercap, with LiPF6 as the electrolyte salt. This hybrid is called a Lithium Ion Capacitor (LIC).

In 2007 and 2008, Subaru head of R&D Dr. Hatozaki presented prototype data (at the 17th-18th annual International Seminars on DLC and Hybrid Energy Storage Devices) for LIC cells with very attractive measured properties.

Subaru was looking for a replacement for standard automotive lead acid batteries (PbA) that would have significantly enhanced cycle life and more energy/power density in a PbA-sized package without excessive cost. Subaru's motivation was an under-hood battery for mild hybridization, like the Valeo system, that did not kill cycle life via the Nernst equation. They used a standard activated carbon for the cathode, lithiated graphite for the anode (with a very clever first-charge lithiation scheme using a wrapped lithium metal foil mesh), and standard LIB LiPF6 in PC solvent as the electrolyte. The result was a 3.8 volt device (better than ~3.6 V LIB, and much better than 2.7 V supercaps, for basic electrochemical breakdown-potential reasons beyond the scope of this post) with a demonstrated 20,000 cycles from 95% SoC to 45% SoC (Δ2.2 V) at a 40C rate at 80°C (Holy Nernst equation!!). That suffices for simple replacement under the engine hood, where an ordinary PbA otherwise sits but, even beefed up, fails early and often in mild hybrid applications.

But Subaru decided LIC-enabled mild hybridization did not make commercial sense (see companion post 'Vehicle Electrification Common Sense'). So they licensed their LIC technology to JM Energy. It is sold as the Ultimo and used in specialty applications like industrial UPS (backup/reactive power/peak support). A commercial near miss for Subaru, despite Dr. Hatozaki's brilliant R&D success.

The supercapacitor energy density limitation that the LIC seeks to overcome is directly related to the effective (carbon) surface (per gram or cc) upon which the Helmholtz double layer can form, and to the voltage at which the cell can operate with adequate cycle life. Activated carbons have high total surface areas, but surprisingly low effective surfaces. (Full disclosure: my NanoCarbons inventions cost-effectively increase effective surface by about 50% using patented tricks, lowering cell costs by 20-30%.)

Growth of vertically aligned, closely spaced multiwall carbon nanotubes on a metal current collector via a chemical vapor deposition (CVD) process provides very high effective surface (the subject of an MIT Ph.D. thesis). But CVD is difficult to scale and quite expensive.


The 2009 MIT spinout company that attempted to commercialize this technology for EVs has received tens of millions of dollars in DARPA and DOE grants, but has struggled to get beyond very high priced, very small niche specialty markets. It has survived, barely, mostly on continued government R&D support rather than product sales.

When Geim and Novoselov received the 2010 Nobel Prize for discovering graphene, many surmised that graphene-based structures could solve the effective surface problem more easily and cheaply than vertically aligned carbon nanotubes. Graphenes are essentially single-atom sheets of carbon (like an 'unrolled' single wall nanotube, only with greater XY area). They are extremely strong, highly conductive, and fairly easy to make. Graphene Energy (spun out of Ruoff's nanotech materials group at U. Texas Austin) investigated this energy storage possibility, chemically converting graphite oxide (GO) to graphene in aqueous solution. Their problem was that the resulting graphenes clump, thanks to Van der Waals forces, and the effective clump surface was no better than NanoCarbons LLC's but much more expensive. Graphene Energy failed and folded.


What this failed company’s research suggested was that some inexpensive way to make a robust unclumped graphene structure might be a path forward.

Given that background, imagine my SME shock reading in 2016 that Henrik Fisker had just founded a new electric vehicle company plus a new 'battery' subsidiary, Fisker Nanotech, claiming >400 mile battery range plus very rapid charge time in a lithium/graphene device. The HOLY GRAIL, according to MSM PR! For those who do not know of him, Henrik Fisker is a famous Danish supercar designer (the BMW Z8 of James Bond movie fame and the Aston Martin DB9, amongst others). He started a US electric supercar company in the early days of Musk's Tesla. Alas, the sourced batteries exploded over 100 times in his Karma cars (really bad karma). Then his LIB supplier A123 Systems (a nanotech spinout of MIT) imploded into bankruptcy, losing $250 million of US subsidies and grants plus $100 million for investors, before being sold to China for ~$200 million. Fisker Automotive quickly followed, and its investors promptly lost an additional $1.4 billion.

Can there be any credence to Fisker's 2016 announced phoenix-like rise from his EV Karma ashes? He has funding, so somebody believes. But then, many somebodies also believe Elon Musk and his LIB Gigafactory. The credibility question requires untangling a fascinating technology development web that leads to a new LIC technology. The patent applications for Fisker's PR'd New Hope have now published. The most important of several are US20170149107 (Hybrid electrochemical cell) and US20170369323 (Production on a large scale). Interested readers can examine the technical invention details for free using the simple application number search function on the USPTO website.

In what follows we explain simply what Fisker is up to, and how the New Hope invention came about. There are several subparts, producing a combined plausible commercial breakthrough. Each is yet another self-contained energy storage R&D mini-saga teaching lesson.

Thread one is the invention of laser scribed graphene (LSG) in 2012, when UCLA Ph.D. student El-Kady in Prof. Kaner's nanotech lab made the LSG breakthrough. He took ordinary graphite oxide, coated it onto an ordinary DVD disk using water, then ran the dried disk through an ordinary commercial HP LightScribe DVD drive. (LightScribe used a 780 nm [infrared] 5 mW diode laser to inscribe a label/illustration onto a DVD surface coated with heat sensitive dye, each scribe track about 20 microns wide, with a full-disk pass for a 'label' taking about 20 minutes.)

HP has since discontinued LightScribe technology, since labels were monochromatic and not durable. Another commercial near miss.

The LSG lab process produced roughly 8 μm thick 3D graphene structures in DVD-sized sheets via simple laser heat reduction of graphite oxide to graphene. These graphene films are extremely mechanically robust because of 3D edge interlinking.


He further showed that six passes of the LightScribe laser (each ~20 minutes per DVD) improved conductivity many fold. He had made a high effective surface, mechanically robust, highly conductive graphene structure for supercaps. Ph.D. granted, along with a major Nature paper. This was reported and intensively discussed at the ISDLC conference in 2012. We global 'experts' discounted it, because the Nature paper showed the electrode thickness was only ~8 microns, and the reported supercap energy density was nothing exceptional: in aqueous phosphoric acid electrolyte at a maximum of 1 V it was practically useless, since energy stored is a function of voltage squared and supercaps were at that time already at 2.7 V. We were probably right about the Nature paper, but (mea culpa) probably wrong about its subsequent New Hope implications.

Thread two is the subsequent 2015 El-Kady and Kaner development of an asymmetric hybrid device based on LSG. Their new hybrid combined LSG graphene carbon supercapacitance with the pseudocapacitance of subsequently electrodeposited MnO2 nanoparticles. Total voltage was now 2 V, up from 1 V. Still not a lot of stored energy, but perhaps interesting for specialized niche applications like transdermal drug delivery via electroporation, according to hyped UCLA PR. Yawn.

Thread three comes from recent LIB research. Lithium titanate has been an object of intense study for several years as a safer, longer-lived alternative to traditional intercalating graphite for LIB anodes. There is a big problem: the material's conductivity is very poor, so its power density is grossly inadequate and its charging time excessive even for cell phones and laptops. Graphene is extremely conductive. So this nanotech research focused on somehow incorporating conductive graphene into the bulk of lithium titanate at the nano level in order to improve anode conductivity.

There have been two 'recent' seminal lab research 'breakthroughs'. Both use nanotechnology and the idea of graphite oxide plus chemical precursors to lithium titanate, with the final material mix formed in a single heat treatment synthesis. One paper used an aerosol process; the other used a sol-gel process [Guo et al., Electrochimica Acta 109: 33-38 (2013), available outside the paywall via Google as a posting]. These newish papers present two different lithium titanate precursor chemistries together with graphite oxide, deposited using two different methods for a subsequent single nanocomposite heat synthesis.

Fisker Nanotech did not say anything specific in their massive 2016 fundraising PR about their 'battery' other than that it uses graphene and lithium (their patent applications published more than a year AFTER the big PR funding push). My SME supposition in 2016 was that they had an LIC with a new manufacturing method: a mechanically robust LSG graphene cathode plus a mechanically robust hybrid graphene/lithium titanate anode, the anode synthesized in one step from triple precursors using an LSG analog rather than the literature's sol-gel or aerosol processes. Much easier and cheaper than Subaru's 2008 anode lithiation. And likely still ~20,000 LIC cycle life at a 40C charge rate, allowing much faster EV charging while still meeting a 20,000-cycle vehicle device life (about 27 years at two charges per day). The now published applications show that my initial 2016 suppositions were correct.
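The cycle-life arithmetic above can be checked directly (a trivial sketch; the 20,000-cycle figure is the post's, and two charges per day is the assumed usage pattern):

```python
# Device life implied by the LIC cycle-life figure at an assumed charging pattern
cycle_life = 20_000          # demonstrated deep cycles (Subaru/JM Energy figure)
charges_per_day = 2          # assumed EV usage pattern
service_years = cycle_life / (charges_per_day * 365)
print(f"{service_years:.1f} years")   # 27.4 years
```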

Fisker also said they had a patent-pending machine to make 1000 kg (/day?) of graphene electrode at $0.10/kg. That may be a bit hyped, but it was not implausible even in 2016 by a simple 'thought experiment' reengineering of LSG in light of the two lithium titanate anode papers cited above, before reading the now published US20170369323. Following is that thought experiment as written at the time (posted in 2016 at Judith Curry's Climate Etc).

The commercial LightScribe 780 nm 5 mW laser has a track width of 20 microns, and it took six 20-minute DVD spins to reach optimal LSG conductivity. Fine for a simple lab proof of principle for a Ph.D. thesis; not fine for volume production. But there are cheap commercial solid-state 780 nm diode lasers with up to 2 watts (2000 mW) of power each. Rather than a lens concentrating the laser power as in the LightScribe, a lens could disperse 2000 mW over a larger area with enough power for one-pass heat treatment, as in the sol-gel and aerosol papers. LightScribe hit a 20 micron track width with 5 mW six times, for perhaps a millisecond each: a total of ~30 mW on 20 microns for ~6 milliseconds. A 2000 mW 780 nm laser could hit a 1.3 millimeter stripe with the same power per unit width at the same scan speed, or an even wider track at a slower scan rate (more likely for a bulk production machine).
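The stripe-width scaling above is simple proportionality, sketched here with the post's own numbers (equal exposure dose per micron of track):

```python
# Scaling the LightScribe exposure dose to a single-pass 2 W diode laser
passes = 6
power_per_pass_mw = 5
track_width_um = 20
dose_mw_per_um = passes * power_per_pass_mw / track_width_um   # 1.5 mW per micron

production_power_mw = 2000
stripe_um = production_power_mw / dose_mw_per_um
print(f"{stripe_um / 1000:.1f} mm stripe")   # 1.3 mm
```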

Imagine a paper-machine-like system. The equivalent of the furnish box continuously spreads a water-based slurry of graphite oxide (GO) plus two lithium titanate chemical precursors onto a rapidly moving continuous support substrate equivalent to the LightScribe DVD. First step beyond the furnish box: evaporate the furnish water with radiant heat and fans. Second step: IR-heat nanosynthesize using powerful 2 W spread-focus 780 nm lasers, converting GO to graphene and the lithium titanate precursors to interspersed lithium titanate nanocrystals. This finished anode material is still supported by the rapidly moving continuous support belt. Third step: peel off and spool up a finished continuous electrode sheet, as wide and long as wished, as the support belt turns under at the end of the machine for its return trip.

Imagine a second identical machine making the complementary 3D graphene-only cathode by simply leaving the lithium titanate precursors out of the furnish box mix. Big rolls, made very fast and cheap. Spooling up very thin but very strong electrodes, made continuously in bulk very cheaply. No expensive aerosol or sol-gel or CVD small batches as in the previous lab papers and commercial attempts.

Imagine assembly of Chevy Volt-like prismatic pouch 'battery' cells. Cut the electrode materials to size before or after stacking as many layers (with separators) as wanted from the spooled rolls; they are very conductive, so simple contact likely suffices. No backing metal current collector is needed, as it is for LIB anodes and supercap anodes and cathodes (a cost and weight saving). Attach a current collector to one end (the hybrid MnO2 patent application describes simple silver soldering at the external case connection point of the stacked layers). Encapsulate in a pouch, fill with electrolyte, seal, just like Chevy Volt cells.

Form a battery pack similar to the Chevy Volt/Bolt packs, with fewer interleaved thin aluminum heat extraction plates needed. Done, except for the control electronics.

The basic cell and battery production steps have already been developed by GM. Continuous sheet nanoelectrode production is analogous to conventional papermaking, substituting purpose-built evaporation/LSG for the draining mesh belt and heat calendering of paper machines. Every other needed technology element has been shown in the lab. Thanks to optics and infrared diode lasers, scale-up appears to be a matter of straightforward engineering rather than invention.

Concluding comments

First, the various asides in this guest post were intended to make a fundamental science/technology point indirectly. Battery electrochemistry is NOT a 'new' invention like the transistor in 1947, nor does it follow Moore's law as warmunists might wish. (This is also true for PV, but proving that is way beyond the technical scope of this post; see guest post Grid Solar Parity at Judith Curry's Climate Etc for a factual take on that subject.) Battery technology is now a very tough, slow slog, nothing like what global warming activists fantasize.

Second, in 2017 Fisker announced that his coming EV supercar will NOT initially use the revolutionary Fisker Nanotech LIC that he PR'd (while fundraising) in 2016, as discussed in this post. Fisker will instead use conventional LIB pouch cells from Korea's LG Chem, the supplier to the Chevy Volt (to be discontinued in 2019 by GM, for apparent reasons predicted in the loosely companion post Vehicle Electrification Common Sense). The path from lab to commercial scale production is long, fraught, and uncertain. Fisker just proved that truism again.

Nanotechnology enabled LIC is the only plausible EV option on the present technical horizon. It is truly the only New Hope. But like the rest of the Star Wars saga, it presently exists in another galaxy far far away.

100% Renewable Deception

PA Pundits - International

By David Wojick, Ph.D. ~

Press coverage of the crusade for 100% renewable electricity invariably talks about wind and solar energy. As I have pointed out, the wind and solar fantasy requires a stupendous amount of battery storage, which is never mentioned. This is because the 100% feasibility studies are deceptive.

Wind and solar come with the big battery problem. Here it is at its simplest: America uses about four trillion kilowatt-hours of juice a year. If we generated all of it with intermittent wind and solar, something like 70 to 80% of the time the juice would have to come from batteries, not from the original renewable generators. Just how many billions of kWh of batteries that would take is a complex computation, but it is a bunch. Millions of container-sized batteries for sure; the only question is how many millions.
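A deliberately crude scale check shows why the answer is "millions"; the storage-buffer days and container size below are illustrative assumptions, not figures from the post (serious studies vary widely on both):

```python
# Back-of-envelope battery count; buffer_days and container_kwh are assumed
annual_demand_kwh = 4e12        # ~4 trillion kWh/yr US consumption (from the post)
buffer_days = 5                 # assumed days of average demand held in storage
container_kwh = 3_000           # assumed ~3 MWh per shipping-container battery

storage_kwh = annual_demand_kwh / 365 * buffer_days
containers = storage_kwh / container_kwh
print(f"~{containers / 1e6:.0f} million containers")
```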

For those advocating 100% renewables for…

View original post 744 more words

The cost of wind & solar power: batteries included

From Euan Mearns’ Energy Matters:

For some time now we here on Energy Matters have been harping on about the prohibitive costs of long-term battery storage. Here, using two simplified examples, I quantify these costs. The results show that while batteries may be useful for fast-frequency response applications they increase the levelized costs of wind and solar electricity by a factor of ten or more when used for long-term – in particular seasonal – storage. Obviously a commercial-scale storage technology much cheaper than batteries is going to be needed before the world’s electricity sector can transition to intermittent renewables. The problem is that there isn’t one.


Making detailed estimates of the future costs of intermittent renewables + battery storage for any specific country, state or local grid requires consideration of a large number of variables, plus a lot of crystal-ball gazing, and is altogether too complicated an exercise for a blog post. Accordingly I have made the following simplifying assumptions:

* The grid is an “electricity island” – i.e. no exports or imports.

* It starts out with 30% baseload generation and 70% load-following generation. Renewables generation, including hydro, is zero.

* Baseload and load-following generation is progressively replaced with intermittent wind and solar generation, with baseload and load-following generation decreasing in direct proportion to the percentage of wind + solar generation in the mix. This broadly parallels the approach a number of countries have adopted or plan to adopt.

* Annual demand stays constant.

* Enough battery storage is added to match wind + solar generation to annual demand based on daily average data. Shorter-term variations in generation, which will tend to increase storage requirements, are not considered. Neither is the option of installing more wind + solar than is necessary to meet demand, which will have the opposite effect but at the expense of increased curtailment (see this post for more details).

* Transmission system upgrades are ignored.
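The storage requirement in the cases below follows from a simple running-balance method over daily data; here is a minimal sketch of that calculation (synthetic toy numbers, not the actual German or Californian series):

```python
# Storage requirement as the spread of a running daily energy balance,
# after wind + solar is factored up to meet 100% of annual demand.
def storage_needed(generation, demand):
    scale = sum(demand) / sum(generation)     # factor generation to 100% of demand
    running, balance = 0.0, []
    for g, d in zip(generation, demand):
        running += g * scale - d              # daily surplus (+) or deficit (-)
        balance.append(running)
    return max(balance) - min(balance)        # reservoir size that absorbs the swings

# Toy four-day series (GWh): flat demand, lumpy generation
demand = [100, 100, 100, 100]
generation = [200, 0, 150, 50]
print(storage_needed(generation, demand))     # 100.0 (GWh)
```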

Two cases are considered. Case 1 uses actual wind generation, solar generation and demand in Germany in 2016 and Case 2 actual wind generation, solar generation and demand in California in 2017. I used Germany and California partly because seasonal wind and solar generation in Germany tend to offset each other while they reinforce each other in California, and partly because I have grid data for both. It should be noted, however, that the results are not predictions of what might happen in Germany and California because local conditions are not taken into account.

On future costs I made the following assumptions:

* Capital cost for utility-sized wind plants = $1,500/kW, solar = $1,000/kW (based on various sources).

* Batteries: The best estimate I came across was in an article from Bloomberg New Energy Finance, according to which:

The global (battery) energy storage market will grow to a cumulative 942GW/2,857GWh by 2040, attracting $620 billion in investment over the next 22 years.

2,857 GWh costing $620 billion works out to $217/kWh. I have assumed $200/kWh, about half current utility-scale Li-ion battery costs.
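The unit-cost arithmetic is just division, reproduced here for the record:

```python
# BNEF projection: $620 billion buys a cumulative 2,857 GWh by 2040
investment_usd = 620e9
capacity_kwh = 2857e6            # 2,857 GWh expressed in kWh
unit_cost = investment_usd / capacity_kwh
print(f"${unit_cost:.0f}/kWh")   # $217/kWh
```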


Case 1 applies the assumptions listed above to wind generation, solar generation and demand in Germany in 2016 using daily average data from P-F Bach.

Figure 1 shows Germany's actual wind and solar generation in 2016. Wind generation, which is highly erratic, peaks in the winter, while solar generation, which is far smoother (at least when presented as daily averages), peaks in the summer:

Figure 1: Case 1 actual wind and solar generation, daily average data for Germany

Figure 2 compares combined wind + solar generation during 2016 with demand. Adding solar to wind tends to flatten out annual generation but does not make it an ideal match to demand:

Figure 2: Case 1 wind + solar generation vs. demand, daily average data

Figure 3 compares 2016 demand with combined wind + solar when wind + solar is factored up so that it generates 100% of total annual demand. Generation broadly follows demand during the year but the erratic wind generation creates periodic surpluses of up to 80,000 MW and deficits of up to 50,000 MW:

Figure 3: Case 1 combined wind + solar factored to 100% of demand vs. demand, daily average data

The first graphic on Figure 4 plots these surpluses and deficits. There is no well-marked seasonal pattern. The second plots the GWh of storage needed to match these surpluses and deficits to daily demand. Here we see a seasonal pattern, with surplus energy generated from wind in the winter and spring having to be stored through the summer for re-use in the coming winter, and with storage capacity reaching a maximum of slightly over 25,000 GWh (25 TWh) in May. Given the erratic nature of wind generation, however, this pattern might change if a different year were considered:

Figure 4: Case 1 daily surpluses, daily deficits and storage balance. Note that the Y-scale is in GWh

Case 2 applies my assumptions to wind generation, solar generation and demand in California in 2017 using daily average grid data from the California Independent System Operator (CAISO) supplied earlier by correspondent “Thinks Too Much”.

Figure 5 shows California’s actual wind and solar generation during 2017. Wind and solar generation both peak in the summer:

Figure 5: Case 2 wind + solar generation, daily average data

Figure 6 compares the sum of wind & solar against demand. Summing the two results in a pattern that broadly matches the summer peak in demand but not the “air-conditioning” peaks in June, July and August:

Figure 6: Case 2 wind + solar generation vs. demand, daily average data

Figure 7 compares 2017 demand with combined wind + solar when wind + solar is factored up to generate 100% of total annual demand. There are large surpluses in the spring and early summer months and large deficits in the winter months:

Figure 7: Case 2 combined wind + solar factored to 100% of demand vs. demand, daily average data

The first graphic on Figure 8 plots these surpluses and deficits and the second shows the GWh of storage needed to match them to daily demand. The large surpluses from mid-March through the end of June and the large deficits in November through February combine to generate a storage requirement for Case 2 that also approaches 25,000 GWh (25 TWh) even though demand is only about half of Case 1 demand:

Figure 8: Case 2 daily surpluses, daily deficits and storage balance. Note that the Y-scale is in GWh

Now to costs. I estimated the combined wind and solar levelized cost of electricity (LCOE) without storage from the NREL LCOE calculator using the following assumptions:

  • Period 20 years
  • Discount rate 3% (NREL default value)
  • Capital cost $1,250/installed kW
  • Capacity factor 25%
  • Fixed O&M $25/kw-year (NREL default value)

Other variables were set to zero or ignored.

The combined wind + solar LCOE without storage was $50/MWh, broadly in line with Lazard’s 2018 estimates for utility-scale solar and wind.
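The $50/MWh figure can be reproduced with the standard capital-recovery formula a calculator of this kind uses, applied to the assumptions listed above (a sketch, not the NREL tool itself):

```python
# LCOE from annualized capital (capital recovery factor) plus fixed O&M
capex_per_kw = 1250        # $/kW blended wind + solar
rate, years = 0.03, 20     # NREL default discount rate, period
fixed_om = 25              # $/kW-yr
capacity_factor = 0.25

crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
annual_cost = capex_per_kw * crf + fixed_om      # $/kW-yr
annual_kwh = 8760 * capacity_factor              # kWh generated per installed kW
lcoe_per_mwh = annual_cost / annual_kwh * 1000
print(f"${lcoe_per_mwh:.0f}/MWh")                # $50/MWh
```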

I then estimated wind + solar LCOEs with battery storage capital costs included. This was a straightforward exercise because reducing baseload + load-following generation in direct proportion to the increase in wind + solar generation results in LCOEs that are the same regardless of the percentage of wind + solar in the generation mix. The NREL calculator showed:

  • LCOE Case 1 (Germany): $699/MWh
  • LCOE Case 2 (California): $1,096/MWh

These ruinously expensive LCOEs are entirely a result of the added costs of storage batteries, which in the 100% wind + solar scenarios approach $5 trillion in both Case 1 and Case 2, compared to wind + solar capital costs of ~$300 billion in Case 1 and ~$160 billion in Case 2.
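The $5 trillion figure follows directly from the ~25 TWh storage requirement in Figures 4 and 8 and the assumed $200/kWh battery cost:

```python
# Battery capital cost for the ~25 TWh seasonal storage requirement
storage_kwh = 25e9            # 25 TWh expressed in kWh
cost_per_kwh = 200            # assumed future utility-scale battery cost
battery_capex = storage_kwh * cost_per_kwh
print(f"${battery_capex / 1e12:.0f} trillion")   # $5 trillion
```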


Despite claims to the contrary, battery storage is clearly not an option for a low-cost 100% renewable future. And lest I be thought a lone voice in the wilderness, a recent report by the Clean Air Task Force (CATF) confirms that my estimates are in the ballpark. And the CATF is of a distinctly green persuasion:

Every year people produce almost forty billion tons of carbon dioxide that is pumped into the atmosphere – that’s a hundred times faster than the Earth has ever seen. If we don’t take action, our planet will change far faster than we can adapt. This is the mother of all environmental problems and the Clean Air Task Force is on it.

I can’t find the CATF report on the web, but its results were reported by, among others, the MIT (Massachusetts Institute of Technology) Technology Review:

Fluctuating solar and wind power require lots of energy storage, and lithium-ion batteries seem like the obvious choice—but they are far too expensive to play a major role.

The Clean Air Task Force, a Boston-based energy policy think tank, recently found that reaching the 80 percent mark for renewables in California would mean massive amounts of surplus generation during the summer months, requiring 9.6 million megawatt-hours (9.6 TWh) of energy storage. Achieving 100 percent would require 36.3 million (36.3 TWh).

Building the level of renewable generation and storage necessary to reach the state’s goals would drive up costs exponentially, from $49 per megawatt-hour of generation at 50 percent to $1,612 at 100 percent. And that’s assuming lithium-ion batteries will cost roughly a third what they do now.

In summary, wind and solar may indeed undercut coal and nuclear on price when the costs of intermittency are ignored, and batteries may indeed be good for short-term grid stability applications. But please let’s not have anyone claim that solar + wind + batteries will usher in an era of cheap, clean, 100% renewable energy, because they won’t.