United States fact of the day

Percentage of annual net electricity generation by renewables in 1948: 32

Percentage of annual net electricity generation by renewables in 2005: 11

The main difference, of course, is the decline in the relative importance of hydroelectric power.

By the way, those numbers are read off a graph and thus are approximate.  They are from p.67 of Mara Prentiss, Energy Revolution: The Physics and the Promise of Efficient Technology, new and noteworthy from The Belknap Press of Harvard University Press, recommended.


Renewables in 1948 presumably also meant the wood burning stove.

For electricity generation?

@Granite26 - good spot, but I would argue that a lot of electricity is used for space heating, like those old incandescent bulbs heating rooms in winter and those old fashioned electric radiator space heaters. Back to you.

That makes no sense, even for you.

Yes it makes sense. Lots of electricity is wasted as space heating. And let's not forget you can read a book by the light from a fireplace. Back to you.

From that perspective, if you count additional electricity replacing wood-burning use, the numbers would probably look even 'worse'

There are wood burning electric generators now......

Uh-oh, he's gone native.

The main factor I'm sure is electrical use. From http://www.rmi.org/RFGraph-US_electricity_demand I'm estimating 1948-1950 the US was using say 500 TWh/y. Now let's say about 3,000. That would make for maybe 160 renewable units in 1948 and 330 units now. So renewable generation doubled.
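A minimal sketch of the arithmetic in that estimate (the 500 and 3,000 TWh totals are rough readings from the RMI chart, so the results are order-of-magnitude only):

```python
# Back-of-envelope check of the renewable generation estimate above.
# Totals are rough readings from the RMI demand chart, not exact data.
gen_1948_twh = 500     # assumed total US generation, ~1948-1950
gen_now_twh = 3000     # assumed total US generation now
share_1948 = 0.32      # renewable share in 1948 (from the post)
share_now = 0.11       # renewable share in 2005 (from the post)

renew_1948 = gen_1948_twh * share_1948   # ~160 TWh
renew_now = gen_now_twh * share_now      # ~330 TWh
print(renew_1948, renew_now, renew_now / renew_1948)
```

So absolute renewable generation roughly doubled even as its share fell by two thirds.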

Population grew about 111% during that period while electric use grew by 600%. But the question I think is will electrical use continue to grow at this pace? Probably not IMO. I only need one fridge for my food. I only need one central air unit to cool my house. Light bulbs are getting lighter and lighter in their electrical use thanks to CFLs, LEDs and Sarah Palin's failure to champion the old fashioned light bulb. My laptop/tablet/cell phones do nibble on my electricity but absent a lot more electrical car use I don't see myself having any good reason to double my use of electricity in a generation.

"but absent a lot more electrical car use I don’t see myself having any good reason to double my use of electricity in a generation.

That's a pretty huge caveat. Electric cars, assuming current trends, will displace small passenger vehicles in the medium term (25-50 years).

Perhaps, but the net 500% increase in electrical use represented IMO mostly new uses of energy. People in 1950 read a magazine but it wasn't on a coal powered tablet computer, it was on paper. The one exception might be cooking, where the microwave offset some use of the stove and there are quite a few people that use electric ovens. But lots of old school electric stuff is getting more efficient (see light bulbs, air conditioners etc.)

A mass conversion to electric cars would offset energy that was previously from oil and gas. I think if you backed that out you would not find in the next 60 years that our individual electric use will grow again as rapidly as it did in the past.

German renewable energy statistics in December 2014 - more electricity was produced by wind than the monthly average electricity generation from nuclear plants. (According to the Internationales Wirtschaftsforum Regenerative Energien in Münster, citing preliminary data from the EEX power exchange, German wind turbines produced nearly nine terawatt-hours (TWh) in December, more than in any previous month. Nationwide, wind turbines fed 8.85 billion kilowatt-hours into the grid, clearly surpassing the previous record of 8.4 billion kilowatt-hours set in December 2011. Wind power thus produced more electricity than the German nuclear plants' monthly average, which in 2013 was eight billion kilowatt-hours. - http://www.neueenergie.net/wissen/wind/wind-uebertrumpft-atom)

Those poor birds

More birds are killed by the toxic waste dumps of coal plants, of the tar sands, of the ponds of water from oil and gas wells. In the US, oil wells are producing more water than oil, and that is excluding the water injected.

This water is too polluted to be used untreated for fracking. The capital cost of cleaning wastewater just to be used for fracking which involves adding organic chemicals and sand has been higher than buying drinking or agriculture water.

But hey, hiring tens of thousands of workers to clean up the waste of fossil fuels to save the birds would make wind power cheaper than burning fossil fuels and in a time of high unemployment, we can not afford the high cost of hiring workers to build capital assets instead of burning capital and creating wasteland.

Not to mention housecats.

You're implying that well wastewater is left on the surface when it isn't cleaned up, which just isn't true. Water is injected into wells for reasons other than fracking, including disposal.

There is probably some, even a significant amount of environmental impact from the water in the time it's on the surface, but it's not an ecological disaster where we're pumping up 90% water slurry and leaving the polluted water just laying around.

You're new here, aren't you? You do not argue with mulp; you just have to appreciate his unique worldview. Wait until he gets going on Reagan.

It's true in California:

Whole lotta benzene in that water too:

I recall reading that the firming power for Germany is largely provided by French nuclear plants, which are most happy and willing to sell at higher prices to the rest of "green" Europe. (Firming power runs your toaster and hair dryer when the wind don't blow.) I suspect that was not factored into the statistics you cite. I don't know for sure, I don't read German.

My memory isn't so good, but I seem to remember reading that Germany mothballed most of their nuclear capacity after Fukushima.

Yes, and replaced them with coal when wind didn't produce what was expected.

Is there some back-story to this? Does Merkel's brother-in-law own a coal mine?

It's hard to believe anyone would advocate the path Germany seems to have taken, especially anyone with an inkling of sincere environmentalism in them...

The only way I can think to explain this is that there's graft and corruption going on, but I don't seem to be hearing that in the news.

The back story is that Germany is dependent upon coal, gas and oil imported from Russia.

The other back story is that Germany burns domestic Braunkohle (lignite), which is pretty much the bottom of the dirty fossil fuel bucket - and that lignite provides around 45% of the coal burned here. And as of 2014, the numbers of Russian gas and oil imports are these - 'And Germany's dependence on Moscow is even higher: the country sources 36 percent of its natural gas imports and 39 percent of its oil imports from Russian energy suppliers.' http://www.dw.de/germanys-russian-energy-dilemma/a-17529685

Basically, Russian natural gas is vastly cleaner, and if the Russians are selling, the Germans are buying. Mainly because burning and mining lignite is just about as stupid as mountaintop removal in the U.S.

I don't understand this. The argument is that Germany is beholden to Russia, so it eliminates nuclear power to become more beholden to Russia?

I suppose I could understand that if the domination was complete and total, but that just doesn't seem to be the case yet.

It’s hard to believe anyone would advocate the path Germany seems to have taken, especially anyone with an inkling of sincere environmentalism in them…

Well, there's your mistake right there, the modern Green movement has long ago forgone such quaint notions as "environmentalism" and "science".

"The only way I can think to explain this is that there’s graft and corruption going on, but I don’t seem to be hearing that in the news."

Don't forget that they manage to do it all for only 3 times the price per kWh of the US. Indeed, Germany's costs are twice the costs of France's electricity. And France's electricity production is overwhelmingly pricey nuclear.

Well, yes - and the companies that invested in modern coal plants lose money whenever they run. Seems like buying coal and servicing the capital costs of those new plants is a losing proposition.

Interesting note - the big four utilities in Germany argued they needed to have new coal plants approved due to the Atomausstieg, and literally less than a week after gaining that approval, tried to have the lifespan of their reactors extended beyond the previously agreed legal limit. Unfortunately for the nuclear plant owners, Fukushima occurred soon after that, pretty much putting an end to the entire endeavor - and leaving them with a bunch of coal plants that will likely never make a profit generating electricity. And even more unfortunately, helping ensure that the Greens won the election in Baden-Württemberg, a Bundesland that had pretty much voted CDU for 50 years straight, and pretty much wiping out the FDP, as they were the party most closely connected to the utilities' attempt to ignore the will of a majority of German voters to shut down the nuclear power plants.

And in a market with a spot wholesale electric price of around 3 cents, coal plants are just too expensive to run and make money.

Almost as if the Energiewende is actually working.

The spot price regularly goes into multiple tens of euro cents, though, judging from this website: https://www.energy-charts.de/price.htm.

"Almost as if the Energiewende is actually working."

Energiewende is an epic disaster. Germans are paying ridiculously high prices for electrical power and the electric companies are still going bankrupt.

Anybody know the real story behind the shutdown of the San Onofre nuclear power plant in Southern California?

You have to be aware that wind power doesn't scale. To be cost-effective, a wind plant has to be located in an area of moderate to high, relatively constant wind. The site also has to be reasonably accessible, have lots of area available to space out the turbines, and be reasonably close to consumers to avoid transmission losses.

Most of these high quality sites are now in use - the low hanging fruit has been picked. And as you move away from the best sites to second-best, the cost of wind power goes up dramatically.

Given the rise of server farms as a big electricity consumer, and the fact that new ones need to keep on being built, they should be out in the plains states where the wind blows. Then you only need to worry about data cables.

Just being on the plains does not guarantee the right kind of wind. The key is to have a reasonably constant wind. An area where it blows at gale forces sometimes and is calm at other times is useless, even if the average wind speed is high.

Have a look at this wind map: http://www.nrel.gov/gis/images/30m_US_Wind.jpg

The cost effective areas will be found where the map is bright red and purple. Get away from those, and the cost of wind power skyrockets.

And note that not all that red and purple land is usable - it depends on terrain, access, how constant the wind is, whether the land is in a national park or environmentally sensitive area, yada yada.

Many of those red/purple areas already have wind farms if the land is suitable.

"The cost effective areas will be found where the map is bright red and purple. Get away from those, and the cost of wind power skyrockets."

That's not really true anymore. Wind farms have been built in the less optimal areas in the US, but the size of wind turbines has gone up enough that the efficiency and greater height above the ground have offset the lower wind quality. The drawback for wind is primarily transmission line bottlenecks and intermittent electrical generation.

Most of these high quality sites are now in use – the low hanging fruit has been picked

Offshore wind farms right off the coast line would seem to be the lowest hanging fruit of all, and we haven't even plucked one yet in the US.

Offshore wind farms are currently a marginal technology. They produce somewhat more power, but cost a lot more money. That may change if the size of wind turbines keeps rising.

You might be right as http://instituteforenergyresearch.org/wp-content/uploads/2013/06/Offshore-Wind-Energy-DRS-4.pdf seems to agree (I don't know if they are a serious outfit or a fake lobbying one though). http://en.wikipedia.org/wiki/Cost_of_electricity_by_source seems to indicate it is only marginally more expensive.

I suspect a problem with the cost estimates is that onshore wind has been around a while now, so we've gotten good at building and maintaining it efficiently. Offshore less so. It seems to me that offshore has much greater potential to be lower cost in the long run. While you have to spend more to build a windmill that can survive in the open ocean, that windmill will tap stronger and more consistent energy, and as you point out, making it bigger can lower its cost per unit.

Another problem, though, with the cost estimates is that I'm not sure they captured real estate costs. Real estate costs a lot in the US, especially on the coasts. On the ocean you are not competing for plots of 'land' with people who want to put up shopping malls and condos. I'm not sure if the cost estimates properly account for that savings.

The main difference is in the amount of pillage and plunder destruction of capital assets and rejection of building capital assets.

Unless you call blowing up productive forested mountains and reducing them to unproductive wasteland to be building capital.

Unless you consider destroying wetlands with pollution and cheap transport canals, leading to the return of millions of acres of productive land to the oceans, to be building capital.

And this is being done to kill jobs building capital assets at times of rising underemployment and rising unemployment, based on the idea that paying workers is too great a cost to profits, and that limiting GDP growth by limiting the wage income of consumers is the best economic policy.

One can justify pillage and plunder in times of labor shortage, like in the 40s, 50s, 60s, but by the 70s, the boomers were creating a huge labor surplus, so switching from pillage and plunder to building productive capital would be the best policy.

Except economists in the 70s suddenly redefined profit from a sign of economic inefficiency from monopoly to profit being virtuous and the need for increased government promotion of monopoly and leaving factors of production unused became virtuous because capital gains are best achieved by monopoly, not by labor building capital.

(Profit in economic terms is not return on invested capital, at least not in economic terms of the 60s, but instead a sign of monopoly power if sustained, or a signal for increasing investment in production in the sectors showing profits. With energy sector showing extremely high profits, economists were often opposed to investment in energy, backing the high profit fossil fuel industry interests in maintaining high profits by shifting fossil fuel costs to other sectors of the economy.)

Excellent point; this really speaks to the facts that:

1. Electricity consumption has increased a lot in the last 60 years and
2. Hydroelectric is not a very scale-able technology (limited by geography).

However, I would still emphasize the importance of the renewables ex. hydro (some biogas and geothermal, but mostly wind and solar). Solar and wind now account for between 30-50% of marginal electricity (depending on time periods, but over 50% for the 11 months ending November 2014 vs. the same period a year ago) [see EIA's monthly energy review, full year out 3/26]. Even though wind is running into transmission capacity constraints, these can be expanded in the long-term, and solar adoption at all levels (residential, commercial, and utility) is extremely low.

Longer run, I'm more interested in issues surrounding utilities' attempts to block greater adoption of solar, now that it's becoming significant in some parts of the country (NC, CA, NV).

I recently had solar panels installed on my house. Started the process last November. Actual installation time was a little more than a day. Still waiting for the utility company to get their meter installed (another 5-10 business days left to wait; they never bother scheduling it).

The amount of paperwork and bureaucracy I've had to endure to get this set up just goes to show why solar prices are so much higher per watt than they are in the rest of the world.

Maintaining the grid is a non-trivial task, and if you are making energy at home that can create serious risks for line workers (9th most dangerous job in America).

This doesn't mean their failure to act is excusable, but there is a reason the utility needs to sign off on your installation.

We have permitless rooftop solar in Australia. It is much simpler, cheaper, and quicker. Before subsidy or tax we now pay an average of $1.61 US per watt for a median sized installation, which I understand is about half what Americans pay. It is the cheapest source of electricity available to Australian households. Where I am, South Australia, almost one quarter of homes have rooftop solar, although these systems average much smaller in size than American installations. Right now, at one o'clock in the afternoon, rooftop solar is providing 17% of the state's total electricity use.

Since we're not doing anything magical here in Australia, there is no reason why the United States can't get down to Australia's solar installation costs.

But if you want to feel good about something your country is efficient at, it would take me a very long time to get a gun.

Wow. That's less than half. Mine came out to about $3.77/watt before subsidies. Definitely wouldn't have been worth it without them, but the local prices are high enough to make it worth it with the subsidies. They use a Value of Solar Tariff pricing instead of Net Metering here, too, and judging from the Wikipedia article, our rate is about a fifth of Australia's Feed-in Tariff.

The feed-in tariff for new rooftop solar in Australia typically ranges from zero to six Australian cents. Most people installing new solar can expect to get about 6 cents for each kilowatt-hour they export to the grid, which is about 4.7 US cents. Fortunately, very few people get zero cents for the electricity they export to the grid.

Do you worry about drilling holes in your roof to install the solar panels? My roof is wonderfully watertight and I would worry about drilling holes. Do they have a good solution for that?

Steve, I'm sorry, but I have to admit I laughed when I read your comment. Now roof technology may be very different in the United States, but we have two main sorts of roof here in Australia, tin and tile. Tin roofs are sheets of metal held on with screws. A normal solar installation uses the screw holes that already exist to hold the roof on. The original screws are removed and the racking that holds the solar panels is attached with screws of the appropriate standard that are one size larger than the ones removed. The aluminium L bracket that holds the racking is separated from the roof by a rubber pad to prevent galvanic corrosion. Normally no more holes are placed in the roof than already exist. Tile roofs are slightly different. They use brackets that are attached to roof beams and stick out from under the tiles. But tile roofs are a series of holes covered by overlapping tiles. They're not sealed, and for very good reason. Before the development of electric fans to ventilate air in the roofspace, and before insulation was a thing in Australia, a tin roof was a poor man's roof because they didn't let the air circulate and they got too hot in summer and were too noisy when it rained.

Now I have no idea how things are done in America. You have shingles on your roofs over there? That's a disease in Australia. They're made out of birch wood or something then nailed on and covered with tar? You're going to have to ask someone else what is done with them, I really can't be of help to you, I'm afraid.

But if you want to protect your roof, placing solar panels over it is a very good way to go. They can take golf ball sized hailstones without breaking.

The U.S. standard tends to be plywood sheets, with shingle/roofing nails used to hammer in shingles, normally asphalt. In other words, if his roof is constructed in typical American East Coast style (apparently, in areas where wildfires are common, Californians prefer wood shingles, not tile), it already has a couple of thousand holes in it.

"They’re made out of birch wood or something then nailed on and covered with tar?"

For most of the US shingles aren't made of wood; as p_a says, they are made of a solid asphalt compound with an adhesive backing placed on top of a felt-like material. They are nailed on, with the adhesive and felt acting as a water barrier around the nail hole. In addition, only the top half is nailed and the bottom half overlays the nailed top half of the row of shingles below it. So, it's an additional water barrier.

To the point though, I think your standard installer is likely to guarantee the work, and it's unlikely to leak.

Steve, I should have mentioned that if you are worried about having your roof penetrated, you can have non-penetrative solar. Now personally I see nothing wrong with roofs having penetrative solar, provided the proper precautions are taken, but to each their own.

The first method, exterior non-penetration, can be as simple as removing the backing from light weight, flexible solar panels to expose the adhesive and sticking them on the roof.

The second method, deep non-penetration, involves using roofing material that incorporates PV so your roof is never penetrated because the roof is the solar panel.

Do you have a reference for the utilities blocking adoption? It's not something that's crossed my radar.

Here's an example from Arizona.



Oh, I just gave a lecture on grid integration of non-dispatchable renewables.

The utilities* have a few problem with solar & wind.
1. Ensuring frequency & voltage stability is more difficult. The corresponding decrease in quality of service is a main driver of renewable curtailment.
2. Transmitting and distributing the electricity. Wind and Solar are not necessarily produced conveniently close to our transmission system. Indeed there are regularly negative prices of electricity due to local wind production.
3. Distributed generation causing seemingly dead wires to be live. Dangerous for linesmen.
4. The "Utility Death Spiral." About half your electricity bill is for the power lines, but it is hidden in the hourly rates. If you have residential solar, your power consumption goes down but your demand on power lines may not. That gives insufficient money to maintain the transmission and distribution system.

And of course solar and wind do not match their output to the demand, so they require significant storage for greater than 30% adoption. Solar varies by a factor of two between summer and winter, and it's not clear how to store energy economically for that long.

* by utility, I mean the ISO's, RTO's and vertically integrated grid operators.
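The "death spiral" in point 4 can be illustrated with a toy calculation (all numbers invented, not real utility figures): if fixed wire costs are recovered through per-kWh rates, every kWh of rooftop solar raises the rate the remaining billed kWh must carry.

```python
# Toy illustration of the "utility death spiral": fixed grid costs are
# recovered through volumetric (per-kWh) rates, so rooftop solar cuts
# billed kWh without cutting wire costs. All numbers are made up.
grid_cost = 1_000_000.0        # assumed fixed annual wire costs, $
kwh_sold = 10_000_000.0        # annual kWh billed before solar
rate_wire_part = grid_cost / kwh_sold          # $/kWh embedded in rates

kwh_sold_after = 8_000_000.0   # rooftop solar cuts billed kWh by 20%
rate_after = grid_cost / kwh_sold_after        # rate must rise 25%
print(rate_wire_part, rate_after)
```

The higher rate then makes solar more attractive for the remaining customers, which is what gives the spiral its name.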

"Longer run, I’m more interested in issues surrounding utilities’ attempts to block greater adoption of solar, ..."

Utilities will only attempt to block solar if it costs them money. The regulations that force poor people to subsidize upper income people's solar cells are rather pernicious.

Utilities make money by selling electricity. If their customers start generating their own electricity, the utility faces a new form of competition, which loses them profits. This is why they hate distributed generation in general. It's just that solar happens to be the newest threat to their monopoly.

First, many utilities are regulated and guaranteed a profit, so they aren't worried about losing profit. Secondly, you are ignoring the economics of the situation.

If utilities are forced to pay full retail price for extra solar power, then they must sell that extra power to other residents. Furthermore, at the point in time that their local grid is producing net power during the peak of the day, they must export the power to a neighboring grid. A neighboring grid will buy the power at wholesale prices. If the wholesale price is less than the retail price, then the local power company takes a loss and that loss is passed evenly to all of its customers. So, poor people who can't afford solar cells, and anyone living in an apartment or who chooses not to buy solar cells, are forced to pay higher rates to cover the difference.

Solar cells can be a good thing, but power companies shouldn't be forced to subsidize them.

Let's be clear: hydroelectric is 'renewable'.

Can you elaborate?

In 1948, hydro was essentially the only renewable - there was no plural as solar/wind weren't used.

You can keep using it without burning fossil fuels or producing nuclear waste. As long as there is rain / snowmelt.
In general, a renewable source is energy that comes from a resource replenishable on a basic human timescale. Sure, the earth will make more oil...over 20 million years. That's not very helpful.
However, the future of hydro power is in the sea (aka Marine energy). Where the tides will do the work of generating the electricity...and we don't have to do so much damage to river fish (especially Salmon). But...it is a ways out.

Hydroelectric is renewable but at enormous environmental, human, and capital costs. Nuclear faces the same shortcomings, but has been generally much safer.

That's what I thought you meant. It's renewable, but not necessarily hippie friendly.

*Over-consumption of water notwithstanding.

Yes, it's not 'hippie friendly' - but many if not most of those concerned with renewable energy also desire environmentally friendly energy. This point is entirely relevant to the conversation.
The relative decline in power is not an accident; hydro is, as I stated before, 'renewable'. Nuclear, solar, and wind are much stronger options.

How do those "enormous" costs for hydro compare, on a per watt basis, to the environmental, human and capital costs of wind power generation?

Are any of the purportedly environmentalist opponents of hydropower even curious?

'hydroelectric is ‘renewable’.'

Only when there is enough precipitation - though this web site seems to believe that the market can solve the problem of empty reservoirs. Sort of an 'imagine precipitation' joke, but based on what was actually presented here by a professor of economics.

When we first started building hydropower dams, we did it to prevent flooding and provide a constant water supply for farmers. Electricity was a side benefit.

What 'we'? - water power has been used for centuries just about everywhere, and most of the original hydropower generation was in the form of hooking a generator up to an already existing mill race/wheel.

There is no question that larger projects happened later, on larger rivers, of course.

For a full data table, see the DOE's Energy Information Administration - http://www.eia.gov/totalenergy/data/monthly/pdf/sec7_5.pdf

Hydropower is up 170% from 1950 to 2013, but most of the growth happened by 1970 and little thereafter. Meanwhile, total demand is up 11x over the period, meaning coal, natgas, and nuclear had to make up the difference.

Wind and solar don't even make up 1% of electricity supply until 2008, but by Nov 2014 they were 7.5%. These newer technology renewables are showing explosive growth in market share, and this is a relatively recent phenomenon.
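A quick sketch of what those two growth rates imply for hydro's share of generation (multiples taken from the EIA figures quoted above):

```python
# Rough implication of the EIA figures above: hydro up 170% (a 2.7x
# multiple) while total demand grew 11x from 1950 to 2013, so hydro's
# share of generation must shrink by the ratio of the two multiples.
hydro_growth = 2.7     # 1950 -> 2013 multiple (up 170%)
demand_growth = 11.0   # 1950 -> 2013 multiple
share_ratio = hydro_growth / demand_growth
print(round(share_ratio, 2))  # hydro's share falls to ~1/4 its 1950 level
```

That is consistent with the original post's drop from a 32% renewable share toward the low teens.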

I have a question for the engineering types here. How much electricity is typically lost in transmission due to resistance and whatnot as a function of distance? I know that depends on what the wires are made of and the voltage, but generally speaking if you're trying to transmit electricity what percentage are you losing per mile?

Here's ERCOT's. Generally speaking, you're talking between 1 and 2 percent on transmission. I haven't looked at distribution in a while, but there's opportunity to reduce losses on the distribution side (see Volt/VAR optimization for an example).


It's way more than one percent. The EIA estimates it as at least 6%. And that is just for the long haul portion; the transformer in your computer may double that number.


How much is typically lost because I leave my computer on overnight?

In the winter, not much -- effectively zero, if you're heating with electricity anyway. We have baseboard heat in our cottage -- when it's cold, it doesn't matter at all if we leave computers, TVs, and lights on or not (waste heat is not wasted).

Yes and no. If you are in your bedroom sleeping, it doesn't help to be heating your home office a floor away. And unless you have a temperature zone controller in that office, adding extra heat won't cause your furnace to run less - you'll just wind up with a room that's too warm.

Depends.... Are you leaving it idle? Or is it running something in the background? Bitcoin generator? SETI at home? Anything like that?

Is it a laptop or a desktop? Separate monitor? Do you leave the monitor on too?

If your computer goes into power-saving or standby mode, its power consumption should be trivial: 1 to 10 watts. But if it's a powerful computer running applications while you sleep, it could be drawing hundreds of watts. Your monitor is probably consuming 40-60 watts while it's on.

Worst case - your computer is running and doing something, and your monitor is left on and doesn't go into energy saver mode. You might be drawing 300 watts, so over an 8 hour night you would be using 2.4 kWh. Over a typical month, about 72 kWh.

How much that costs depends on the state you are in. Hawaiians pay about $0.35/kWh. If you live there, leaving your computer running full speed for 8 hours a day would cost you an extra $25/mo or so. On the other hand, the mountain states pay more like $0.10/kWh for electricity, so leaving your computer running at night would cost you about $7/mo.

If it's on full standby and your monitor is in energy-saver mode, the energy consumed is trivial: less than 10 watts, less than $1/mo in energy costs.
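The worst-case arithmetic above can be sketched as follows. Watts are a rate, so energy is power times time; the 300 W draw and the two state rates are the assumptions taken from the comment, not measurements.

```python
# Sketch of the worst-case estimate above: computer plus monitor drawing
# a steady 300 W for 8 hours a night, 30 nights a month.
power_kw = 0.300          # assumed steady draw, in kW
hours_per_night = 8
nights_per_month = 30
kwh_per_month = power_kw * hours_per_night * nights_per_month  # 72 kWh

# Monthly cost at the two example rates from the comment ($/kWh).
for region, rate in [("Hawaii", 0.35), ("Mountain states", 0.10)]:
    print(region, round(kwh_per_month * rate, 2))  # dollars per month
```

Note the mountain-state figure comes out closer to $7/mo than $10/mo at a dime per kWh.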

Separate monitor? Do you leave the monitor on too?

If it's a CRT, that is probably more than the rest of the computer put together. Make sure that's going to sleep, or put it in your garage as a backup and treat yourself to a new LCD screen.

Energy losses per mile depend on the type of power transmission, the frequency, the amount of current, etc. Numbers can range from around 1% per 100 miles to more like 5% per 100 miles.

There are also transformer losses when the voltage is stepped up and down for transmission, losses involved if the energy has to be stored in batteries or some other type of reservoir of power, and other losses categorized as distribution losses as opposed to pure transmission losses.

These losses are significant, which is why it's important to locate your power generation as close to the consumer as possible.
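Assuming a fixed fractional loss per 100 miles (a simplification; real line losses depend on voltage, conductor, current, and load), the surviving fraction compounds with distance:

```python
# Toy model: fixed fractional loss per 100 miles, compounded over distance.
# The 1%-5% per 100 miles range is from the comment above; actual losses
# vary with voltage level, conductor type, and loading.
def remaining_fraction(miles, loss_per_100mi):
    """Fraction of power delivered after `miles` of line."""
    return (1 - loss_per_100mi) ** (miles / 100.0)

for loss in (0.01, 0.05):
    lost = 1 - remaining_fraction(500, loss)
    print(f"{loss:.0%} per 100 mi over 500 mi -> {lost:.1%} lost")
```

Over a 500-mile haul the two ends of that range give roughly 5% and 23% lost, which is why generation siting matters.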

Looks like the EIA shows it at about 5% for 2012, which is the most recent year on file. They don't break out T&D on that report though. The truth is it is somewhat difficult to build generation close to consumption (even distributed generation). The easier solution (and a necessary one if we're going to have high penetration of DG) is to improve distribution system design: improve power factor through use of capacitors, improve feeder load balancing, and perform better, more proactive system planning (sizing transformers correctly, having seasonal switching plans for load balancing). Of course none of this work is particularly headline grabbing and is tremendously technical, so don't expect to see a big news article on it except for gross simplifications about a 'smart grid'.


Thanks, that is what I was looking for.

Also thanks to the others below, but I was going for the amount lost as a function of distance, not the actual amount historically lost in the U.S.

In the 1940s, a lot of electricity consumption was going into making aluminum, such as for airplanes. They typically put the aluminum smelter near the hydroelectric dam.

6% according to the EIA

Some of this can be reduced by better use of grid assets. The technical term for this problem is AC Optimized Power Flow (ACOPF).

Wikipedia says, "...a 100 mile 765 kV line carrying 1000 MW of power can have losses of 1.1% to 0.5%. A 345 kV line carrying the same load across the same distance has losses of 4.2%." But it's one of these 'it all depends' type things. Generally electricity isn't transmitted more than 800 kilometers due to losses, but we totally do that in Australia. However, we are looking at reducing the size of our grid by taking remote communities off grid by using solar power and battery storage, as it is cheaper than maintaining those long power lines.
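Those Wikipedia numbers are roughly what simple resistive physics predicts: for a fixed power P and line resistance R, the current is I = P/V, so the I²R loss scales as 1/V². A back-of-envelope check (assuming identical conductors on both lines, which real lines won't have, hence the gap from the quoted 4.2%):

```python
def loss_ratio(v_high_kv, v_low_kv):
    """For fixed power P and resistance R, current I = P/V,
    so resistive loss I**2 * R scales as 1/V**2."""
    return (v_high_kv / v_low_kv) ** 2

ratio = loss_ratio(765, 345)
print(round(ratio, 2))        # 4.92
print(round(1.1 * ratio, 1))  # 5.4 -> predicted % at 345 kV, from 1.1% at 765 kV
```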

If the US only loses 6% of its electricity from transmission as mentioned, I would be a bit surprised. I thought it was about 7%. In Australia we lose over 7% due to transmission. Or we did. This figure may have improved slightly in recent years due to declining electricity use and improved transmission capacity. (A decidedly unfortunate combination as it turns out, for we now have far more transmission capacity than we actually require.)

In the Pacific Northwest, land of hydroelectricity, most of the low-hanging fruit was dammed before World War II. Our newfound love of wind power will take a while to catch up in scale.

I wonder how these figures compare with Canada, where most of the big dams were built more recently?

The US population has more than doubled since 1948 and is expected to double again (at least) over the next century.

The number of rivers with strong hydro electric potential hasn't increased at all...

There just aren't that many promising locations left to build new dams. If anything, the main focus has been on taking down smaller/older dams to allow rivers to flow more freely.

Yes, my figures about proportions of total electrical usage in the United States are approximate, because I read them off a graph.

What people say and what they do is often contradictory. Al Gore?

And here is a free market response to renewables in the U.S. by the sort of people that enjoy reading unsponsored Mercatus Center research -

"Three years ago, the nation’s top utility executives gathered at a Colorado resort to hear warnings about a grave new threat to operators of America’s electric grid: not superstorms or cyberattacks, but rooftop solar panels.

If demand for residential solar continued to soar, traditional utilities could soon face serious problems, from “declining retail sales” and a “loss of customers” to “potential obsolescence,” according to a presentation prepared for the group. “Industry must prepare an action plan to address the challenges,” it said.

The warning, delivered to a private meeting of the utility industry’s main trade association, became a call to arms for electricity providers in nearly every corner of the nation. Three years later, the industry and its fossil-fuel supporters are waging a determined campaign to stop a home-solar insurgency that is rattling the boardrooms of the country’s government-regulated electric monopolies." http://www.washingtonpost.com/national/health-science/utilities-sensing-threat-put-squeeze-on-booming-solar-roof-industry/2015/03/07/2d916f88-c1c9-11e4-ad5c-3b8ce89f1b89_story.html

When US rooftop solar installation prices reach Australia's average of about $1.61 US per watt, then, because many American homeowners have effective discount rates of about 4% or less, it will be able to produce electricity at under 10 cents a kilowatt-hour anywhere in the sunnier half of the country. And rooftop solar doesn't compete with the wholesale price of electricity, it competes with the much higher retail price. So it is very hard to believe that a large number of Americans won't take advantage of it. I understand they're not fond of leaving $20 notes lying on the street in that country.
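A rough levelized-cost calculation consistent with that claim (the 25-year panel life and 18% capacity factor are assumptions I'm plugging in, not figures from the comment):

```python
def lcoe_per_kwh(cost_per_watt, discount_rate, life_years, capacity_factor):
    """Levelized cost of energy: annualize the capital cost with a
    capital recovery factor, then divide by annual kWh per installed watt."""
    crf = discount_rate / (1 - (1 + discount_rate) ** -life_years)
    annual_kwh_per_watt = capacity_factor * 8760 / 1000
    return cost_per_watt * crf / annual_kwh_per_watt

# $1.61/W installed, 4% discount rate, 25-year life, 18% capacity factor
print(round(lcoe_per_kwh(1.61, 0.04, 25, 0.18), 3))  # ~0.065 $/kWh
```

which lands comfortably under the 10 cents a kilowatt-hour mentioned.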

prior_approval: "And here is a free market response..."

Then from the quote: "that is rattling the boardrooms of the country’s government-regulated electric monopolies."

So prior_approval, do you need some help understanding the phrase "free market"?
