Stunning Figures on Energy Efficiency

Over the past 60 years, the energy efficiency of ever-less expensive logic engines has improved by over one billion fold. No other machine of any kind has come remotely close to matching that throughout history.

Consider the implications even from 1980, the Apple II era. A single iPhone at 1980 energy-efficiency would require as much power as a Manhattan office building. Similarly, a single data center at circa 1980 efficiency would require as much power as the entire U.S. grid. But because of efficiency gains, the world today has billions of smartphones and thousands of datacenters.

From Mark Mills at Real Clear Energy.

Comments

Just imagine how much bigger the gains would have been had the industry been properly regulated!

Thread over. Don Pardo, tell Alistair what he has won!

The reality is that Google and all the high tech giants have moved their large computer facilities to the Pacific NW to get the cheap power BECAUSE their server farms use enormous amounts of power.

I am tired of winning. President Captain Bolsonaro has ordered artworks out of his official Palace. There is nothing the opposition can do to stop us. We are #1!!

Are computers just super easy to make efficient, or are producers of other goods not sufficiently motivated?

It's just an artifact of Moore's Law. As transistors become smaller and smaller, naturally you can fit more of them on a chip, so the chips become more powerful, and so does your computer (and phone, and network router, and everything else).

But the smaller transistors also require less power to operate. And without getting too deeply into it, the smaller power swings inside the chip allow the chip to operate faster, which has encouraged everyone to design chips that use as little voltage as possible -- for speed reasons much more than for environmental reasons.

And finally, with so many transistors available to fit on one chip, you can now very easily add power-saving logic to surround your actual logic. This would be a second circuit that determines when your first circuit isn't needed, and powers it off. Sort of like a motion-sensing light that shuts off when you leave a room.
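To put rough numbers on that: the energy burned each time a transistor switches goes roughly as capacitance times voltage squared, so shrinking both pays off twice over. A back-of-envelope sketch in Python, with purely illustrative (made-up) capacitance and voltage figures rather than real process-node data:

```python
# Back-of-envelope: dynamic switching energy per transistor, E ~ C * V^2.
# The capacitance and voltage values below are illustrative guesses,
# not measurements of any real process node.

def switching_energy_joules(capacitance_farads, supply_volts):
    """Energy dissipated charging and discharging a gate once: E = C * V^2."""
    return capacitance_farads * supply_volts ** 2

# Hypothetical "older" node: larger gate capacitance, higher supply voltage.
old = switching_energy_joules(capacitance_farads=10e-15, supply_volts=5.0)

# Hypothetical "newer" node: smaller gate, lower voltage.
new = switching_energy_joules(capacitance_farads=0.1e-15, supply_volts=0.8)

print(f"older node: {old:.2e} J per switch")
print(f"newer node: {new:.2e} J per switch")
print(f"improvement: roughly {old / new:.0f}x per switching event")
```

Multiply that by billions of transistors switching billions of times a second and it's clear why designers chase every last millivolt, and why the power-gating circuits described above are worth spending extra transistors on.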

It's a function of the laws of physics that govern the production of the goods. Physical engines are governed by mechanical and electromagnetic properties that are currently very difficult to optimize. Computational engines are governed by information theory, and some physical limits, but are much lower down the optimization curve.

Another big difference: computational engines use computational engines to design and build new ones. So as the last generation gets more efficient, the rate of progress increases. In the 1950s, circuits were designed by hand. In the '80s, we used CAD. Now we use machine-learning-based solvers to lay out circuits (better than any human can).

How many billions of these billions of manufactured cellphones already populate some of the world's busiest landfills?

How are Apple, et al., managing the reclamation and recycling of all those nifty keen components and gadgets? (What IS the proportion today of active cellphones vs. dead and defunct cellphones? Surely the latter soon must outnumber the former . . . .)

(By the way: where IS that long-promised, long-awaited colonoscopy app? I scan the headlines eagerly each day to learn of its advent.)

Really? Do some people throw away their old phones?

I would think this is mainly a threat to the world's desk drawer and junk storage box capacity.

"How many billions of these billions of manufactured cellphones already populate some of the world's busiest landfills?"

Fewer than you might think. There is a large global secondary market and, after consumer use, a rather well-developed e-waste and reclamation industry that few in the US or EU ever see. That said, it is extremely environmentally unfriendly.

"How are Apple, et al., managing the reclamation and recycling of all those nifty keen components and gadgets?"

They're not really. Most of the reclamation efforts that do exist are the good-ol' free-market niche-filling variety. They do get dinged quite frequently, however, on environmental concerns related to this activity, which is hardly within their purview to control. Also, I assure you the ratio of active to dead cellphones skews heavily toward active...likely by an order of magnitude.

"where IS that long-promised, long-awaited colonoscopy app?"

It was unnecessary. Due to other medical information about you and information gleaned from your other apps, the industry already knows intricately the health of your colon, as well as the date of your death. This information has already been auctioned to the highest bidder, and they're waiting for the right time to release said app in child-friendly form as a game for an extra profitability boost. Sheryl Sandberg said the current iteration of the colonoscopy app just didn't scale.

Stuff that lands in landfills is usually pretty benign. A properly constructed landfill more or less prevents decay, and what it can't prevent it contains within the landfill, making landfills pretty environmentally friendly (outside the local environment, which obviously is completely obliterated).

This is a fair question, though. Electronic gadgets use some fairly rare materials. At what point does it become economically viable to mine old landfills?

In most developing countries they do mine old landfills. If we gave access to first world landfills to workers willing to work for less than $2 an hour, I have no doubt those miners would prefer those first world landfills.

It's not the iPhone that uses lots of energy; it's the sprawling infrastructure the iPhone connects to, which uses, and often wastes, massive amounts of energy.

If a smartphone allows you to accomplish a task remotely and save getting in the car and driving somewhere even once per year, it's probably a net energy saver.
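Rough arithmetic backs this up. A quick sketch (Python), using approximate figures good only to an order of magnitude: gasoline at about 33.7 kWh per gallon, a roughly 12 Wh phone battery charged daily, and an assumed 25 mpg car making a 10-mile round trip:

```python
# Rough comparison: one avoided 10-mile car errand vs. a year of phone charging.
# All figures are approximate, order-of-magnitude inputs.

GASOLINE_KWH_PER_GALLON = 33.7   # approximate energy content of a gallon of gas
CAR_MPG = 25                     # assumed fuel economy
TRIP_MILES = 10                  # assumed round-trip errand

PHONE_BATTERY_KWH = 0.012        # ~12 Wh battery, typical for a smartphone
CHARGES_PER_YEAR = 365           # assume one full charge per day

trip_energy_kwh = (TRIP_MILES / CAR_MPG) * GASOLINE_KWH_PER_GALLON
year_of_charging_kwh = PHONE_BATTERY_KWH * CHARGES_PER_YEAR

print(f"one {TRIP_MILES}-mile car trip:  ~{trip_energy_kwh:.1f} kWh")
print(f"one year of phone charging: ~{year_of_charging_kwh:.1f} kWh")
```

That leaves out the network and data-center energy behind the phone, which is exactly the parent comment's point, but the gap on the direct-use side is wide.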

A very important point. As a nation our infrastructure is in horrible shape. This includes our power grid, which simply wasn't built to handle the loads we're putting on it these days. I've worked on a number of energy projects (solar, wind, natural gas, transmission lines, even experimental energy storage methods), and all the folks I've spoken with in the energy industry are very worried. We are increasingly relying on electricity, but construction of new transmission lines and upgrading of old ones isn't keeping pace.

Sprawling infrastructure like.........wireless communications? I'd say antennas are a more efficient use of resources than cables and pipes.

Largely depends on where you are. In mountainous regions cables can be more efficient, because of the sheer number of antennas necessary to maintain coverage. Plus, reliability can be an issue--it's easier to scatter radio waves than to scatter telephone wires.

By the by, and to the point of considering energy efficiency: let us think for at least two moments of JUST HOW MUCH cellphone drivers are MURDERING the energy efficiency of automotive traffic with their stupid driving at speeds well below posted limits, inordinately long pauses at traffic lights, inattention to moving traffic, et cetera.

How are these captured "energy efficiencies" percolating through the economy these days?

The optimum traffic speed for energy efficiency is, like, 50 mph. So they are probably helping.

Of course, if they make people like you really angry, and you subsequently drive like maniacs, that probably cancels out some of the gains.

Observing cellphone idiots, lunatics, and stooges parking their cars fifteen or twenty miles an hour below posted limits while INSISTING upon keeping their phone booths rolling in left lanes designated for higher speeding traffic does not enhance my composure most days, no, at least not on those days when I'm obliged to be out in my car.

I have been observing poor cellphone driving practices for, what, only twenty or twenty-five years?????

I can hardly begin to imagine ALL the efficiencies of gasoline consumption that we and our struggling planet have benefitted from in this era of enhanced "energy efficiency".

left lanes designated for higher speeding traffic

In reality, the left lane is for through traffic, the right lane is for local traffic. When approaching an interchange, if a driver is exiting the highway he should be in the right lane, if not, the left. Traffic speed isn't the primary consideration.

In reality, I don't know that any economist has dared calculate how much inefficiency in gasoline and motor fuel consumption has been fostered by the advent of cellphone driving over the past quarter-century or so. From several anecdotal perspectives, cellphone driving looks to contribute heavily to wasted motor fuel each and every day, no matter the locality, the posted speeds, or any traffic signage or signals. Which leads to questions of how any claimed micro-circuitry efficiencies stack up against the staggering waste of time and resources perpetrated by our inefficiently multitasking public.

And even with that billion fold increase, opening your e-mail client and typing in it takes about the same amount of time today as it did in 1998.

The same applies to word processors, and a number of other types of software. Almost as if energy efficiency is not really all that relevant a measure.

And this is just silly - 'Today, humanity fabricates 1,000 times more transistors annually than the entire world grows grains of wheat and rice combined.' Strangely, the size of wheat and rice grains has remained unchanged, while transistors have shrunk dramatically, if not a billionfold. In other words, the rice and wheat produced, measured by the ton, still likely outweighs all the transistors produced by a not precisely trivial factor.
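For what it's worth, here's a back-of-envelope sketch (Python) of just how lopsided the mass comparison is. Every input is an assumed, order-of-magnitude figure, and the transistor count simply takes the quoted article at its word:

```python
# Rough check of the grain-vs-transistor comparison. All inputs are
# order-of-magnitude assumptions, not official statistics.

WHEAT_TONNES_PER_YEAR = 750e6    # assumed annual world wheat production
RICE_TONNES_PER_YEAR = 500e6     # assumed annual world rice production
GRAMS_PER_WHEAT_GRAIN = 0.035    # assumed average single-grain mass
GRAMS_PER_RICE_GRAIN = 0.025     # assumed average single-grain mass

wheat_grains = WHEAT_TONNES_PER_YEAR * 1e6 / GRAMS_PER_WHEAT_GRAIN
rice_grains = RICE_TONNES_PER_YEAR * 1e6 / GRAMS_PER_RICE_GRAIN
total_grains = wheat_grains + rice_grains

# Take the quoted article's claim at face value: 1,000 transistors per grain.
transistors = 1000 * total_grains

total_grain_mass_kg = (WHEAT_TONNES_PER_YEAR + RICE_TONNES_PER_YEAR) * 1000

# How heavy would each transistor need to be to match the harvest by mass?
breakeven_kg_per_transistor = total_grain_mass_kg / transistors

print(f"grains grown per year:       ~{total_grains:.1e}")
print(f"transistors (article claim): ~{transistors:.1e}")
print(f"break-even transistor mass:  ~{breakeven_kg_per_transistor:.1e} kg each")
```

A real transistor is a patch of silicon nanometres across, nowhere near tens of micrograms each, so by weight the harvest wins by many orders of magnitude.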

Yes, most of the gains have gone into code inefficiency. But it is much easier today to write code that does moderately sophisticated things.

.... but the relevant compute unit isn't weight of transistors, so..... I guess the apples to apples comparison would be calorie load of rice and wheat (which is only approximated by weight), but measuring the weight of transistors is completely irrelevant.

'but the relevant compute unit isn't weight of transistors'

There is no relevant 'compute unit' involving agricultural products compared to transistors.

What has simply happened is that the size of transistors has shrunk immensely. To the point that currently, it is not really possible to imagine shrinking them much further, due to various problems related to that small size. As noted here, from a couple of years ago - https://arstechnica.com/gadgets/2016/07/itrs-roadmap-2021-moores-law/

HAS the size of wheat/rice remained the same? My assumption is that it has not. I know that hybridization and other manipulation of cereal crops have had a profound effect on them--lowering nutrient density of the stalk, decreasing size of the stalk, shifting how the grain ripens, that sort of thing. Modern wheat would be as unrecognizable to a Victorian era farmer as modern construction methods would be. Add in genetic engineering, and I expect that the pace at which cereal crop technological advancement is occurring is accelerating at a pretty good clip.

Interesting question about the size of a grain. Probably around the same is the best answer, as there are a lot of variables. Rice being less changed is a bit more likely.

Given the amount of change in cereal crops over the past hundred years or so, I'm not as comfortable saying "probably about the same". Certainly the amount of edible food produced per plant has gone up dramatically; it's just a question of how that happened. Other plants have clearly increased in size--watermelons and bananas are nearly unrecognizable from their original forms, for example. Same with Red Delicious apples.

"And even with that billion fold increase, opening your e-mail client and typing in it takes about the same amount of time today as it did in 1998."

Not for me. The switch to SSD means my computer boots up in a tiny fraction of the time my 1998 machine took, and opening MS Word (on the rare instances I do that anymore) takes under a second. And, of course, my current machine does things my 1998 machine couldn't begin to handle (for example, editing 20MP digital images and processing HD digital video).

It takes about as long to open my computer now as it did back in the late 1990s. That said, capacity has skyrocketed. The security programs alone on my work computer would have killed my first computer.

This has real applications in science. Calculations that used to require supercomputers can now be done on personal computers, allowing MUCH more research to be conducted. When I was in college it was routine for computers to be taken up with certain calculations; this still happens, but the calculations have gone from "PCA for these dozen fossil sites" to "Let's simulate the planet"!

'The switch to SSD means my computer boots up'

Which is something else.

'and opening MS Word (on the rare instances I do that anymore)'

Very likely because it was pre-loaded, which is an example of energy inefficiency being traded off in exchange for a faster response time.

'And, of course, my current machine does things my 1998 machine couldn't begin to handle'

Nobody is disputing that progress has occurred, but one of the systems I used in the late 1980s could do that (leaving aside HD as something that came later) - at a cost of something like $10,000.

An Amiga with a Video Toaster would do it for something under $2,000 in the early 1990s, however. https://en.wikipedia.org/wiki/Video_Toaster

Many modern devices, taking advantage of Moore's Law, can now achieve the sort of actual performance that was available 25 years ago, using many fewer transistors - and power.

BTW, this post from Alex isn’t really about cellphone efficiency; he’s a paid consultant to some blockchain startup now. What the post is really about is “don’t worry about how energy-inefficient today’s blockchain schemes are, because computers just keep getting ridiculously more energy efficient.”

Imagine--in the old days Bitcoin would have used enough energy to power a billion countries the size of Iceland. Using the electricity of just one small country for a useless purpose isn't that bad, if you think about it.

I'm convinced. Buy! Buy! Buy!

Essentially what Tabarrok’s post is about. Gotta earn his “consulting” fees.

You really need to remember to take your meds.

Merry Christmas. The heretics at the Niskanen Center have issued their manifesto denouncing the libertarian ideology and proposing a program for maximizing human freedom that relies on real-world facts: https://niskanencenter.org/wp-content/uploads/2018/12/Niskanen-vision-paper-final-PDF.pdf

I think these figures are wrong. What they are doing is comparing apples (or Apples, sic) to oranges. The device from 1980 does not need billions of transistors to work, unlike today's device, so you cannot extrapolate from back then to now (or, vice versa, from now back to then). Back then there was limited functionality: the phone worked to deliver voice, but it did not have a calendar or Siri or whatever other functions, which were handled by either another dedicated machine or thumbing through the Yellow Pages. But +1 to the site for mentioning the Jevons Coal Paradox.

Bonus trivia: Intel has hit a snag in its "tick-tock" cycle with the 10 nm process node: ominous? But they are coming out with a new architecture next year, which some say is less than revolutionary; time will tell.

Miss Philippines crowned (tiaraed? sashed?) Miss Universe. Congrats Ray.

data centers already existed in the 80s..... or was a human teller hiding inside every ATM?

You forgot to mention that your family is wealthy.

Considering what the US accomplished before 1980 vs. after, how important is this energy efficiency? Would our inventive efforts have been better spent on other avenues?

Since the energy efficiency was necessary to allow Moore's law to continue, probably not.

Without energy efficiency gains, processors would be too hot to cool effectively and too hot to function. Heat is a major design constraint for successful microchip functionality.

I mean, this is impressive, but it's pretty much just a rephrasing of Moore's law. Gate capacitance scales linearly with transistor size. So if an integrated circuit doubles its density every 18 months, then energy usage per FLOP will basically halve in the same period.
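Spelled out, under idealized (Dennard-style) scaling assumptions that modern nodes only loosely follow, the argument looks like this:

```latex
% Idealized (Dennard-style) scaling sketch; real nodes deviate, since
% supply voltage no longer scales down this cleanly.
% Let the linear feature size scale by a factor k < 1 each generation.
\begin{align*}
  C &\propto k                              && \text{gate capacitance shrinks with the transistor} \\
  V &\propto k                              && \text{supply voltage scales down with it (idealized)} \\
  E_{\text{switch}} &= C V^2 \propto k^{3}  && \text{energy per switching event} \\
  \text{density} &\propto 1/k^{2}           && \text{transistors per unit area}
\end{align*}
% A density doubling means k^2 = 1/2, so E_switch scales by
% k^3 = 2^{-3/2}, roughly 0.35, under these assumptions; in practice
% voltage scales more slowly than k, so "basically halves" is about right.
```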
