Time to Worry

August 5, 2012

A universal law states that popular-press depictions will always rile up academics who inhabit the particular scientific village under scrutiny. My number came up a few times over the last year when a handful of articles on computers’ energy efficiency made the nerd-press rounds—like these two from the Technology Review and, from the Atlantic, one under the headline “If a MacBook Air Were as Inefficient as a 1991 Computer, the Battery Would Last 2.5 Seconds.”

And like any good bottom-dwelling researcher, I’m just chuffed to see the rest of the world acknowledging that my chosen topic might be worth humanity’s collective time. The gains in computational energy efficiency we’ve made over the decades have been crucial to making devices better and more useful. And these articles rightly challenge the notion that performance is the only lens through which to view the history of computation.

But, again like any self-absorbed academic, I have a bone to pick with the simplified, optimistic messages espoused by these stories. The gist of each article is that the efficiency trend is bound to continue, bringing yearly advances in capabilities, portability, and apple pie. One even gives the trend its own highfalutin’ name: “Koomey’s law.” The casual nerd reader is led to believe that all-important energy trends will continue, leading to new devices—just around the corner—that will make the MacBook Air look like an SUV.

The truth is less rosy. You might even call it dire. The trend of consistent year-over-year energy-efficiency improvements is in the process of falling off a cliff. For real-world evidence, look no further than the “Turbo Boost” swing on Intel’s latest mobile processors: a part that’s capable of running at 3.3 GHz, say, is forced by thermal and power constraints to run at 2.3 GHz most of the time. The chips can meet their full potential only in short bursts without overheating or killing your laptop’s battery. A different popular-press piece, this one in the New York Times, summarizes the more pessimistic (or perhaps just realistic) take via work by one of my compatriots at UW, Hadi Esmaeilzadeh, and others. In short, the forces that brought us consistent efficiency improvements over the decades are, as you read this, starting to abandon us.
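
For a rough sense of why a chip can’t just hold its boost clock all day, a back-of-envelope sketch helps. The snippet below leans on the classic CMOS dynamic-power relation (power scales with voltage squared times frequency) plus the common approximation that supply voltage scales roughly with frequency under DVFS, so power grows roughly with the cube of the clock rate. The 3.3 GHz and 2.3 GHz figures are just the illustrative numbers from above, and the cube law is a crude approximation, not anything from Intel’s spec sheets.

```python
# Back-of-envelope sketch: why sustained Turbo Boost is thermally infeasible.
# Assumes CMOS dynamic power P ~ C * V^2 * f, with voltage scaling roughly
# linearly with frequency under DVFS, so P grows roughly as f^3.
# (Illustrative numbers only; the cube law is an approximation.)

base_freq = 2.3   # GHz: the clock the chip can sustain within its power budget
turbo_freq = 3.3  # GHz: the short-burst "Turbo Boost" clock

# Relative dynamic power if the chip tried to hold the turbo clock full-time.
relative_power = (turbo_freq / base_freq) ** 3

print(f"Clock speedup:  {turbo_freq / base_freq:.2f}x")
print(f"Power (approx): {relative_power:.1f}x the base-clock draw")
# Roughly 1.4x the clock for about 3x the power and heat: fine for a burst,
# ruinous for battery life and cooling if sustained.
```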

This is not to say that we, the computing community, will never again be able to make computers more efficient. When we’re not writing boring blog posts on the subject, we’re looking for radical new ways to move efficiency forward. But we can no longer rely on silicon technology improvements to bring us consistent benefits “for free.” We need to find smarter, better ways to take computers into the energy-constrained future.