Brilliance At A Bargain


Dirk Hanson explores the economics of artificial light:

Moore’s Law, a prediction made by Intel co-founder Gordon Moore in 1965, says that the number of transistors packed on a chip will double every 18 to 24 months. More than half a century later, Moore’s Law still holds, although many experts believe it will run its course in a few more years. The lighting field has its own Moore’s Law, an LED counterpart called Haitz’s Law. In 2000, Dr. Roland Haitz, then with Agilent Technologies, predicted that the cost of LED lighting would fall by a factor of 10 per decade, while “flux per lamp” (what we call brilliance or luminosity) would increase by a factor of 20 per decade. How long that trend will continue is also a matter of intense debate, but solid-state lighting (SSL) technology is based on semiconductor components, so the technology price fix is in, at least for now, and lighting is likely to keep getting cheaper.
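The arithmetic behind Haitz’s Law compounds quickly. A minimal sketch, with illustrative starting values (the $1.00-per-lumen and 100-lumen figures are assumptions for the example, not data from the article):

```python
# Haitz's Law, roughly: per decade, cost per lumen falls ~10x and
# flux per lamp rises ~20x. Starting values below are illustrative.

def haitz_projection(cost_per_lumen, flux_per_lamp, decades):
    """Project LED cost and flux forward by whole decades."""
    for _ in range(decades):
        cost_per_lumen /= 10.0   # cost falls by a factor of 10 per decade
        flux_per_lamp *= 20.0    # flux rises by a factor of 20 per decade
    return cost_per_lumen, flux_per_lamp

# Starting from a notional $1.00 per lumen and 100 lumens per lamp,
# two decades of the trend give $0.01 per lumen and 40,000 lumens.
cost, flux = haitz_projection(1.00, 100.0, decades=2)
print(cost, flux)  # 0.01 40000.0
```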

As prices fall, our use of light climbs in exact proportion. For several years now, physicist Jeff Tsao at Sandia National Laboratories has been digging into the economic cost-benefit ratios of artificial lighting. Analyzing data sets spanning three centuries and six continents, Tsao and his coworkers at Sandia have concluded that “the result of increases in luminous efficacy has been an increase in demand for energy used for lighting that nearly exactly offsets the efficiency gains—essentially a 100% rebound in energy use.” The Sandia group’s equations aren’t holy writ, but with remarkable consistency, human beings, when offered a cheaper and more efficient lighting technology, simply use more of it.
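The 100% rebound claim can be sketched in a few lines: energy use for lighting is demanded lumen-hours divided by luminous efficacy, so if demand scales up in exact proportion to the efficacy gain, energy use does not move. The efficacy figures and the 4x improvement below are illustrative assumptions, not Sandia’s data:

```python
# Sketch of a 100% rebound: an efficacy gain that is fully offset by
# a proportional rise in demand leaves lighting energy use unchanged.

def lighting_energy(lumen_hours_demanded, efficacy_lm_per_w):
    """Watt-hours needed to deliver the demanded lumen-hours."""
    return lumen_hours_demanded / efficacy_lm_per_w

# Baseline: 1000 lumen-hours demanded at 15 lm/W (illustrative numbers).
baseline = lighting_energy(lumen_hours_demanded=1000.0, efficacy_lm_per_w=15.0)

# Efficacy improves 4x; with a 100% rebound, demand also rises 4x.
rebound = lighting_energy(lumen_hours_demanded=4000.0, efficacy_lm_per_w=60.0)

print(baseline == rebound)  # True: the efficiency gain is fully offset
```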

(Image of close-up of LED lightbulb by Matt Barber)