Over time, technological progress yields products that use less energy, are more compact and consume fewer raw materials. Does this add up to enough resource savings to make life on Earth more sustainable? A new study from MIT has examined modern technology through the lens of a 150-year-old economic theory of efficiency and resource consumption – and in almost all cases, the benefits of reducing required resources are cancelled out by the increase in consumer demand for them.

The idea that consumer demand will always outpace gains in efficiency is far from new. It dates back to 1865, when improvements to steam engines reduced the amount of coal they required, and conventional wisdom held that overall coal consumption should, logically, go down. But to the contrary, economist William Stanley Jevons suggested that the resulting increase in consumer demand would instead drive overall coal consumption up. He didn't believe that the progress of technology alone could be counted on for society to "dematerialize," or reduce the amount of resources consumed. The effect became known as Jevons' Paradox.

Dematerialization is an important ecological consideration, and the MIT team wanted to test how well some of today's products were tracking. Transistors, for example, have improved to a dizzying degree over the past few decades, thanks to steady improvements to the silicon-based semiconductors at their core. But rather than decrease worldwide use of silicon, smaller and more efficient transistors instead increased the number of devices that could be created, like smartphones and tablets, and silicon consumption exploded, up 345 percent since the 1970s.

"Despite how fast technology is racing, there's actually more silicon used today, because we now just put more stuff on, like movies, and photos, and things we couldn't even think of 20 years ago," says Christopher Magee, one of the study's authors. "So we're still using a little more material all the time."

To study the effects of Jevons' Paradox, the team developed a model that weighs the opposing forces of dematerialization and consumer demand, then applied it to a series of products, services, materials and components to determine whether resource consumption was increasing or decreasing for each of those goods.

The team's equation takes into account variables like population and economic growth over time, how regularly a given product advances technologically, and how much consumer demand for a product varies in relation to its price. They fed data from 57 different resources into their model, including chemicals like ammonia, formaldehyde, styrene and polyester fiber; technologies like transistors, hard drives, laser diodes and devices that capture energy from solar and wind sources; and commodities like crude oil and aluminum.
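The study's actual equation isn't reproduced here, but the trade-off it captures can be sketched in a few lines. In this simplified, hypothetical model (the parameter names and values are illustrative assumptions, not the MIT team's), total resource use grows with background demand and with the extra demand that cheaper, more efficient products stimulate, and shrinks as each unit needs less material:

```python
# Illustrative sketch of the Jevons-style trade-off described above.
# All parameters and values are hypothetical, not the published MIT model.

def net_resource_growth(efficiency_gain, background_growth, price_elasticity):
    """Approximate the annual growth rate of total resource consumption.

    efficiency_gain:   yearly fractional drop in material needed per unit (e.g. 0.05)
    background_growth: yearly demand growth from population and the economy (e.g. 0.02)
    price_elasticity:  how strongly demand responds to the price drop that
                       efficiency makes possible (values > 1 mean elastic demand)
    """
    # Cheaper, better products stimulate extra demand in proportion to elasticity.
    demand_growth = background_growth + price_elasticity * efficiency_gain
    # Dematerialization occurs only if efficiency outpaces total demand growth.
    return demand_growth - efficiency_gain

# Elastic demand: the efficiency gain is swamped, and resource use keeps rising.
print(net_resource_growth(0.05, 0.02, 1.2) > 0)  # True -- Jevons' Paradox
# Only inelastic demand with no background growth lets consumption actually fall.
print(net_resource_growth(0.05, 0.00, 0.5) < 0)  # True -- net decline
```

The sketch makes the paradox's logic visible: whenever elasticity is at or above 1, the demand term grows at least as fast as efficiency improves, so technology alone can never push net consumption down.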

Although almost all of those materials had been subject to technological advances made over the years, the researchers couldn't find one instance of true dematerialization. Wool came close, with its use dropping significantly in recent years, but the researchers argue that it isn't a result of improved efficiency of wool products, but the substitution of other materials, like nylon and polyester.

Other materials declined in use for other reasons: things like asbestos and thallium are no longer used widely, but that's more a result of discovered health issues and subsequent government intervention, rather than any advances in technology. The researchers conclude that, as Jevons suggested, technology alone isn't enough to put us on the path to a sustainable future.

"What it's going to take is much more difficult than just letting technological change do it," says Magee. "Social and cultural change, people talking to each other, cooperating, might do it. That's not the way we're going right now, but that doesn't mean we can't do it."

The research is published in the journal Technological Forecasting and Social Change.

Source: MIT