We all know that our computers are pretty bad for the environment. They're not easy to dispose of when they become obsolete, and they contain plenty of hazardous materials such as lead and other carcinogens and toxins. But this remarkable essay published by Midrange Server, Lean, Mean Green Machines, focuses on one particular point: the electricity we waste, along with the other resources, such as coal, used to produce it. This waste costs us money and damages our environment at the same time. Timothy Prickett Morgan, the author, estimates that $213 billion is wasted outright in a single year because of poor utilization rates. You might disagree with his number, but his arguments are worth listening to. He also points out how grid computing could save electricity and reduce waste by increasing the average utilization of our machines, and he even imagines a world where rich countries would sell their excess computing capacity to developing nations at a fraction of the cost. It's really an eye-opening article.
Here are some selected excerpts (emphasis is mine).
IT professionals usually don't like to mix politics with shop talk, but for this essay, such mixing is not only necessary, it is exactly the point. That electricity that we waste on our computers is not free. Every time we have a blackout or a brownout on the antiquated electrical grids in the United States or Europe, which were not designed to handle such loads as modern conveniences require, we pay in lost business and disruptions to our lives. Every week, month, and year, the tax man takes some of our salaries so the various militaries of the West can secure oil supplies in the Middle East, still used to generate a small portion of our electricity. To be fair, oil is used mostly for transportation, but only because we have so many cars. If we had fewer cars and mass transportation in Western economies, we'd be burning oil, which is cheaper and cleaner than coal, to make electricity. Since the OPEC oil embargo in 1974, Western economies have adopted coal as a fuel for generating electricity, which is cheaper and more readily available in the West, but it is also dirtier, so we pay by having a more polluted environment.
After this introduction, let's review some numbers.
Five years ago, Peter Huber and Mark Mills in Forbes figured that a PC required about 1,000 watts of power to operate (and this was using 1999's slower chips and smaller screens). At the time, the average home Internet user was online about 12 hours a week, which worked out to 624 kilowatt-hours a year. If you assume that Internet and PC use was up in the past five years, you're probably talking about 1,000 kilowatt-hours per PC. Back in 1999, consumers in the United States accounted for about 50 million PCs, with the remainder being business PCs. The ratio is probably not 1:4 consumer-to-business PCs, as it was in 1999, but is probably closer to 1:2. That ratio is important because business PCs run 40 or more hours a week instead of a dozen. That means that a business PC could be using as much as 2,000 kilowatt-hours to operate a year. If you extrapolate these ratios and power consumptions worldwide, that's 250 billion kilowatt-hours for home PCs and 1 trillion kilowatt-hours for business PC users. You heard that right: 1.25 trillion kilowatt-hours a year.
And if you use your PC or server at only 5 or 10 percent of its total CPU capacity, the remaining 95 or 90 percent of that capacity is wasted while still consuming electricity.
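To make the arithmetic concrete, here is a quick back-of-the-envelope check in Python (a rough sketch only; the wattage, weekly hours, and kilowatt-hour figures are the article's own estimates, and the worldwide PC counts are simply what its totals imply, not numbers it states):

# Back-of-the-envelope check of the article's per-PC energy estimates.
# All inputs are the article's assumptions, not measured values.
pc_power_kw = 1.0             # roughly 1,000 watts per running PC (the 1999 estimate)
home_hours_per_week = 12      # average home Internet use in 1999
business_hours_per_week = 40  # a business PC running a full work week
weeks_per_year = 52

home_kwh_1999 = pc_power_kw * home_hours_per_week * weeks_per_year
business_kwh = pc_power_kw * business_hours_per_week * weeks_per_year
print(home_kwh_1999)   # 624 kWh a year, the figure quoted for 1999
print(business_kwh)    # 2,080 kWh a year, roughly the "2,000 kilowatt-hours" business figure

# Worldwide totals: about 250 million home PCs at ~1,000 kWh each (today's heavier use)
# and 500 million business PCs at ~2,000 kWh each, i.e. the 1:2 ratio mentioned above.
home_pcs = 250e6
business_pcs = 500e6
total_kwh = home_pcs * 1000 + business_pcs * 2000
print(total_kwh)       # 1.25 trillion kWh a year, the article's headline total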
To be sure, there is a lot of wiggle in these numbers. But the magnitudes in these estimates are real. While PCs and servers have come down dramatically in price, their inefficient design and over-powered components make them very costly tools. Assume that electricity costs 10 cents per kilowatt-hour (a very generous price, at least based on the 15 cents I pay in New York City). Electricity prices range from 5 cents to 15 cents per kilowatt-hour in the U.S. on average, and can be double or triple that in Europe and Asia. Also assuming a 1.6 ratio for total power consumption (which server blade maker RLX Technologies uses to calculate the secondary costs for air conditioning), the world's PCs and servers together consume 2.5 trillion kilowatt-hours of energy (for themselves and for related environmentals) in a year, or $250 billion in hard, cold cash a year. Assuming that a server or PC is only used to do real work about 15 percent of the time, that means about $213 billion of that was absolutely wasted. If you were fair and added in the cost of coal mining, nuclear power plant maintenance and disposal of nuclear wastes, and pollution caused by electricity generation, these numbers would explode further.
This makes you think, even if these numbers are hard to verify.
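For what it's worth, the dollar figures can be checked the same way (again just a sketch; the kilowatt-hour price, the 1.6 cooling ratio, and the 15 percent utilization are all the article's assumptions, and note that 1.25 trillion kWh times 1.6 comes to 2.0 trillion, a bit below the 2.5 trillion the article quotes):

# Rough check of the article's cost and waste figures.
kwh_pcs_and_servers = 1.25e12  # from the per-PC estimate above
cooling_ratio = 1.6            # RLX Technologies' ratio for air conditioning and other overhead
price_per_kwh = 0.10           # dollars; the article's assumed average price

kwh_with_overhead = kwh_pcs_and_servers * cooling_ratio
print(kwh_with_overhead)       # 2.0 trillion kWh with these inputs (the article cites 2.5 trillion)

annual_cost = 2.5e12 * price_per_kwh
print(annual_cost)             # 250 billion dollars a year

useful_fraction = 0.15         # machines doing "real work" only 15 percent of the time
wasted = annual_cost * (1 - useful_fraction)
print(wasted)                  # 212.5 billion dollars, i.e. the "about $213 billion" wasted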
Even if the estimates for the base $250 billion in electricity costs relating directly to computers were off by an order of magnitude -- which I do not think they are -- there would still be a tremendous motive in trying to make computers more efficient. The estimates above were just for PCs and servers alone. Now add in storage arrays, printers, routers, switches, hubs, embedded controllers in factories and buildings, and countless other computers that are weaving their way into all kinds of devices and places where you wouldn't have dreamed of putting a computer before. Do the economics on all of these computers now.
How many power plants could be closed if PCs and servers were not designed to maximize profits for Intel, Microsoft, Hewlett-Packard, IBM, Dell, Sun Microsystems, and others, but to minimize energy use while maintaining performance at adequate levels? How much energy and money could be saved if machines were designed from the get-go to be misers and only used juice when they needed it? How much longer would battery life be if the engineering of electronic devices tried to minimize their power profile ahead of all other features and functions? These are good questions that are well worth asking. People are beginning to ask them.
The author then looks at possible solutions to reduce this enormous waste, such as lower-power processors like the ones produced by Transmeta or VIA Technologies. He also thinks that grid computing and virtualization are 'green.'
There's been a lot of buzz around grid computing in the past few years. Every PC sold today should come with vendor-authorized grid software installed that allows end users to pick the research organizations and charities to which they can donate their excess computing power for free; alternatively, vendors should be compelled to set up an open CPU cycle exchange that would allow end users to sell their excess capacity on an open market. If most PCs and servers are running full-out and do not have sophisticated power management features, something useful should be done with all that excess capacity.
Imagine if rich Western nations could sell their excess computing capacity to developing nations at a fraction of the cost of actually having these nations invest in their own IT infrastructure to do sophisticated number crunching. Provided that countries did not engage in the development of weapons, giving this computing capacity away or charging a modest fee for it would be the decent thing to do. This approach would probably not make IT vendors happy, since they are relying on developing economies for revenue and profit growth, and almost by definition their unhappiness with such an idea would only serve to prove what a good one it is.
After reading these excerpts, or the full original article, I bet you'll power down your computer when you don't need it -- at least for a few days.
And for more information about other kinds of pollution created by our computers, read this earlier story, China Serves As Dump Site For Computers.
Source: Timothy Prickett Morgan, Midrange Server, February 23, 2004