If you thought computer hardware and software were expensive, consider that the electric bill for a data center can
eat up 25 to 44 percent of its budget. That raises concerns not just
about the bottom line, but also about brownouts or blackouts. A
recent study by Stanford professor Jonathan Koomey found that
computer servers (including cooling and auxiliary equipment) now
account for 1.2 percent of total U.S. electric consumption, a figure
that’s growing fast. (Power consumption by such devices doubled
between 2000 and 2005, Koomey says.)
The situation is so bad that, in December, Congress asked
the Environmental Protection Agency to study the power consumption
of corporate and government data centers; the EPA was
expected to issue its findings late last month (after this issue went to press).
So what’s a company to do? A short list of partial solutions includes:
- Develop a metric that indicates not just the amount of power used but also the output of the data center (measured by transactions or users supported) as a gauge of efficiency, says Chris Bennett, vice president of core systems for storage company NetApp.
- Instead of dedicating high-capacity servers to specific applications, deploy virtualization software that allows fewer servers to handle more work.
- Change data-storage techniques. For example, lessen your company’s dependence on Fibre Channel disk drives in favor of conventional drives that are networked to
enhance efficiency, and use “de-duplication” software to detect redundant
data that can be safely erased.
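
Bennett’s suggested metric boils down to useful output per unit of energy. As a minimal sketch (the function name, units, and figures below are illustrative, not from any vendor’s tool):

```python
def transactions_per_kwh(transactions: int, kwh: float) -> float:
    """Efficiency gauge: useful work delivered per kilowatt-hour consumed.
    Higher is better. Raw power draw alone says nothing about how much
    computing a data center actually gets done."""
    if kwh <= 0:
        raise ValueError("energy consumed must be positive")
    return transactions / kwh

# Two hypothetical data centers drawing the same power:
print(transactions_per_kwh(1_200_000, 500.0))  # 2400.0 transactions per kWh
print(transactions_per_kwh(1_500_000, 500.0))  # 3000.0 -- the more efficient site
```

The point of the second argument: a center that cuts its electric bill by doing less work hasn’t become more efficient, and this ratio makes that visible.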
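
The de-duplication idea in the last bullet can be sketched with a content hash: identical blocks hash to the same digest, so only one copy needs to stay on disk. This is a toy illustration, not any particular product’s algorithm; commercial de-duplication operates on disk blocks or files, but the principle is the same:

```python
import hashlib

def deduplicate(blocks):
    """Keep one copy of each unique data block; report bytes reclaimed.
    Blocks with identical content produce identical SHA-256 digests,
    so later duplicates can be safely dropped."""
    seen = {}
    bytes_saved = 0
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest in seen:
            bytes_saved += len(block)  # redundant copy: safe to erase
        else:
            seen[digest] = block       # first occurrence: keep it
    return list(seen.values()), bytes_saved

unique, saved = deduplicate([b"report-q1", b"report-q1", b"backup-feb"])
print(len(unique), saved)  # 2 unique blocks, 9 redundant bytes reclaimed
```

Fewer stored bytes means fewer spinning disks, which is where the power savings come from.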
New energy-efficient servers can also help (see “What’s Hot This Summer”), and recent industry
initiatives such as The Green
Grid consortium and the
Climate Savers Computing Initiative
promise to deliver better technology
and smarter management techniques.