
Overview of Data Center Energy Use
Bill Tschudi, LBNL
WFTschudi@lbl.gov
Data Center Definitions

 Server closet: < 200 sq. ft.
 Server room: < 500 sq. ft.
 Localized data center: < 1,000 sq. ft.
 Mid-tier data center: < 5,000 sq. ft.
 Enterprise-class data center: 5,000+ sq. ft.

Today's training focuses on larger data centers; however, most principles apply to a center of any size.
EPA Report to Congress: Breakdown of Space

Data Center Efficiency Opportunities
 Benchmarking of over 25 centers consistently led to opportunities
 No silver bullet
 Lots of silver BBs


Energy Efficiency Opportunities Are Everywhere

Server Load/Computing Operations
• Load management
• Server innovation

Cooling Equipment
• Better air management
• Better environmental conditions
• Move to liquid cooling
• Optimized chilled-water plants
• Use of free cooling

Power Conversion & Distribution
• High voltage distribution
• Use of DC power
• Highly efficient UPS systems
• Efficient redundancy strategies

Alternative Power Generation
• On-site generation
• Waste heat for cooling
• Use of renewable energy/fuel cells
IT Equipment Load Density

[Chart: IT equipment load intensity (Watts/sq. ft.) for 23 benchmarked data centers. The 2003 benchmarks averaged ~25 W/sq. ft.; the 2005 benchmarks averaged ~52 W/sq. ft.]
Benchmarking Energy End Use

Electricity Flows in Data Centers

[Diagram: power reaches the building at 480 V from local distribution lines, with backup diesel generators in parallel. It feeds the HVAC system; lights, office space, etc.; and the uninterruptible load, which passes through the UPS and PDUs to the computer racks (computer equipment). UPS = Uninterruptible Power Supply; PDU = Power Distribution Unit.]
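As a rough illustration of why this conversion chain matters, the sketch below shows how much power the building must draw to deliver a given rack load. The UPS and PDU efficiency figures are assumptions for illustration, not LBNL-measured values:

```python
# Illustrative sketch only: the efficiency figures are assumptions,
# not LBNL-measured values.
UPS_EFFICIENCY = 0.90  # assumed double-conversion UPS efficiency
PDU_EFFICIENCY = 0.97  # assumed PDU transformer/distribution efficiency

def building_power_kw(it_load_kw: float) -> float:
    """Power drawn at the 480 V service entrance to deliver
    it_load_kw to the computer racks (HVAC and lighting excluded)."""
    return it_load_kw / (UPS_EFFICIENCY * PDU_EFFICIENCY)

print(building_power_kw(100.0))  # ~114.5 kW drawn for 100 kW of rack load
```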
Overall Electrical Power Use in Data Centers

[Chart courtesy of Michael Patterson, Intel Corporation]

Performance Varies

The relative percentages of the energy actually doing computing varied considerably.

[Pie charts for two benchmarked centers. One center: computer loads 67%, HVAC air movement 7%, electrical room cooling 4%, cooling tower plant 4%, office space conditioning 1%, lighting 2%, other 13%. Another center: data center server load 51%, HVAC chiller and pumps 24%, data center CRAC units 25%, lighting 2%.]
High-Level Metric: Percentage of Electricity Delivered to IT Equipment

[Chart: ratio of IT power to total data center power for 25 benchmarked centers; average 0.57. Higher is better. Source: LBNL benchmarking.]
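A minimal sketch of how this ratio is computed from metered loads (the kW figures below are hypothetical):

```python
def it_power_ratio(it_power_kw: float, total_power_kw: float) -> float:
    """Fraction of total data center electricity delivered to IT equipment.
    Higher is better; the LBNL benchmarks above averaged ~0.57."""
    return it_power_kw / total_power_kw

# Hypothetical metered values for a single center
print(it_power_ratio(it_power_kw=570.0, total_power_kw=1000.0))  # 0.57
```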


Alternate High-Level Metric: Data Center Total Electrical Demand / IT Equipment Demand (PUE)

[Chart: ratio of total data center power to IT power for 24 benchmarked centers; average 1.83. Lower is better. Source: LBNL benchmarking.]
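For a given center, PUE is simply the reciprocal of the previous ratio; a minimal sketch with hypothetical metered values:

```python
def pue(total_power_kw: float, it_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.
    Lower is better; the LBNL benchmarks above averaged 1.83."""
    return total_power_kw / it_power_kw

# Hypothetical metered values for a single center
print(pue(total_power_kw=1750.0, it_power_kw=1000.0))  # 1.75
```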
HVAC System Effectiveness

We observed a wide variation in HVAC performance.

[Chart: HVAC effectiveness index, the ratio of IT equipment power to HVAC power, for 20 benchmarked centers (scale 0.0 to 4.0).]
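The index charted above divides metered IT load by metered HVAC load; a minimal sketch, with hypothetical submetered values:

```python
def hvac_effectiveness(it_power_kw: float, hvac_power_kw: float) -> float:
    """HVAC effectiveness index: IT equipment power / HVAC power.
    Higher means less cooling energy per unit of computing load."""
    return it_power_kw / hvac_power_kw

# Hypothetical: 1,000 kW of IT load supported by 400 kW of HVAC power
print(hvac_effectiveness(1000.0, 400.0))  # 2.5
```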
Benchmark Results Can Help Identify Best Practices

 The ratio of IT equipment power to the total is an indicator of relative overall efficiency
 Examination of individual systems and components in the centers that performed well helped to identify best practices
Best HVAC Practices

 Air Management
 Air Economizers
 Humidification Control
 Centralized Air Handlers
 Low Pressure Drop Systems
 Fan Efficiency
 Cooling Plant Optimization
 Water Side Economizer
 Variable Speed Chillers
 Variable Speed Pumping
 Direct Liquid Cooling
Best Electrical Practices

 UPS systems
 Self-generation
 AC-DC distribution
 Standby generation
Best Practices and IT Equipment

 Power supply efficiency
 Standby/sleep power modes
 IT equipment fans
 Virtualization
 Load shifting
Best Practices: Cross-Cutting and Misc. Issues

 Motor efficiency
 Right sizing
 Variable speed drives
 Lighting
 Maintenance
 Continuous Commissioning and Benchmarking
 Heat Recovery
 Building Envelope
 Redundancy Strategies
 Methods of charging for space and power
Potential Savings

 The electricity bill will exceed the cost of the IT equipment over its useful life
 20-40% savings are typically possible
 Aggressive strategies can do better than 50% savings
 Paybacks are short; 1 to 3 years is common
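A back-of-the-envelope payback check along these lines (every input below is an assumption for illustration, not a benchmarked value):

```python
# All inputs are illustrative assumptions, not benchmarked values.
average_load_kw = 1_000                      # assumed facility load
annual_energy_kwh = average_load_kw * 8_760  # load runs around the clock
electricity_rate = 0.10                      # assumed $/kWh
savings_fraction = 0.30                      # mid-range of the 20-40% above
retrofit_cost = 750_000                      # assumed cost of measures, $

annual_savings = annual_energy_kwh * electricity_rate * savings_fraction
payback_years = retrofit_cost / annual_savings
print(round(payback_years, 1))  # ~2.9 years, inside the 1-3 year range
```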
Scenarios of Projected Energy Use from EPA Report to Congress, 2007-2011
The Good News:

 Industry is taking action
– IT manufacturers
– Infrastructure equipment manufacturers
 Industry associations are leading:
– ASHRAE
– Green Grid
– Uptime Institute
– AFCOM
– Critical Facilities Roundtable
– 7x24 Exchange
IT Industry Taking Action

www.climatesaverscomputing.org

www.thegreengrid.com
More Good News:

 Utilities are getting involved:
– PG&E, SCE, San Diego
– CEE
 CA incentive programs are aggressive
 California Energy Commission, DOE, and EPA all have data center initiatives
Design Guidelines Were Developed in Collaboration With PG&E

Guides are available through PG&E's Energy Design Resources website.
Design Guidance Is Summarized in a Web-Based Training Resource

http://hightech.lbl.gov/dctraining/TOP.html

A resources summary sheet is available.
Takeaways

 Various meanings for “data centers”
 Benchmarking helps identify performance
 Benchmarking suggests best practices
 Efficiency varies
 Large opportunity for savings
 Resources are being developed
