American Society of Heating, Refrigerating and Air-Conditioning Engineers Thermal Guidelines For Data Processing Environments
ASHRAE has compiled this publication with care, but ASHRAE has not investigated, and
ASHRAE expressly disclaims any duty to investigate, any product, service, process, proce-
dure, design, or the like that may be described herein. The appearance of any technical data
or editorial material in this publication does not constitute endorsement, warranty, or guar-
anty by ASHRAE of any product, service, process, procedure, design, or the like. ASHRAE
does not warrant that the information in the publication is free of errors, and ASHRAE does
not necessarily agree with any statement or opinion in this publication. The entire risk of the
use of any information in this publication is assumed by the user.
SPECIAL PUBLICATIONS
Mark Owen, Editor/Group Manager of Handbook and Special Publications
Cindy Sheffield Michaels, Managing Editor
Matt Walker, Associate Editor
Roberta Hirschbuehler, Assistant Editor
Sarah Boyle, Editorial Assistant
Michshell Phillips, Editorial Coordinator
PUBLISHING SERVICES
David Soltis, Group Manager of Publishing Services and Electronic Communications
Tracy Becker, Graphics Specialist
Jayne Jackson, Publication Traffic Administrator
PUBLISHER
W. Stephen Comstock
Preface to the Third Edition
Prior to the 2004 publication of the first edition of Thermal Guidelines for Data
Processing Environments, there was no single source in the data center industry for
ITE temperature and humidity requirements. This book established groundbreaking
common design points endorsed by the major IT OEMs. The second edition,
published in 2008, created a new precedent by expanding the recommended temper-
ature and humidity ranges.
This third edition breaks new ground through the addition of new data center
environmental classes that enable near-full-time use of free-cooling techniques in
most of the world’s climates. This exciting development also brings increased
complexity and tradeoffs that require more careful evaluation in their application
due to the potential impact on the IT equipment to be supported.
The newly added environmental classes expand the allowable temperature and
humidity envelopes. This may enable some facility operators to design data centers
that use substantially less energy to cool. In fact, the classes may enable facilities in
many geographical locations to operate year round without the use of mechanical
refrigeration, which can provide significant savings in capital expense and in operating
expense through reduced energy consumption.
The recommended operating range has not changed from the second edition of
the book. However, a process for evaluating the optimal operating range for a given
data center has been introduced for those owners and operators who have a firm
understanding of the benefits and risks associated with operating outside the recom-
mended range. The third edition provides a method for evaluating ITE reliability and
estimating power consumption and airflow requirements under wider environmental
envelopes while delineating other important factors for further consideration. The
most valuable update to this edition is the inclusion for the first time of IT equipment
failure rate estimates based on inlet air temperature. These server failure rates are the
result of IT OEMs evaluating field data, such as warranty returns, as well as compo-
nent reliability data. These data will allow data center operators to weigh the poten-
tial reliability consequences of operating in various environmental conditions versus
the cost and energy consequences.
A cornerstone idea carried over from previous editions is that inlet temperature
is the only temperature that matters to IT equipment. Although there are reasons to
want to consider the impact of equipment outlet temperature on the hot aisle, it does
not impact the reliability or performance of the IT equipment. Also, each manufac-
turer balances design and performance requirements when determining their equip-
ment design temperature rise. Data center operators should expect to understand the
equipment inlet temperature distribution throughout their data centers and take steps
to monitor these conditions. A facility designed to maximize efficiency by aggres-
sively applying new operating ranges and techniques will require a complex, multi-
variable optimization performed by an experienced data center architect.
Although the vast majority of data centers are air cooled at the IT load, liquid
cooling is becoming more commonplace and likely will be adopted to a greater
extent due to the enhanced operational efficiency, potential for increased density,
and opportunity for heat recovery. Consequently, the third edition of Thermal Guide-
lines for Data Processing Environments for the first time includes definitions of
liquid-cooled environmental classes and descriptions of their applications. Even a
primarily liquid-cooled data center may have air-cooled IT within. As a result, a
combination of air-cooled and liquid-cooled classes will typically be specified.
Contents
Preface to the Third Edition . . . ix
Acknowledgments . . . xi
Chapter 1—Introduction . . . 1
1.1 Document Flow . . . 3
1.2 Primary Users of This Document . . . 4
1.3 Compliance . . . 5
1.4 Definitions and Terms . . . 5
Over the years, the power density of electronic equipment has steadily increased, as
shown and projected in Figure 1.1. In addition, the mission critical nature of
computing has sensitized businesses to the health of their data centers. The combi-
nation of these effects makes it obvious that better alignment is needed between
equipment manufacturers and facility operations personnel to ensure proper and
fault-tolerant operation within data centers.
This need was recognized by an industry consortium in 1999 that began a grass-
roots effort to provide a power density road map and to work toward standardizing
power and cooling of the equipment for seamless integration into the data center. The
Industry Thermal Consortium produced the first projection of heat density trends. The
IT subcommittee of ASHRAE Technical Committee (TC) 9.9 is the successor of that
industry consortium. Figure 1.1 shows the latest projection from the IT subcommittee
and is based on best estimates of heat release for fully configured systems. An updated
set of power trend charts is published in Datacom Equipment Power Trends and
Cooling Applications, Second Edition (ASHRAE 2012). These updated equipment
power trends extend to 2020, as shown for the 1U server in Figure 1.2.
The objective of Thermal Guidelines for Data Processing Environments, Third
Edition is to
Figure 1.2 1U server trends showing 2005 and new 2011 projections
(ASHRAE 2012).
Unless otherwise stated, the thermal guidelines in this document refer to data
center and other data-processing environments. Telecom central offices are discussed
in detail in Telcordia NEBS™ documents GR-63-CORE (2012) and GR-3028-
CORE (2001), as well as ANSI T1.304 (1997) and the European ETSI standards
(1994, 1999). The NEBS documents are referenced when there is a comparison
between data centers and telecom rooms. These two equipment environments have
historically been very different. Nevertheless, it is important to show the comparison
where some convergence in these environments may occur in the future.
1.3 COMPLIANCE
It is the hope of TC 9.9 that many equipment manufacturers and facilities managers
will follow the guidance provided in this document. Data center facilities managers
can be confident that these guidelines have been produced by the IT manufacturers.
Manufacturers can self-certify the compliance of specific models of equipment
as intended for operation in data processing Air-Cooling Environmental Classes A1,
A2, A3, A4, B, and C and Liquid Cooling Environmental Classes W1–W5.
room load capacity: the point at which the equipment heat load in the room no longer
allows the equipment to run within the specified temperature requirements of the
equipment; Chapter 4, “Facility Temperature and Humidity Measurement,” defines
where these temperatures are measured. The load capacity is influenced by many
factors, the primary factor being the room theoretical capacity; other factors, such
as the layout of the room and load distribution, also influence the room load capacity.
room theoretical capacity: the capacity of the room based on the mechanical room
equipment capacity; this is the sensible capacity in kW (tons) of the mechanical
room for supporting the computer or telecom room heat loads.
temperature:
dew point: the temperature at which water vapor has reached the saturation
point (100% RH).
wet bulb: the temperature indicated by a psychrometer when the bulb of one
thermometer is covered with a water-saturated wick over which air is caused to
flow at approximately 4.5 m/s (900 ft/min) to reach an equilibrium temperature
of water evaporating into air, where the heat of vaporization is supplied by the
sensible heat of the air.
x-factor: a dimensionless metric that measures the relative hardware failure rate at
a given constant equipment inlet dry-bulb temperature when compared to a baseline
of the average hardware failure rate at a constant equipment inlet dry-bulb temper-
ature of 20°C (68°F).
2.1 BACKGROUND
TC 9.9 created the original publication Thermal Guidelines for Data Processing
Environments in 2004 (ASHRAE 2004). At the time, the most important goal was
to create a common set of environmental guidelines that ITE would be designed to
meet. Although computing efficiency was important, performance and availability
took precedence. Temperature and humidity limits were set accordingly. Progress-
ing through the first decade of the 21st century, increased emphasis has been placed
on computing efficiency. Power usage effectiveness (PUE) has become the new
metric by which to measure the effect of design and operation on data center effi-
ciency. To improve PUE, free-cooling techniques, such as air- and water-side econ-
omization, have become more commonplace with a push to use them year-round. To
enable improved PUE capability, TC 9.9 created additional environmental classes,
along with guidance on the use of the existing and new classes. Expanding the capa-
bility of ITE to meet wider environmental requirements can change the equipment’s
reliability, power consumption, and performance capabilities; this third edition of
the book provides information on how these capabilities are affected.
In the second edition of Thermal Guidelines (ASHRAE 2008), the purpose of the
recommended envelope was to give guidance to data center operators on maintaining
high reliability and also operating their data centers in the most energy efficient
manner. This envelope was created for general use across all types of businesses and
conditions. However, different environmental envelopes may be more appropriate for
different business values and climate conditions. Therefore, to allow for the potential
to operate in a different envelope that might provide even greater energy savings, this
third edition provides general guidance on server metrics that will assist data center
operators in creating an operating envelope that matches their business values. Each
of these metrics is described. Through these guidelines, the user will be able to deter-
mine what environmental conditions best meet their technical and business needs.
Any choice outside of the recommended region will be a balance between the addi-
tional energy savings of the cooling system versus the deleterious effects that may be
created on total cost of ownership (TCO) (total site energy use, reliability, acoustics, etc.).
One issue at the time was the common notion that colder is better. Most data centers deployed ITE from multi-
ple vendors. This resulted in designing for ambient temperatures based on the ITE
with the most stringent temperature requirements (plus a safety factor). TC 9.9
obtained informal consensus from the major commercial ITE manufacturers for
both recommended and allowable temperature and humidity ranges and for four
environmental classes, two of which were applied to data centers.
Another critical accomplishment of TC 9.9 was to establish ITE air inlets as the
common measurement point for temperature and humidity compliance; require-
ments in any other location within the data center were optional.
The environmental guidelines/classes are really the domain and expertise of IT
OEMs. TC 9.9’s IT subcommittee is exclusively composed of engineers from
commercial IT manufacturers; the subcommittee is strictly technical.
The commercial IT manufacturers' design, field, and failure data are shared (to
some extent) within the IT subcommittee, which enables greater levels of disclosure
and ultimately led to the decision to expand the environmental specifications. Prior
to TC 9.9, there were no organizations or forums to remove the barriers to sharing
information among competitors. This is critical because, if some manufacturers
conform while others do not, a multivendor data center again defaults to the most
stringent requirement plus a safety factor.
From an end-user perspective, it is also important that options, such as the
following, are provided for multivendor facilities:
The industry needs both types of equipment but also needs to avoid having
Option 2 inadvertently increase the acquisition cost of Option 1 through mandatory
requirements not desired or used by all end users.
Expanding the temperature and humidity ranges can increase the physical size
of the ITE (e.g., more heat-transfer area required), increase ITE airflow, etc. This can
impact embedded energy cost, power consumption, and ITE purchase cost but
enables peak performance under high-temperature operation.
By adding new classes and not mandating all servers conform to something such
as an air inlet temperature of 40°C (104°F), the increased server packaging cost for
energy optimization becomes an option rather than a mandate. Before the new
classes are described and compared to their 2008 version, several key definitions
need to be highlighted.
recommended environmental range: Facilities should be designed to achieve,
under normal circumstances, ambient conditions that fall within the recommended
range. This recommended range may be as defined either in Table 2.3 or by the
process outlined later in this chapter whereby the user can apply the metrics in
Figure 2.1 (described in more detail in this book) to define a different recommended
range more appropriate for particular business objectives.
The primary difference between the first edition of Thermal Guidelines,
published in 2004, and the second edition, published in 2008, was the change to
the recommended envelope shown in Table 2.1. The 2008 recommended guide-
lines have not changed in the third edition, as shown in Table 2.3. However, as stated
above, there is an opportunity to define a different recommended envelope based on
the metrics shown in Figure 2.1 and documented later in this chapter. More infor-
mation is provided later on this subject. For more background on the changes made
in 2008 to the recommended envelope, refer to Appendix A.
To enable improved operational efficiency, ASHRAE TC 9.9 has added two new
ITE environmental classes to Thermal Guidelines that are more compatible with
chillerless cooling. The naming conventions have been updated to better delineate the
types of ITE. Old and new classes are now specified differently (comparisons are
shown in Table 2.2). The 2008 version shows two data center classes (Classes 1 and
2) which have been kept the same in this update but are now referred to as Classes A1
and A2. The new data center classes are referred to as Classes A3 and A4.
Table 2.2 comparison of the 2011 and 2008 classes:
A1 (2008 Class 1): data center, tightly controlled; enterprise servers, storage products
A2 (2008 Class 2): data center, some control; volume servers, storage products, personal computers, workstations
A3 (no 2008 equivalent): data center, some control; volume servers, storage products, personal computers, workstations
A4 (no 2008 equivalent): data center, some control; volume servers, storage products, personal computers, workstations
Class A1: Typically a data center with tightly controlled environmental parameters
(dew point, temperature, and RH) and mission-critical operations; types of
products typically designed for this environment are enterprise servers and stor-
age products.
Class A2/A3/A4: Typically an information technology space with some control
of environmental parameters (dew point, temperature, and RH); types of prod-
ucts typically designed for this environment are volume servers, storage prod-
ucts, personal computers, and workstations. Among these three classes, A2 has
the narrowest temperature and moisture requirements and A4 has the widest envi-
ronmental requirements.
Class B: Typically an office, home, or transportable environment with minimal
control of environmental parameters (temperature only); types of products typi-
cally designed for this environment are personal computers, workstations,
laptops, and printers.
Class C: Typically a point-of-sale or light industrial or factory environment
with weather protection and sufficient winter heating and ventilation; types of
products typically designed for this environment are point-of-sale equipment,
ruggedized controllers, or computers and PDAs.
Table 2.3 2011 Thermal Guidelines—SI Version (I-P Version in Appendix B)
Equipment Environment Specifications for Air Cooling
Product operation columns: class; dry-bulb temperature, °C; humidity range, noncondensing; maximum dew point, °C; maximum elevation, m; maximum rate of change, °C/h. Product power-off columns: dry-bulb temperature, °C; RH, %; maximum dew point, °C.
Recommended (suitable for all four classes; explore the data center metrics in this book for conditions outside this range):
A1 to A4: 18°C to 27°C dry bulb; 5.5°C DP to 60% RH and 15°C DP
Allowable (class-specific ranges; see Figures 2.3 and 2.4 and the class descriptions that follow)
For comparison, the NEBS and ETSI Class 3.1 environmental specifications are
listed in Table 2.4.
The European Telecommunications Standards Institute (ETSI) defines stan-
dards for information and communications technologies and is recognized by the
European Union as a European standards organization. ETSI has defined a set of five
environmental classes based on the end-use application. ETSI Classes 3.1 and 3.1e
apply to telecommunications centers, data centers and similar end-use locations.
These classes assume a noncondensing environment, no risk of biological or animal
contamination, normal levels of airborne pollutants, insignificant vibration and
shock, and that the equipment is not situated near a major source of sand or dust.
Classes 3.1 and 3.1e apply to permanently temperature-controlled enclosed loca-
tions where humidity is not usually controlled. A high-level summary of Classes 3.1
and 3.1e is given in Table 2.5 along with a climatogram of those same conditions
(Figure 2.2).
For more details on the Class 3.1 and 3.1e specification requirements, please
consult ETS 300 019-1-3 (ETSI 2009).
The new guidelines (Table 2.3) were developed with a focus on providing as
much information as possible to the data center operators, allowing them to maxi-
mize energy efficiency without sacrificing the reliability required by their business.
The allowable environmental ranges for the four data center classes, including the
two new ones (A3 and A4), are shown in Figures 2.3 and 2.4 (SI and I-P, respec-
tively). Derating of the maximum temperature with altitude must be applied as previously described.
Class A3 expands the temperature range to 5°C to 40°C (41°F to 104°F) while
also expanding the moisture range to extend from a low moisture limit of 8% RH and
–12°C (10.4°F) dew point to a high moisture limit of 85% RH.
Class A4 expands the allowable temperature and moisture range even further than
Class A3. The temperature range is expanded to 5°C to 45°C (41°F to 113°F), while
the moisture range extends from 8% RH and –12°C (10.4°F) dew point to 90% RH.
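As a rough illustration of how the temperature and moisture limits quoted above combine, the sketch below checks a measured inlet condition against the Class A3 allowable values (5°C to 40°C, 8% to 85% RH, and a –12°C lower dew-point limit). The Magnus dew-point approximation and the function names are illustrative assumptions, not part of the guideline, and the check omits other table limits such as altitude derating and the upper dew-point limit.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (deg C) from dry-bulb temperature and RH
    using the Magnus formula (an approximation, not the ASHRAE method)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def within_class_a3(temp_c: float, rh_percent: float) -> bool:
    """Check an inlet condition against the Class A3 allowable limits
    quoted in the text: 5 to 40 degC, 8% to 85% RH, dew point >= -12 degC."""
    dp = dew_point_c(temp_c, rh_percent)
    return 5.0 <= temp_c <= 40.0 and 8.0 <= rh_percent <= 85.0 and dp >= -12.0

# Example: 38 degC at 20% RH gives a dew point of about 11 degC,
# which falls inside the Class A3 allowable envelope as described above.
print(within_class_a3(38.0, 20.0))
```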
Based on the allowable lower moisture limits for Classes A3 and A4, there are
some added minimum requirements that are listed in note “i” in the table that pertain
to the protection of the equipment from electrostatic discharge (ESD) failure-inducing
events that could possibly occur in low-moisture environments.
The recommended envelope is highlighted as a separate row in Table 2.3
because of some misconceptions regarding the use of the recommended envelope.
When it was first created, it was intended that within this envelope the most reliable,
acceptable and reasonably power-efficient operation could be achieved. Data from
the manufacturers were used to create the recommended envelope. It was never
intended that the recommended envelope would represent the absolute limits of inlet
air temperature and humidity for ITE. As stated in the second edition of Thermal
Guidelines, the recommended envelope defined the limits under which ITE would
operate most reliably while still achieving reasonably energy-efficient data center
operation. However, in order to use economizers as much as possible to save energy
Table 2.4 NEBS Environmental Specifications
Equipment Environmental Specifications for Air Cooling
Product operation, NEBS (note a): dry-bulb temperature 5°C to 40°C (notes b, c, d); humidity 5% to 85% RH (notes b, d); maximum dew point 28°C (note b); maximum elevation 4000 m (note b); maximum rate of change N/A. Product power off: N/A.
a. The product operation values given for NEBS are from GR-63-CORE (Telcordia 2006) and GR-3028-CORE (Telcordia 2001). GR-63-CORE also addresses conformance testing of new equipment for adequate robustness. Some of the test conditions are summarized below. For complete test details, please review GR-63-CORE. Conformance test conditions (short-term) of new equipment:
Dry-bulb temperature, frame level: –5°C to 50°C, 16 hours at –5°C, 16 hours at 50°C (GR-63-CORE)
Dry-bulb temperature, shelf level: –5°C to 55°C, 16 hours at –5°C, 16 hours at 55°C (GR-63-CORE)
Maximum rate of change: 96°C/h warming and 30°C/h cooling (GR-63-CORE and GR-3028-CORE)
RH: 5% to 90%, 3 hours at <15% RH, 96 hours at 90% RH (GR-63-CORE)
Maximum dew point: 28°C (GR-63-CORE)
b. Requirements for continuous operating conditions that new equipment will tolerate (GR-63-CORE). A feature or function that, in the view of Telcordia, is necessary to satisfy the needs of a typical client company is labeled "Requirement" and is flagged by the letter "R." The conformance testing described in note "a" is designed to ensure that equipment tolerates the specified continuous operating conditions.
c. Derate maximum dry-bulb temperature 10°C at and above 1800 m.
d. Also ANSI T1.304-1997 (ANSI 1997).
e. Recommended facility operation per GR-3028-CORE. No NEBS requirements exist.
f. Generally accepted telecom practice; the major regional service providers have shut down almost all humidification based on Telcordia research. Personal grounding is strictly enforced to control ESD failures. No NEBS requirements exist.
a. With minimum absolute humidity of no less than 1.5 g/m³ (0.04 g/ft³) and a maximum absolute humidity of no more than 20 g/m³ (0.57 g/ft³).
b. With minimum absolute humidity of no less than 1 g/m³ (0.03 g/ft³).
c. With maximum absolute humidity of no more than 25 g/m³ (0.71 g/ft³).
Note: The maximum rate of temperature change for continuous operation, Class 3.1 and Class 3.1e, is 0.5°C/min (0.9°F/min) averaged over a period of 5 minutes.
Figure 2.2 Climatogram of the ETSI Class 3.1 and 3.1e environmental
conditions (ETSI 2009).
during certain times of the year, the inlet server conditions may fall outside the
recommended envelope but still within the allowable envelope. The second edition
of Thermal Guidelines also states that it is acceptable to operate outside the recom-
mended envelope for short periods of time without risk of affecting the overall reli-
ability and operation of the ITE. However, some still felt the recommended envelope
was mandatory, even though that was never the intent.
Equipment inlet air temperature measurements are specified in Chapter 4.
However, to aid in data center layout and inlet rack temperature monitoring, manu-
facturers of electronic equipment should include temperature sensors within their
equipment that monitor and display or report the inlet air temperature. For product
operation, the environmental specifications given in Table 2.3 refer to the air enter-
ing the electronic equipment. Air exhausting from electronic equipment is not rele-
vant to the manufacturers of such equipment. However, the exhaust temperature is
a concern, for example, for service personnel working in the hot exhaust airstream.
Some information and guidance from OSHA is given in Appendix E for personnel
working in high-temperature environments.
The allowable and recommended envelopes for Classes A1–A4, B, C, and
NEBS are depicted in psychrometric charts beginning with Appendix F. The recom-
mended operating environment specified in Table 2.3 is based in general on reliabil-
ity aspects of the electronic hardware:
1. High RH levels have been shown to affect failure rates of electronic compo-
nents. Examples of failure modes exacerbated by high RH include conductive
anodic failures, hygroscopic dust failures, tape media errors and excessive
wear, and corrosion. The recommended upper RH limit is set to limit this effect.
2. Electronic devices are susceptible to damage by electrostatic discharge, while
tape products and media may have excessive errors in rooms that have low RH.
The recommended low RH limit is set to limit this effect.
3. High temperature affects the reliability and life of electronic equipment. The
recommended upper ambient temperature limit is set to limit these temperature-
related reliability effects.
4. The lower the temperature in the room that houses the electronic equipment, the
more energy is required by the HVAC equipment. The recommended lower
ambient temperature limit is set to limit extreme overcooling.
For data center equipment, each individual manufacturer tests to specific envi-
ronmental ranges, and these may or may not align with the allowable ranges spec-
ified in Table 2.3.
For telecommunications equipment, the test environment ranges (short-term) are
specified in NEBS GR-63-CORE (Telcordia 2012), together with the allowable oper-
ating conditions (long-term or continuous). To ensure adequate robustness for tele-
communications equipment, the test ranges for new equipment are based on
environmental conditions that may—with a certain low probability—occur in various
telecommunications environments. These limits shall not be considered continuous operating conditions.
1. Consider the state of best practices for the data center. Most best practices,
including airflow management and cooling-system controls strategies, should
be implemented prior to the adoption of higher server inlet temperature.
The steps above provide a simplified view of the flowchart in Appendix C. The
use of Appendix C is highly encouraged as a starting point for the evaluation of the
options. The flowchart provides guidance to the data center operators seeking to
minimize TCO on how best to position their data center for operating in a specific
environmental envelope. Possible endpoints range from optimization of TCO within
the recommended envelope as specified in the 2008 version of the ASHRAE enve-
lopes (see Table 2.1) to a chillerless data center using any of the data center classes.
More importantly, Appendix C describes how to achieve even greater energy savings
through the use of a TCO analysis using the server metrics provided in the next
sections.
Part of the increase in server power at higher inlet temperatures is the result of an increase in leakage current for some silicon devices. As an example of the use
of Figure 2.6, if a data center is normally operating at a server inlet temperature of
15°C (59°F) and the operator wants to raise this temperature to 30°C (86°F), it could
be expected that the server power would increase in the range of 3% to 7%. If the
inlet temperature increases to 35°C (95°F), the ITE power could increase in the
range of 7% to 20% compared to operating at 15°C (59°F).
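To make this trade-off easier to explore, the sketch below linearly interpolates between the approximate ranges quoted above (roughly 3% to 7% more server power at 30°C and 7% to 20% at 35°C, relative to a 15°C baseline). The breakpoints are read loosely from the text describing Figure 2.6, so this is an illustration, not vendor data.

```python
# Illustrative only: interpolate the approximate server power increase
# (relative to a 15 degC inlet baseline) quoted in the text for Figure 2.6.
# The breakpoints below are assumptions read loosely from the text.
POWER_INCREASE_PCT = {  # inlet degC -> (low %, high %) vs. 15 degC baseline
    15: (0.0, 0.0),
    30: (3.0, 7.0),
    35: (7.0, 20.0),
}

def power_increase_range(inlet_c):
    """Linearly interpolate the (low, high) percent power increase."""
    pts = sorted(POWER_INCREASE_PCT.items())
    if inlet_c <= pts[0][0]:
        return pts[0][1]
    for (t0, r0), (t1, r1) in zip(pts, pts[1:]):
        if t0 <= inlet_c <= t1:
            f = (inlet_c - t0) / (t1 - t0)
            return (r0[0] + f * (r1[0] - r0[0]), r0[1] + f * (r1[1] - r0[1]))
    return pts[-1][1]

print(power_increase_range(32.5))  # roughly (5.0, 13.5) percent
```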
Figure 2.7 Server flow rate increase versus ambient temperature increase.
Since very few data center class products currently exist for the Class A3 envi-
ronment, the development of the Class A3 envelope shown in Figure 2.6 was simply
extrapolated from the Class A2 trend. (Note: Equipment designed for NEBS envi-
ronments would likely meet the new class requirements but typically has limited
features and performance in comparison with volume ITE.) New products for this
class would likely be developed with improved heat sinks and/or fans to properly cool
the components within the new data center class, so the power increases over the
wider range would be very similar to that shown for Class A2.
With the increase in fan speed over the range of ambient temperatures, ITE flow
rate also increases. An estimate of the increase in server airflow rates over the
temperature range up to 35°C (95°F) is displayed in Figure 2.7. This is very important
when designing data centers that take advantage of temperatures above the 25°C to 27°C
(77°F to 80.6°F) inlet ambient temperature range. With higher temperatures as an
operational target, the design must be analyzed for its capability to handle the higher
volumes of airflow. This includes all aspects of the airflow system. The base system
may be called upon to meet 250% (per Figure 2.7) of the nominal airflow (airflow
when in the recommended range). This may include the outdoor air inlet, filtration,
cooling coils, dehumidification/humidification, fans, underfloor plenum, raised
floor tiles/grates, and containment systems. A detailed engineering evaluation of the
data center systems at the higher flow rates is required to ensure successful operation
at elevated inlet temperatures.
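The airflow scaling above follows from the basic sensible-heat balance for air: for a given heat load, the required volumetric flow varies inversely with the air temperature rise. The sketch below is a generic illustration with assumed air properties; the specific numbers are not from this book.

```python
# Sensible heat balance for air: P = rho * cp * V_dot * dT
# => required volumetric airflow V_dot = P / (rho * cp * dT).
RHO_AIR = 1.2    # kg/m^3, assumed air density near sea level
CP_AIR = 1006.0  # J/(kg*K), assumed specific heat of air

def required_airflow_m3s(heat_load_w: float, delta_t_c: float) -> float:
    """Airflow (m^3/s) needed to remove heat_load_w with an air
    temperature rise of delta_t_c across the ITE or the room."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_c)

# Example: a 10 kW rack with a 12 degC air temperature rise needs ~0.69 m^3/s;
# if the rise shrinks to 5 degC, the flow demand grows to ~1.66 m^3/s.
print(required_airflow_m3s(10_000, 12), required_airflow_m3s(10_000, 5))
```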
Noise levels in data centers have increased over the years and have become, or at least will soon become, a serious
concern to data center managers and owners. For background and discussion on this,
see Chapter 9 “Acoustical Noise Emissions” in ASHRAE’s Design Considerations
for Datacom Equipment Centers, Second Edition (2009b). The following section
addresses ITE noise as opposed to total data center noise, which would include
computer room cooling noise sources, which also contribute to overall data center
noise exposure. The increase in noise levels is the obvious result of the significant
increase in cooling requirements of modern IT and telecommunications equipment.
The increase of concern results from noise levels in data centers approaching or
exceeding regulatory workplace limits, such as those imposed by OSHA (1980) in
the U.S. or by EC Directives in Europe (Europa 2003). TELCO equipment level
sound power requirements are specified in GR-63-CORE (Telcordia 2012). Empir-
ical fan laws generally predict that the sound power level of an air-moving device
increases with the fifth power of rotational speed, and this behavior has generally
been validated over the years for typical high-end rack-mounted servers, storage
units, and I/O equipment normally found on data center floors. This means that a 20%
increase in speed (e.g., 3000 to 3600 rpm) equates to a 4 dB increase in noise level.
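A minimal sketch of that empirical fifth-power relationship, which reproduces the roughly 4 dB figure for a 20% speed increase:

```python
import math

def noise_increase_db(speed_ratio: float) -> float:
    """Change in sound power level (dB) for a fan speed ratio N2/N1,
    using the empirical fifth-power law: dLw = 50 * log10(N2/N1)."""
    return 50.0 * math.log10(speed_ratio)

# A 20% speed increase (e.g., 3000 -> 3600 rpm) raises sound power by ~4 dB.
print(round(noise_increase_db(3600 / 3000), 1))  # ~4.0
```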
The first part of the second edition of Thermal Guidelines addressed raising the
recommended operating temperature envelope by 2°C (3.6°F) (from 25°C to 27°C
[77°F to 80.6°F]). While it is not possible to predict a priori the effect on noise levels
of a potential 2°C (3.6°F) increase in data center temperatures, it is not unreasonable
to expect to see increases in the range of 3 to 5 dB for such a rise in ambient temper-
atures as a result of the air-moving devices speeding up to maintain the same cooling
effect. Data center managers and owners should therefore weigh the trade-offs
between the potential benefits in energy efficiency with this original, proposed new
recommended operating environment and the potential risks associated with
increased noise levels.
Because of the new ASHRAE air-cooled-equipment guidelines described in
this chapter, specifically the addition of Classes A3 and A4 with widely extended
operating temperature envelopes, it becomes instructive to look at the allowable
upper temperature ranges and their potential effect on data center noise levels. Using
the fifth power empirical law mentioned above, coupled with current practices for
increasing air-moving device speeds based on ambient temperature, the A-weighted
sound power level increases shown in Table 2.7 were predicted for typical air-cooled
high-end server racks containing a mix of compute, I/O, and water-cooled units.
Of course, the actual increase in noise level for any particular ITE rack depends
not only on the specific configuration of the rack but also on the cooling schemes and
fan speed algorithms used for the various rack drawers and components. Differences
would exist between high-end equipment that employs sophisticated fan speed
control and entry-level equipment using fixed fan speeds or rudimentary speed
control. However, the above increases in noise emission levels with ambient temper-
ature can serve as a general guideline for data center managers and owners
concerned about noise levels and noise exposure for employees and service person-
nel. The IT industry has developed its own internationally standardized test codes
for measuring the noise emission levels of its products (ISO 7779 [2010]) and for
declaring these noise levels in a uniform fashion (ISO 9296 [1988]). Noise emission
limits for ITE installed in a variety of environments (including data centers) are
stated in Statskontoret Technical Standard 26:6 (2004).
The above discussion applies to potential increases in noise emission levels, i.e.,
the sound energy actually emitted by the equipment, independent of listeners in the
room or the environment in which the equipment is located. Ultimately, the real
concern is about the possible increase in noise exposure, or noise immission levels,
experienced by personnel in the data center. With regard to the regulatory workplace
noise limits and protection of employees against potential hearing damage, data
center managers should check whether potential changes in noise levels in their envi-
ronment will cause them to trip various action level thresholds defined in the local,
state, or national codes. The actual regulations should be consulted, as they are
complex and beyond the scope of this document to explain in full. The noise levels
of concern in workplaces are stated in terms of A-weighted sound pressure levels (as
opposed to A-weighted sound power levels used above for rating the emission of
noise sources). For instance, when noise levels in a workplace exceed a sound pres-
sure level of 85 dB(A), hearing conservation programs, which can be quite costly,
are mandated, generally involving baseline audiometric testing, noise level moni-
toring or dosimetry, noise hazard signage, and education and training. When noise
levels exceed 87 dB(A) (in Europe) or 90 dB(A) (in the U.S.), further action, such
as mandatory hearing protection, rotation of employees, or engineering controls,
must be taken. Data center managers should consult with acoustical or industrial
hygiene experts to determine whether a noise exposure problem will result when
ambient temperatures are increased to the upper ends of the expanded ranges
proposed in this book.
In an effort to provide some general guidance on the effects of the proposed
higher ambient temperatures on noise exposure levels in data centers, the following
observations can be made (though, as noted above, it is advised that one seek profes-
sional help in actual situations, because regulatory and legal requirements are at
issue). Modeling and predictions of typical ITE racks in a typical data center with
front-to-back airflow have shown that the sound pressure level in the center of a typi-
cal aisle between two rows of continuous racks will reach the regulatory trip level
of 85 dB(A) when each of the individual racks in the rows has a measured (as
opposed to a statistical upper limit) sound power level of roughly 8.4 B (84 dB sound
power level). If it is assumed that this is the starting condition for a 25°C (77°F)
ambient data center temperature—and many fully configured high-end ITE racks
today are at or above this 8.4-B level—the sound pressure level in the center of the
aisle would be expected to increase to 89.7 dB(A) at 30°C (86°F) ambient, to 91.4
dB(A) at 35°C (95°F) ambient, to 93.4 dB(A) at 40°C (104°F) ambient, and to 97.9
dB(A) at 45°C (113°F) ambient, using the predicted increases to sound power level
above. Needless to say, these levels are extremely high. They are not only above the
regulatory trip levels for mandated action (or fines, in the absence of action), but they
clearly pose a risk of hearing damage unless controls are instituted to avoid exposure
by data center personnel.
Since there are so many different variables and scenarios to consider for ITE
reliability, the approach taken by TC 9.9 was to initially establish a baseline failure
rate (x-factor) of 1.00 that reflected the average probability of failure under a
constant ITE inlet temperature of 20°C (68°F). Table 2.8 provides x-factors at other
constant ITE inlet temperatures for 7 × 24 × 365 continuous operation conditions.
The key to applying the x-factors in Table 2.8 is to understand that they represent
relative failure rates compared to the baseline of a constant ITE inlet temperature of
20°C (68°F). Finally, Table 2.8 provides x-factor data at the average, upper, and
lower bounds. This table was created using the manufacturers' reliability data, which
included all components within the volume server package. In addition, because
there are many variations within a server package among the number of processors,
memory DIMMs, hard drives, etc., the resulting table contains lower and upper
bounds to take account of these. The data set that one opts to use should depend on
the level of risk tolerance for a given application.
It is important to note that the 7 × 24 × 365 use conditions corresponding to the
x-factors in Table 2.8 are not a realistic reflection of the three economization scenar-
ios outlined above. For most climates in the industrialized world, the majority of the
hours in a year are spent at cool temperatures, where mixing cool outdoor air with
air from the hot aisle exhaust keeps the data center temperature in the range of 15°C
to 20°C (59°F to 68°F) (x-factor of 0.72 to 1.00). Furthermore, these same climates
spend only 10% to 25% of their annual hours above 27°C (80.6°F), the upper limit
of the ASHRAE recommended range. The correct way to analyze the reliability
impact of economization is to use climate data to construct a time-weighted average
x-factor. An analysis of time-weighted x-factors will show that, even for the harshest
economization scenario (chillerless), the reliability impact of economization is
much more benign than the 7 × 24 × 365 x-factor data in Table 2.8 would indicate.
A summary of time-weighted x-factors for air-side economization for a variety of
U.S. cities is shown in Figure 2.8. The data assume a 1.5°C (2.7°F) temperature rise
between the outdoor air temperature and the equipment inlet air temperature. More
than half of the cities have x-factor values at or below 1.0, and even the warmest cities
show an x-factor of only about 1.25 relative to a traditional air-conditioned data
center that is kept at 20°C (68°F).
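A minimal sketch of the time-weighted x-factor calculation described above: annual hours are binned by equipment inlet temperature, each bin is weighted by its x-factor, and the weighted average scales a baseline failure rate, as in the failure-rate example later in this chapter. The bin hours and the x-factor values above 20°C are placeholders, not data from Table 2.8 or from any particular climate.

```python
# Illustrative time-weighted x-factor calculation (placeholder numbers).
# hours_per_bin: annual hours spent at each equipment inlet temperature.
# x_factor:      relative failure rate at that temperature vs. 20 degC = 1.00
#                (real values come from Table 2.8 and local climate data;
#                 only the 0.72 and 1.00 entries are quoted in the text).
hours_per_bin = {15: 3000, 20: 3500, 25: 1600, 30: 560, 35: 100}
x_factor      = {15: 0.72, 20: 1.00, 25: 1.24, 30: 1.48, 35: 1.66}

total_hours = sum(hours_per_bin.values())
weighted_x = sum(hours_per_bin[t] * x_factor[t] for t in hours_per_bin) / total_hours
print(f"time-weighted x-factor: {weighted_x:.2f}")

# Scaling a baseline failure rate, in the spirit of the text's example:
baseline_failures_per_1000 = 4.0
print(f"expected failures per 1000 servers: {baseline_failures_per_1000 * weighted_x:.1f}")
```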
For more information on how to analyze the reliability impact of climate data,
refer to Appendix H. For detailed analyses and bar charts of time-weighted averages
for U.S. and world-wide cities see Appendix I.
It is important to be clear what the relative failure rate values mean. We have
normalized the results to a data center run continuously at 20°C (68°F), which has
the relative failure rate of 1.0. For those cities with values below 1.0, the implication
is that the economizer still functions and the data center is cooled below 20°C (68°F)
(to 15°C [59°F]) for those hours each year. In addition, the relative failure rate shows
the expected increase in the number of failed servers, not the percentage of total serv-
ers failing; e.g., if a data center that experiences 4 failures per 1000 servers incor-
porates warmer temperatures, and the relative failure rate is 1.2, then the expected
failure rate would be 5 failures per 1000 servers. To provide an additional frame of
reference on data center hardware failures, sources showed blade hardware server
failures were in the range of 2.5% to 3.8% over 12 months in two different data
centers with supply temperatures approximately 20°C (68°F) (Patterson et al. 2009;
Intel 2008). In a similar data center that included an air-side economizer with
temperatures occasionally ranging to 35°C (95°F) (at an elevation around 1600 m
[5250 ft]), the failure rate was 4.5%. These values are provided solely for guidance
with an example of failure rates. In these studies, a failure was deemed to have
occurred each time a server required hardware attention. No attempt to categorize
the failure mechanisms was made.
To provide additional guidance on the use of Table 2.8, Appendix H gives a prac-
tical example of the impact of a compressorless cooling design on hardware failures,
and Appendix I provides ITE reliability data for selected major U.S. and global cities.
The PCB laminate material (commonly FR4) provides the electrical isolation between board signals. With either increased
moisture in the PCB or higher temperature within the PCB, transmission line losses
increase. Signal integrity may be significantly degraded as the board’s temperature
and moisture content increase. Moisture content changes relatively slowly, on the
order of hours and days, based on the absorption rate of the moisture into the board.
Outer board layers are affected first, and longer-term moisture exposure affects these
layers first. Temperature changes occur on the order of minutes and can quickly affect
performance. As more high-speed signals are routed in the PCB, both temperature
and humidity will become even greater concerns for ITE manufacturers. The cost of
PCB material may increase significantly and may increase the cost of Classes A3 and
A4 rated ITE. The alternative for the ITE manufacturer is to use lower speed bus
options, which will lower performance.
Excessive exposure to high humidity can induce performance degradations or
failures at various circuitry levels. At the PCB level, conductive anodic filament
grows along the delaminated fiber/epoxy interfaces where moisture facilitates the
formation of a conductive path (Turbini and Ready 2002; Turbini et al. 1997). At the
substrate level, moisture can cause surface dendrite growth between pads of opposite
bias due to electrochemical migration. This is a growing concern due to continuing
C4 pitch refinement. At the silicon level, moisture can induce degradation or loss of
the adhesive strength in the dielectric layers, while additional stress can result from
hygroscopic swelling in package materials. The combination of these two effects
often causes delamination near the die corner region where thermal-mechanical stress
is inherently high and more vulnerable to moisture. It is worth noting that temperature
plays an important role in moisture effects. On one hand, higher temperature
increases the diffusivity coefficients and accelerates the electrochemical reaction. On
the other hand, the locally higher temperature due to self heating also reduces the local
RH, thereby drying out the circuit components and enhancing their reliability.
In addition to the above diffusion-driven mechanism, another obvious issue
with high humidity is condensation. This can result from sudden ambient tempera-
ture drop or the presence of a lower temperature source for water-cooled or refrig-
eration-cooled systems. Condensation can cause failures in electrical and
mechanical devices through electrical shorting and corrosion. Other examples of
failure modes exacerbated by high RH include hygroscopic dust failures (Comizzoli
et al. 1993), tape media errors and excessive wear (Van Bogart 1995), and corrosion.
These failures are found in environments that exceed 60% RH for extended periods
of time.
As a rule, the typical mission-critical data center must give the utmost consideration
to the trade-offs before operating with an RH that exceeds 60%, for the
following reasons:
• It is well known that moisture and pollutants are necessary for metals to
corrode. Moisture alone is not sufficient to cause atmospheric corrosion.
Pollution aggravates corrosion in the following ways:
• Corrosion products, such as oxides, may form and protect the metal
and slow down the corrosion rate. In the presence of gaseous pollutants
like SO2 and H2S and ionic pollutants like chlorides, the corrosion-
product films are less protective, allowing corrosion to proceed some-
what linearly. When the RH in the data center is greater than the deli-
quescent RH of the corrosion products, such as copper sulfate, cupric
chloride, and the like, the corrosion-product films become wet, dramat-
ically increasing the rate of corrosion. Cupric chloride, a common cor-
rosion product on copper, has a deliquescence RH of about 65%. A
data center operating with RH greater than 65% would result in the
cupric chloride absorbing moisture, becoming wet, and aggravating the
copper corrosion rate.
• Dust is ubiquitous. Even with our best filtration efforts, fine dust will be pres-
ent in a data center and will settle on electronic hardware. Fortunately, most
dust has particles with high deliquescent RH, which is the RH at which the
dust absorbs enough water to become wet and promote corrosion and/or ion
migration. When the deliquescent RH of dust is greater than the RH in the
data center, the dust stays dry and does not contribute to corrosion or ion
migration. However, on the rare occasion when the dust has a deliquescent
RH lower than the RH in the data center, the dust will absorb moisture,
become wet, and promote corrosion and/or ion migration, degrading hardware
reliability. A study by Comizzoli et al. (1993) showed that, for various loca-
tions worldwide, leakage current due to dust that had settled on printed circuit
boards increased exponentially with RH. This study leads us to the conclusion
that maintaining the RH in a data center below about 60% will keep the leak-
age current from settled fine dust in the acceptable subangstrom range.
Any potential reduction in MTBF is directly related to the level of the elevated temper-
atures. Diligent management of elevated temperatures to minimize event duration
should minimize any residual effect on MTBF. The guidance provided here should
allow the user to quantify the ITE failure rate impact of both their economization
scenarios and the climate where their data center facility is located. Refer to Appen-
dix F for specific recommended and allowable temperature limits.
The reasons for the original recommended envelope have not gone away. Oper-
ation at wider extremes will have energy and/or reliability impacts. A compressor-
less data center could actually provide better reliability than its tightly controlled
counterpart.
3 Environmental Guidelines for Liquid-Cooled Equipment
Chapter 2 documented the expanded data center environmental guidelines by
adding two more envelopes that are wider in temperature and humidity. However,
those guidelines are for air-cooled IT equipment (ITE) and do not address water
temperatures provided by facilities for supporting liquid-cooled equipment (here,
"liquid-cooled ITE" refers to ITE cooled by any liquid, such as water, refrigerant, or
dielectric fluid, within the design control of the IT manufacturers). TC 9.9 published Liquid Cooling
Guidelines for Datacom Equipment Centers in 2006 (ASHRAE 2006), which
focused mostly on the design options for liquid-cooled equipment and did not
address the various facility water temperature ranges possible for supporting liquid-
cooled equipment. This chapter describes the classes for the temperature ranges of
the facility supply of water to liquid-cooled ITE. The location of this interface is the
same as that defined in Liquid Cooling Guidelines and detailed in Chapter 4 of that
book. In addition, Chapter 3 of this book and several appendices reinforce some of
the information provided in Liquid Cooling Guidelines on the interface between the
ITE and infrastructure in support of the liquid-cooled ITE. Since the classes cover
a wide range of facility water temperatures supplied to the ITE, a brief description
is provided for the possible infrastructure equipment that could be used between the
liquid-cooled ITE and the outdoor environment.
The global interest in expanding the temperature and humidity ranges for air-
cooled ITE continues to increase, driven by the desire to achieve higher data center
operating efficiency and lower total cost of ownership (TCO). For these reasons,
liquid cooling of ITE can provide high performance while achieving high energy effi-
ciency at power densities beyond those of air-cooled equipment and simultaneously enabling
use of waste heat when supply facility water temperatures are high enough. Chapter 3
specifies the environmental classes for the temperature of water supplied to ITE.
By creating these new facility water-cooling classes and not mandating use of
a specific class, TC 9.9 provides server manufacturers the ability to develop products
for each class depending on customer needs and requirements.
Developing these new classes for the commercial IT manufacturers, in consul-
tation with the Energy Efficient High Performance Computing (EE HPC) Working
Group, should produce better results, since the sharing of critical data has resulted
in broader environmental specifications than would otherwise be possible.
Increasing ITE power densities continue to challenge the facilities that house them. To meet this challenge, the use of direct water or refrig-
erant cooling at the rack or board level is now being deployed. The ability of water
and refrigerant to carry much larger amounts of heat per volume or mass also offers
tremendous advantages. The heat from these liquid-cooling units is in turn rejected
to the outdoor environment by using either air or water to transfer heat out of the
building, or, in some facilities, it is used for local space heating. Because of the oper-
ating temperatures involved with liquid-cooling solutions, water-side economiza-
tion fits in well.
Liquid cooling can also offer advantages in terms of lower noise levels and close
control of electronics temperatures. However, liquid in electronic equipment raises
a concern about leaks. This is an issue because maintenance, repair, and replacement
of electronic components result in the need to disconnect and reconnect the liquid
carrying lines. To overcome this concern, IT OEM designers sometimes use a
nonconductive liquid, such as a refrigerant or a dielectric fluid, in the cooling loop
for the ITE.
In the past, high-performance mainframes were often water-cooled, and the
internal piping was supplied by the IT OEM. Components available today have simi-
lar factory-installed and leak-tested piping that can accept the water from the
mechanical cooling system, which may also employ a water-side economizer.
Increased standardization of liquid-cooled designs for connection methods and loca-
tions will also help expand their use by minimizing piping concerns and allowing
interchangeability of diverse liquid-cooled IT products.
The choice to move to liquid cooling may occur at different times in the life of
a data center. There are three main times, discussed below, when the decision
between air and liquid cooling must be made. Water’s thermal properties were
discussed earlier as being superior to air. This is certainly the case but does not mean
that liquid cooling is invariably more efficient than air cooling. Both can be very effi-
cient or inefficient, and which is best generally has more to do with design and appli-
cation than the cooling fluid. In fact, modern air-cooled data centers with air
economizers are often more efficient than many liquid-cooled systems. The choice
of liquid-cooled versus air-cooled generally has more to do with factors other than
efficiency.
New Construction
In the case of a new data center, the cooling architect must consider a number of
factors, including data center workload, availability of space, location-specific
issues, and local climate. If the data center will have an economizer, and the climate
is best suited to air-side economizers because of mild temperatures and moderate
humidity, then an air-cooled data center may make the most sense. Conversely, if the
climate is primarily dry, then a water-side economizer may be ideal, with the cooling
fluid conveyed either to the racks or to a coolant distribution unit (CDU).
Liquid cooling more readily enables the reuse of waste heat. If a project is
adequately planned from the beginning, reusing the waste energy from the data
center may reduce energy use of the site or campus. In this case, liquid cooling is the
obvious choice because the heat in the liquid can most easily be transferred to other
locations. Also, the closer the liquid is to the components, the higher the quality of
the heat will be that is recovered and available for alternate uses.
Expansions
Another common application for liquid cooling is adding or upgrading equipment
in an existing data center. Existing data centers often do not have large raised floor
heights, or the raised floor plenum is full of obstructions such as cabling. If a new
rack of ITE is to be installed that is of higher power density than the existing raised-
floor air-cooling can support, liquid cooling can be the ideal solution. Current typi-
cal air-cooled rack powers can range from 6 to 30 kW. In many cases, rack powers
of 30 kW are well beyond what legacy air cooling can handle. Liquid cooling to a
datacom rack, cabinet-mounted chassis, cabinet rear door, or other localized liquid-
cooling system can make these higher-density racks nearly room neutral by cooling
the exhaust air down to room temperature levels.
High Density and HPC
Because of the energy densities found in many high-performance computing (HPC)
applications and next-generation high-density routers, liquid cooling can be a very
appropriate technology if the room infrastructure is in place to support it. One of the
main cost and performance drivers for HPC is the node-to-node interconnect.
Because of this, HPC typically is driven toward higher power density than is a typi-
cal enterprise or internet data center. Thirty-kilowatt racks are typical, with densities
extending as high as 80 to 120 kW. Without some implementation of liquid cooling,
these higher powers would be very difficult if not impossible to cool. The advan-
tages of liquid cooling increase as the load densities increase. More details on the
subject of liquid cooling can be found in Liquid Cooling Guidelines for Datacom
Equipment Centers (ASHRAE 2006).
Several implementations of liquid cooling may be deployed. The most common
are as follows:
Note that new options, such as single- and two-phase immersion baths, are
becoming available that may become more prevalent in the near future.
The CDU may be external to the datacom rack, as shown in Figure 3.1, or below
or within the datacom rack, as shown in Figure 3.2.
Figures 3.1 and 3.2 show the interfaces for a liquid-cooled rack with remote heat
rejection. The interface is located at the boundary of the facility water system loop
and does not impact the datacom equipment cooling system loops, which are
Figure 3.2 Combination air- and liquid-cooled rack or cabinet with internal
CDU.
TCS coolant directly to cold plates attached to DECS internal components in addi-
tion to or in place of a separate internal DECS. As seen in Figure 3.3, the water guide-
lines that are discussed in this document are at the chilled-water systems (CHWS)
loop. If chillers are not installed, then the guidelines would apply to the condenser
water systems (CWS) loop.
Although not specifically noted, a building-level CDU may be more appropriate
where there are a large number of racks connected to liquid cooling. In this case, the
location of the interface is defined the same as in Figure 3.1, but the CDU as shown
would be a building-level rather than a modular unit. Building-level CDUs handling many megawatts of power have been built for large HPC systems. Although Figure 3.1 shows liquid cooling distributed under a raised floor, the liquid could be distributed above the ceiling just as efficiently.
Table 3.1 Facility Water Supply Temperature Classes (excerpt): Class W2 (water-side economizer), 2°C to 27°C (35.6°F to 80.6°F); Class W4 (with dry-cooler or cooling tower), 2°C to 45°C (35.6°F to 113°F).
For IT designs that must accept the higher supply water temperatures referenced by the ASHRAE classes in Table 3.1, enhanced thermal designs are required to maintain the liquid-cooled components within the desired temperature limits. Generally, the higher the supply water temperature, the higher the cost of the cooling solution.
Class W1/W2: This is typically a data center that is traditionally cooled using chill-
ers and a cooling tower, but with an optional water-side economizer to improve
energy efficiency, depending on the location of the data center (see Figure 3.4).
Class W3: For most locations, these data centers may be operated without chillers.
Some locations will still require chillers (see Figure 3.4).
Class W4: These data centers are operated without chillers to take advantage of
energy efficiency and reduce capital expense (see Figure 3.5).
Class W5: Water temperature is high enough to make use of the water exiting the
ITE to heat local buildings in order to take advantage of energy efficiency, reduce
capital expense through chillerless operation, and also make use of the waste energy
(see Figure 3.6).
The facility supply water temperatures specified in Table 3.1 are requirements
to be met by the ITE for the specific class of hardware manufactured. For the data
center operator, the use of the full range of temperatures within the class may not be
required or even desirable given the specific data center infrastructure design.
There is currently no widespread availability of ITE in Classes W3–W5. Future
product availability in this range will be based on market demand. It is anticipated
that future designs in these classes may involve trade-offs between IT cost and
performance. At the same time, these classes will allow lower-cost data center infra-
structure in some locations. The choice of IT liquid-cooling class should involve a
TCO evaluation of the combined infrastructure and IT capital and operational costs.
Figure 3.7 Typical water flow rates for constant heat load.
There may be opportunities for heat recovery for building use even in Classes W3 and W4, depending on the configuration and design specifications of the systems to which the waste heat would be supplied.
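As a rough illustration of the flow-rate relationship plotted in Figure 3.7, the water flow needed to carry a constant heat load follows from a simple energy balance. The Python sketch below uses an assumed 30 kW load and 10 K water temperature rise, which are illustrative values and not taken from the figure.

# Water flow required to carry a fixed heat load at a given temperature rise
# (illustrative sketch only, not design guidance).
RHO_WATER = 998.0   # kg/m^3, water near room temperature
CP_WATER = 4186.0   # J/(kg*K)

def water_flow_lps(heat_load_w, delta_t_k):
    """Volumetric water flow in liters per second for a given load and delta-T."""
    mass_flow_kgs = heat_load_w / (CP_WATER * delta_t_k)
    return mass_flow_kgs / RHO_WATER * 1000.0

# Example: a 30 kW rack with a 10 K water temperature rise needs ~0.72 L/s (~11 gpm).
print(round(water_flow_lps(30000, 10), 2))

For a constant heat load, halving the allowable water temperature rise doubles the required flow.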
Facility water quality requirements (excerpt): pH 7 to 9.
A full discussion of the mechanisms and chemistries involved is beyond the scope of this section; the most common problems in cooling systems are described in Appendix J.
Temperature and humidity ranges for NEBS spaces may differ from those
shown in Table 2.3. The specific NEBS ranges are specified in the GR-63-CORE
standard (Telcordia 2012). The facility supply water temperature identified in
Table 3.1 should be reviewed and adjusted as needed to ensure compliance with the
GR-63 equipment inlet temperature parameters.
While the use of distributed refrigerant systems is supportive of NEBS envi-
ronments, deployment of these systems is not limited specifically to these environ-
ments. The use of distributed refrigerant or other dielectrics is suitable for
deployment in IT and other equipment spaces.
3.3.4 Connections
Refrigerant distribution systems use rigid copper piping compliant with ASTM
Type L/EN0157, Type Y, or Type ACR. Connections are required to be brazed, not
soldered, to ensure reliability. Type ACR is recommended for full interoperability
between manufacturers of distributed refrigerant systems.
Connections to the distribution infrastructure follow guidelines similar to those for water (ASHRAE 2006), with the ability to connect via a direct drop of rigid copper or via a quick-connection port and associated flexible hose arrangement.
The three tests listed above are hierarchical in nature, and the user should read
all of them prior to choosing the one that best fits their application. In some cases,
the proper test may be a mix of the above. For instance, a data center with low overall
power density but with localized high-density areas may elect to perform the test
listed in Section 4.1, “Facility Health and Audit Tests,” for the entire facility but also
perform the test listed in Section 4.2, “Equipment Installation Verification Tests,” for
the area with localized high power density.
The following sections outline the three recommended tests for measuring
temperature and humidity.
• Establish at least one point for every 3 to 9 m (10 to 30 ft) of aisle or every
fourth rack position, as shown in Figure 4.1.
• Locate points midway along the aisle, centered between equipment rows, as
shown in Figure 4.2.
• Where a hot-aisle/cold-aisle configuration is employed, establish points in cold aisles only,1 as shown in Figure 4.3.
1. Hot-aisle temperature levels do not reflect equipment inlet conditions and, therefore, may
be outside the ranges defined in Table 2.3. Hot-aisle temperature levels may be measured
to help understand the facility, but significant temperature variation with measurement
location is normal.
The objective of these measurements is to ensure that the aisle temperature and
humidity levels are all being maintained within the recommended operating condi-
tions of the class environment, as noted in Table 2.3 (see Chapter 2).
4.1.3 Evaluation
• Measure and record temperature and humidity at the geometric center of the
air intake of the top, middle, and bottom racked equipment at 50 mm
(approximately 2 in.) from the front of the equipment. For example, if there
are 20 servers in a rack, measure the temperature and humidity at the center
of the first, tenth or eleventh, and twentieth server. Figure 4.4 shows example
monitoring points for configured racks. For configurations with three or fewer pieces of equipment per cabinet, measure the inlet temperature and humidity
of each piece of equipment at 50 mm (approximately 2 in.) from the front at
the geometric center of each piece of equipment, as shown in Figure 4.4.
• All temperature and humidity levels should fall within the specifications for
the class environment specified in Table 2.3 (see Chapter 2). If any measure-
ment falls outside of the desired operating condition as specified by
Chapter 2, the facility operations personnel may wish to consult with the
equipment manufacturer regarding the risks involved.
• This test is the same as that in paragraph 1 of Section 4.2 above, except that
the temperature and humidity across the entire intake of the problematic piece
of equipment are monitored. The objective here is to determine if air is being
drawn into the equipment within the allowable conditions specified for the
class environment shown in Table 2.3 (see Chapter 2).
• If any measurement falls outside the allowable conditions, the facility operations personnel may wish to consult with the equipment manufacturer regarding the risks involved or to correct the out-of-range condition.
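As a small illustration of the top/middle/bottom selection described above, the following Python sketch picks which rack positions to instrument; the function name and interface are illustrative only.

def monitoring_positions(num_devices):
    """1-based positions of the devices whose inlets should be measured.

    Three or fewer devices: measure every device; otherwise measure the
    bottom, middle, and top devices, per the guidance above.
    """
    if num_devices <= 3:
        return list(range(1, num_devices + 1))
    return [1, (num_devices + 1) // 2, num_devices]

print(monitoring_positions(20))  # [1, 10, 20]: the first, tenth, and twentieth server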
Figure 5.5 Example of hot and cold aisles for nonraised floor.
Hot-aisle/cold-aisle configurations deliver cool air to the intakes of the electronic equipment and allow for the efficient extraction of the warmed air discharged by the equipment.
Recirculation can be reduced through tight cabinet placement and the use of
equipment blanking panels, as described in Section 5.1.3. It is the responsibility of
the facility operations personnel to determine the best way to implement hot-aisle/
cold-aisle configurations. Figure 5.4 shows an example of this configuration using
underfloor cooling found in a typical data center.
Figure 5.5 shows a typical non-raised-floor implementation. The overhead
ventilation system uses multiple air diffusers that inject cool air vertically (down-
ward) into the cold aisles.
Airflow patterns in a data center can be difficult to predict without training and tools. To make the task easier, keep equipment with the same type of airflow patterns together, with all exhausts facing the same direction, toward the hot aisle.
In implementations that do not use the hot-aisle/cold-aisle configuration,
warmed air discharged from the rear of one cabinet can be drawn into the front of
a nearby cabinet. This warmed air can be further warmed by the next row of equip-
ment and so on. This can create a potentially harmful situation for the equipment in
the cabinets farther to the rear. If not addressed, this condition would contribute to
increased equipment failures and system downtime. Therefore, place cabinets that
cannot use hot-aisle/cold-aisle configurations together in another area of the data
center, being careful to ensure that exhaust patterns from various equipment sections
are not drawn into equipment inlets. Again, the temperature measurements can
document the effect of recirculated hot air and should be compared to the recom-
mended and allowable temperature ranges.
Aisle pitch is defined as the distance between the center of the reference cold aisle
and the center of the next cold aisle in either direction. A common aisle pitch for data
centers is seven floor tiles, based on two controlling factors. First, it is advisable to
allow a minimum of one complete floor tile in front of each rack. Second, maintain-
ing a minimum of 914 mm (3 ft) in any aisle with wheelchair access may be required by Section 4.3.3 of the Americans with Disabilities Act (ADA) Standards for Accessible Design, 28 CFR Part 36 (DOJ 1994). Based on the standard-sized domestic floor tile, these two factors
result in a seven-tile pitch, allowing two accessible tiles in the cold aisle, 914.4 mm
(3 ft) in the hot aisle, and reasonably deep rack equipment, as shown in Figure 5.6.
Table 5.1 lists potential equipment depth for a seven-tile pitch.
a. If considering a pitch other than seven floor tiles, it is advised to increase or decrease the pitch in
whole tile increments. Any overhang into the cold aisle should take into account the specific design
of the front of the rack and how it affects access to and flow through the tile.
b. Nominal dimension assumes no overhang; less if front door overhang exists.
c. Typically, a nominally 1 m rack is 1070 mm deep with the door and would overhang the front tile by 3 mm for a U.S. configuration and 27 mm for a global configuration.
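The arithmetic behind the seven-tile pitch can be checked directly. The short Python sketch below uses the nominal 2 ft (610 mm) tile, the 3 ft (914 mm) hot aisle, and the 1070 mm rack depth discussed above; it is an illustration of the geometry, not a design tool.

# Seven-tile aisle pitch geometry (nominal values).
TILE_MM = 609.6                 # standard 2 ft (24 in.) raised-floor tile
pitch = 7 * TILE_MM             # cold-aisle center to the next cold-aisle center
cold_aisle = 2 * TILE_MM        # two accessible (vented) tiles in the cold aisle
hot_aisle = 914.4               # 3 ft minimum aisle width

depth_per_row = (pitch - cold_aisle - hot_aisle) / 2
print(round(pitch), round(depth_per_row))   # 4267 mm pitch, ~1067 mm per equipment row
# A rack that is 1070 mm deep with its door would overhang the cold-aisle
# tile by roughly 1070 - 1067 = 3 mm, consistent with note c above.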
Some installations require that the rear of a cabinet line up with the edge of a
removable floor tile to facilitate underfloor service, such as pulling cables. Adding
this constraint to a seven-tile pitch results in a 4 ft-wide hot aisle and forces a cold
aisle of less than 4 ft, with only one row of vented tiles and more limited cooling
capacity, as shown in Figure 5.7.
For larger cabinet sizes and/or higher power density equipment, it may be
advantageous to use an eight-tile pitch. Similarly, smaller equipment, especially
telecom form factors, can take advantage of tighter pitches. For example, a published
ANSI standard defines a Universal Telecommunication Framework (UTF) (ANSI
2009), where the baseline depth of the frame shall be 600 mm (23.6 in.); deeper
equipment may be permitted in special deeper lineups of 750 mm (29.5 in.) or 900
mm (35.4 in.) depths. All configurations need to be examined on a case-by-case
basis.
Aisle pitch determines how many perforated floor tiles can be placed in a cold
aisle. The opening in the tile together with the static pressure in the raised floor
plenum determines how much supply airflow is available to cool the ITE. Chapter
6 discusses how to determine the volume of airflow required by the design of the ITE.
6
Equipment Manufacturers’
Heat and Airflow Reporting
Chapter 6 provides guidance to users for estimating heat release from information
technology equipment (ITE) similar to what was developed by Telcordia in GR-
3028-CORE (2001) for the telecom market. Some ITE manufacturers provide
sophisticated tools to more accurately assess power consumption and airflow. When
available, the manufacturer should be consulted and data from their tools should be
used to provide more specific information than may be available in the thermal
report.
Values in the thermal report should be determined under the following conditions:
• Steady state
• User controls or programs set to a utilization rate that maximizes the number
of simultaneous components, devices, and subsystems that are active
• Nominal voltage input
• Ambient temperature between 18°C and 27°C (64.4°F and 80.6°F)
• Air-moving devices at nominal speed
Airflow values should be reflective of those that would be seen in a data center.
Representative racking, cabling, and loading should be taken into account in airflow
reporting. Some ITE manufacturers employ variable-speed fans, which can result in
a large variance in airflow due to equipment loading and ambient conditions.
Airflow reporting should be based on the following conditions:
• The values predicted for tested configurations are within 10% of the measured
values.
• When the predicted values vary by more than 10% from the measured values,
the predictive algorithm is updated and revalidated.
Table 6.1 Example Thermal Report (Continued)
Minimum configuration (1 CPU-A, 1 GB, 2 I/O): typical heat release 1765 W (@110 V); nominal airflow (see note a) 400 cfm (680 m³/h); maximum airflow at 35°C (95°F) 600 cfm (1020 m³/h); weight 896 lb (406 kg); dimensions (see note b) (W × D × H) 30 × 40 × 72 in. (762 × 1016 × 1828 mm).
Typical configuration (4 CPU-A, 8 GB, 32 I/O; 2 GB cards, 1 frame): typical heat release 5040 W (@110 V); nominal airflow (see note a) 555 cfm (943 m³/h); maximum airflow at 35°C (95°F) 833 cfm (1415 m³/h); weight 1040 lb (472 kg); dimensions (see note b) (W × D × H) 30 × 40 × 72 in. (762 × 1016 × 1828 mm).
Full configuration: typical heat release 10,740 W (@110 V); nominal airflow (see note a) 750 cfm (1275 m³/h); maximum airflow at 35°C (95°F) 1125 cfm (1913 m³/h); weight 1528 lb (693 kg); dimensions (see note b) (W × D × H) 61 × 40 × 72 in. (1549 × 1016 × 1828 mm).
Cooling scheme: F-R (front-to-rear). ASHRAE class: A1, A2. (The airflow diagram from the report is not reproduced here.)
a. Airflow values are for an air density of 1.2 kg/m³ (0.075 lb/ft³). This corresponds to air at 18°C (64.4°F), 101.3 kPa (14.7 psia), and 50% RH.
b. Footprint does not include service clearance or cable management, which is zero on the sides, 46 in. (1168 mm) in the front, and 40 in. (1016 mm) in the rear.
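For readers who collect thermal reports from several manufacturers, the fields illustrated in Table 6.1 map naturally onto a simple record. The Python sketch below uses illustrative field names; it is not an ASHRAE-defined schema.

from dataclasses import dataclass

@dataclass
class ThermalReportEntry:
    """One configuration row from an ITE manufacturer's thermal report (illustrative)."""
    configuration: str                 # e.g., "Minimum", "Typical", or "Full"
    heat_release_w: float              # typical heat release at nominal voltage
    airflow_nominal_cfm: float         # nominal fan speed, 1.2 kg/m^3 air density
    airflow_max_cfm: float             # maximum fan speed at 35°C (95°F) inlet
    weight_lb: float
    dimensions_in: tuple               # (W, D, H)
    ashrae_class: str                  # e.g., "A1, A2"
    cooling_scheme: str                # e.g., "F-R" (front-to-rear)

typical = ThermalReportEntry("Typical", 5040, 555, 833, 1040, (30, 40, 72), "A1, A2", "F-R")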
A template for the current recommended power and performance data sheet (PPDS) can be found on the ENERGY STAR Web site (DOE 2012a).
Note that in item 8 above, the ASHRAE thermal report is called out directly.
The EPA requires that, beyond the server configuration, the manufacturer also
report the following:
• Total power dissipation in watts (Note this is not the same value as the electri-
cal nameplate data, which should not be used for sizing cooling systems.)
• Delta temperature at 35°C (95°F) inlet
• Airflow (cfm) at max fan speed at max inlet temperature (35°C [95°F])
• Airflow (cfm) at nominal fan speed at nominal temperature (18°C to 27°C
[64.4°F to 80.6°F])
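These reported quantities are related by a simple sensible-heat balance. The Python sketch below assumes the same 1.2 kg/m³ air density noted in the thermal report footnotes; the 5040 W and 555 cfm inputs are the example values from Table 6.1 and are used only for illustration.

# Sensible heat balance: P = rho * cp * V * dT (SI units throughout).
RHO_AIR = 1.2          # kg/m^3, per the thermal report footnote
CP_AIR = 1006.0        # J/(kg*K) for dry air
CFM_PER_M3S = 2118.88  # cubic feet per minute in one cubic meter per second

def delta_t_c(power_w, airflow_cfm):
    """Air temperature rise across the equipment for a given power and airflow."""
    volume_flow = airflow_cfm / CFM_PER_M3S
    return power_w / (RHO_AIR * CP_AIR * volume_flow)

# Example: 5040 W dissipated with 555 cfm of airflow gives roughly a 16°C rise.
print(round(delta_t_c(5040, 555), 1))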
The version 1.0 ENERGY STAR specification is built around a Class A2 server.
The version 2.0 specification is still under development but can be expected to add
additional reporting requirements to what is already in place.
As ENERGY STAR servers (or any servers with the above capability) become more prevalent, the ability to provide a higher level of integration between
IT management and the building management systems will allow the data center
designer and operator to more fully optimize the data center for maximum efficiency.
References and Bibliography
ACGIH. 1992. 1992–1993 Threshold Limit Values for Chemical Substances and
Physical Agents and Biological Exposure Indices. Cincinnati: American Con-
ference of Governmental Industrial Hygienists.
ANSI. 1997. ANSI T1.304-1997, Ambient Temperature and Humidity Require-
ments for Network Equipment in Controlled Environments. New York: Amer-
ican National Standards Institute.
ANSI. 2009. Engineering Requirements for a Universal Telecommunication
Framework (UTF), Standards Committee T1 Telecommunications, Working
Group T1E1.8, Project 41. New York: American National Standards Institute.
ASHRAE. 2004. Thermal Guidelines for Data Processing Environments. Atlanta:
ASHRAE.
ASHRAE. 2005a. Datacom Equipment Power Trends and Cooling Applications.
Atlanta: ASHRAE.
ASHRAE. 2005b. Design Considerations for Datacom Equipment Centers.
Atlanta: ASHRAE.
ASHRAE. 2006. Liquid Cooling Guidelines for Datacom Equipment Centers.
Atlanta: ASHRAE.
ASHRAE. 2008. Thermal Guidelines for Data Processing Environments, Second
Edition. Atlanta: ASHRAE.
ASHRAE. 2009a. ASHRAE Handbook—Fundamentals. Atlanta: ASHRAE.
ASHRAE. 2009b. Design Considerations for Datacom Equipment Centers, Sec-
ond Edition. Atlanta: ASHRAE.
ASHRAE. 2009c. Particulate and Gaseous Contamination in Datacom Environ-
ments. Atlanta: ASHRAE.
ASHRAE. 2009d. Weather Data Viewer Software, Version 4. Atlanta: ASHRAE.
ASHRAE. 2010. ANSI/ASHRAE/IES Standard 90.1-2010, Energy Standard for
Buildings Except Low-Rise Residential Buildings. Atlanta: ASHRAE.
ASHRAE. 2011. ASHRAE Handbook—HVAC Applications. Atlanta:
ASHRAE.
ASHRAE. 2012. Datacom Equipment Power Trends and Cooling Applications,
Second Edition. Atlanta: ASHRAE.
Atwood, D., and J. Miner. 2008. Reducing data center cost with an air economizer.
Brief, Intel Corporation, Santa Clara, CA. http://www.intel.com/content/
www/us/en/data-center-efficiency/data-center-efficiency-xeon-reducing-data-
center-cost-with-air-economizer-brief.html.
Blinde, D., and L. Lavoie. 1981. Quantitative effects of relative and absolute
humidity on ESD generation/suppression. Proceedings of EOS/ESD Sympo-
sium, vol. EOS-3, pp. 9–13.
Comizzoli, R.B., R.P. Frankenthal, R.E. Lobnig, G.A. Peins, L.A. Psato-Kelty,
D.J. Siconolfi, and J.D. Sinclair. 1993. Corrosion of electronic materials and
devices by submicron atmospheric particles. The Electrochemical Society
Interface 2(3):26–34.
Cohen, J.E., and C. Small. 1998. Hypsographic demography: The distribution of
human population by altitude. Proceedings of the National Academy of Sci-
ences of the United States of America 95(24):14009–14.
DOE. 2012a. ENERGY STAR Enterprise Servers. U.S. Environmental Protection
Agency, U.S. Department of Energy, Washington, D.C. http://www.energys-
tar.gov/index.cfm?c=archives.enterprise_servers.
DOE. 2012b. ENERGY STAR Server Data Sheet. U.S. Environmental Protection
Agency, U.S. Department of Energy, Washington, D.C. http://www.energys-
tar.gov/ia/partners/prod_development/new_specs/downloads/servers/
Server_Data_Sheet_04-24-09.xls?88d8-ab69.
DOJ. 1994. Code of Federal Regulations, Standards for Accessible Design, 28
CFR Part 36, Section 4.3.3 Width. Revised as of July 1, 1994. U.S. Depart-
ment of Justice, ADA Standards for Accessible Design, Washington, D.C.
http://www.usdoj.gov/crt/ada/adastd94.pdf.
EIA. 1992. EIA-310, Revision D, Sept. 1, 1992, Racks, Panels and Associated
Equipment. Electronic Industries Association.
ETSI. 2009. ETSI EN 300 753 V1.2.1 (2009-03), Equipment Engineering (EE);
Acoustic Noise Emitted by Telecommunications Equipment. http://
webapp.etsi.org/workprogram/Report_WorkItem.asp?WKI_ID=3392.
ETSI. 1994. ETSI 300 019-1-0 (1994-05), Equipment Engineering (EE); Environ-
mental Conditions and Environmental Tests for Telecommunications Equip-
ment Part 1-0: Classification of Environmental Conditions, European
Telecommunications Standards Institute, France.
ETSI. 1999. ETSI EN 300 019-2-3 V2.1.2 (1999-09), Equipment Engineering
(EE), Environmental Conditions and Environmental Tests for Telecommunica-
tions Equipment; Part 2-3: Specification of Environmental Tests; Stationary
Use at Weather Protected Locations. European Telecommunications Stan-
dards Institute, France.
ETSI. 2009. ETSI EN 300 019-1-3 V2.3.2, Equipment Engineering (EE); Envi-
ronmental Conditions and Environmental Tests for Telecommunications
Equipment; Part 1–3: Classification of Environmental Conditions; Stationary
Use at Weatherprotected Locations. European Telecommunications Standards
Institute, France.
Europa. 2003. Directive 2003/10/EC on the minimum health and safety require-
ments regarding the exposure of workers to the risks arising from physical
agents (noise). European Agency for Safety and Health at Work, Bilbao,
Spain. http://osha.europa.eu/en/legislation/directives/exposure-to-physical-
hazards/osh-directives/82.
GPO. 1976. U.S. Standard Atmosphere. U.S. Government Printing Office, Wash-
ington, D.C.
Hamilton, P., G. Brist, G. Barnes Jr., and J. Schrader. 2007. Humidity-dependent
loss in PCB substrates. Proceedings of the Technical Conference, IPC Expo/
APEX 2007, February 20–22, Los Angeles, CA.
Herrlin, M.K. 2005. Rack cooling effectiveness in data centers and telecom central
offices: The rack cooling index (RCI). ASHRAE Transactions 111(2).
Hinaga, S., M.Y. Koledintseva, J.L. Drewniak, A. Koul, and F. Zhou. 2010. Ther-
mal effects on PCB laminate material dielectric constant and dissipation fac-
tor. IEEE Symposium on Electromagnetic Compatibility, July 25–30, Fort
Lauderdale, FL.
HSE. 1992. Workplace Health, Safety and Welfare; Workplace (Health, Safety and
Welfare) Regulations 1992; Approved Code of Practice. Liverpool, Mersey-
side, England: Health and Safety Executive.
IEC. 1999. IEC 60950, Safety of Information Technology Equipment, 3d ed.
Geneva, Switzerland: International Electrotechnical Commission.
IEEE. 1997. IEEE Standard 1156.4-1997, IEEE Standard for Environmental Spec-
ifications for Spaceborne Computer Modules. New York: Institute of Electri-
cal and Electronics Engineers.
ISO. 1988. ISO 9296, Acoustics—Declared Noise Emission Values of Computer
and Business Equipment. Geneva, Switzerland: International Organization for
Standardization.
ISO. 2010. ISO 7779, Acoustics Measurement of Airborne Noise Emitted by Infor-
mation Technology and Telecommunications Equipment, Third Edition.
Geneva, Switzerland: International Organization for Standardization.
Jones, D.A. 1996. Principles and Prevention of Corrosion, 2nd Edition. Upper
Saddle River, NJ: Prentice Hall.
Montoya. 2002. Sematech electrostatic discharge impact and control workshop,
Austin, TX. http://ismi.sematech.org/meetings/archives/other/20021014/mon-
toya.pdf.
NIOSH. 2012. NIOSH Publications and Products, Working in Hot Environments.
Centers for Disease Control and Prevention, National Institute for Occupa-
tional Safety and Health (NIOSH), Atlanta, GA. http://www.cdc.gov/niosh/
docs/86-112/.
OSHA. 1980. Noise Control: A Guide for Workers and Employers. U.S. Depart-
ment of Labor, OSHA, Office of Information, Washington, D.C. http://
www.nonoise.org/hearing/noisecon/noisecon.htm.
OSHA. 1999. OSHA Technical Manual, Section III, Chapter 4, “Heat stress.”
Directive #TED 01-00-015, U.S. Department of Labor, Occupational Safety
and Health Administration, Washington, D.C. http://www.osha.gov/dts/osta/
otm/otm_iii/otm_iii_4.html.
OSHA. 2012a. Occupational Heat Exposure. U.S. Department of Labor, Occupa-
tional Safety and Health Administration, Washington, D.C. http://
www.osha.gov/SLTC/heatstress/index.html.
OSHA. 2012b. Occupational Safety and Health Act of 1970, General Duty Clause,
Section 5(a)(1). U.S. Department of Labor, Occupational Safety and Health
Administration, Washington, D.C. http://www.osha.gov/pls/oshaweb/owa-
disp.show_document?p_id=3359&p_table=OSHACT.
Patterson, M.K. 2008. The effect of data center temperature on energy efficiency.
Proceedings of Itherm Conference, Orlando, Florida.
Patterson, M.K., D. Atwood, and J.G. Miner. 2009. Evaluation of air-side econo-
mizer use in a compute-intensive data center. Interpack’09, July 19–23, San
Francisco, CA, pp. 1009–14.
Sauter, K. 2001. Electrochemical migration testing results—Evaluating printed
circuit board design, manufacturing process and laminate material impacts on
CAF resistance. Proceedings of IPC Printed Circuits Expo, Anaheim, CA.
Simonic, R. 1982. ESD event rates for metallic covered floor standing information
processing machines. Proceedings of the IEEE EMC Symposium, Santa
Clara, CA, pp. 191–98.
Singh, P. 1985. Private communication with the author, Poughkeepsie, NY.
Singh, P., G.T. Gaylon, J.H. Dorler, J. Zahavi, and R. Ronkese. 1992. Potentiody-
namic polarization measurements for predicting pitting of copper in cooling
waters. Corrosion 92, The NACE Annual Conference and Corrosion Show,
Nashville, TN.
Sood, B. 2010. Effects of Moisture Content on Permittivity and Loss Tangent of
PCB Materials. Webinar, Center for Advanced Life Cycle Engineering
(CALCE), University of Maryland, College Park.
Statskontoret. 2002. Technical Standard 26:5, Acoustical Noise Emissions of
Information Technology Equipment. Stockholm, Sweden. http://www-
05.ibm.com/se/ibm/environment/pdf/buller-TN26-5.pdf.
Statskontoret. 2004. Technical Standard 26:6, Acoustical Noise Emission of Infor-
mation Technology Equipment. Stockholm, Sweden. http://arkiv.edelega-
tionen.se/verva/upload/publikationer/2004/2004-TN26-6-Acoustical-Noice-
Emission.pdf.
Telcordia. 2001. GR-3028-CORE, Thermal management in telecommunications
central offices. Telcordia Technologies Generic Requirements, Issue 1,
December 2001. Piscataway, N.J.: Telcordia Technologies, Inc.
Telcordia. 2012. GR-63-CORE, Network equipment—Building system (NEBS)
requirements: Physical protection. Telcordia Technologies Generic Require-
ments, Issue 4, April 2012. Piscataway, N.J.: Telcordia Technologies, Inc.
Turbini, L.J., and W.J. Ready. 2002. Conductive anodic filament failure: A materi-
als perspective. School of Materials Science and Engineering, Georgia Insti-
tute of Technology, Atlanta, GA. http://130.207.195.147/Portals/2/12.pdf.
Turbini, L.J., W.J. Ready, and B.A. Smith. 1997. Conductive anodic filament
(CAF) formation: A potential reliability problem for fine-line circuits. The
Reliability Lab School of Materials Science and Engineering, Georgia Insti-
tute of Technology.
Van Bogart, W.C. 1995. Magnetic tape storage and handling: A guide for libraries
and archives. National Media Laboratory, Oakdale, MN. http://www.clir.org/
pubs/reports/pub54/5premature_degrade.html.
Appendix A
2008 ASHRAE Environmental Guidelines
for Datacom Equipment—Expanding the
Recommended Environmental Envelope
Most of the discussion in Appendix A is taken from the 2008 second edition of Ther-
mal Guidelines. The information here has been updated for the current edition.
The recommended environmental envelope for IT equipment (ITE) is listed in
Table 2.1 of ASHRAE’s Thermal Guidelines for Data Processing Environments
(2004) (see Table A.1 for a comparison between the 2004 and 2008 versions). These
recommended conditions, as well as the allowable conditions, refer to the inlet air
entering the datacom equipment. Specifically, the 2004 edition lists for data centers
in ASHRAE Classes 1 and 2 a recommended environmental range of 20°C to 25°C
(68°F to 77°F) dry-bulb temperature and a relative humidity (RH) range of 40% to
55% (refer to Thermal Guidelines for details on data center type, altitude, recom-
mended vs. allowable, etc.). The allowable and recommended envelopes for Class 1
are shown in Figure A.1.
To provide greater flexibility in facility operations, particularly with the goal of
reduced energy consumption in data centers, ASHRAE TC 9.9 undertook an effort
to revisit the recommended equipment environmental specifications (ASHRAE
2008), specifically the recommended envelope for Classes 1 and 2 (the recom-
mended envelope is the same for both of these environmental classes). The result of
this effort, detailed in this appendix, was to expand the recommended operating envi-
ronment envelope as shown in Table A.1. The purpose of the recommended envelope
was to give guidance to data center operators on maintaining high reliability and also
operating data centers in the most energy-efficient manner.
The allowable envelope is where IT manufacturers test their equipment in order
to verify that it will function within those environmental boundaries. Typically,
manufacturers perform a number of tests prior to the announcement of a product to
verify that it meets all the functionality requirements within this environmental enve-
lope. This is not a statement of reliability but one of functionality of the ITE.
Figure A.2 Inlet and component temperatures with fixed fan speed.
Figure A.3 Inlet and component temperatures with variable fan speed.
Below a certain inlet temperature, the fan runs at a constant speed and the component temperatures track proportionally to the ambient air temperature. Above this inlet temperature, the fan adjusts flow rate such that the component temperature is maintained at a relatively constant value.
These data bring up several important observations:
• Below a certain inlet temperature (23°C [73.4°F] in the case described above),
IT systems that employ variable-speed air-moving devices have constant fan
power, and their component temperatures track fairly closely to ambient tem-
perature changes. Systems that don’t employ variable-speed air-moving
devices track ambient air temperatures over the full range of allowable ambi-
ent temperatures.
• Above a certain inlet temperature (23°C [73.4°F] in the case described above), the speed of the air-moving device increases to maintain a fairly constant component temperature.
As shown in Figure A.3, the IT fan power can increase dramatically as it ramps
up speed to counter the increased inlet ambient temperature. The graph shows a typi-
cal power increase that results in the near-constant component temperature. In this
case, the fan power increased from 11 W at ~23°C (73.4°F) inlet temperature to over
60 W at 35°C (95°F) inlet temperature. The inefficiency in the power supply results
in an even larger system power increase. The total room power (facilities + IT) may
actually increase at warmer temperatures. IT manufacturers should be consulted
when considering system ambient temperatures approaching the upper recom-
mended ASHRAE temperature specification. See Patterson (2008) for a technical
evaluation of the effect of increased environmental temperature, where it was shown
that an increase in temperature can actually increase energy use in a standard data
center but reduce it in a data center with economizers in the cooling system.
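The steep rise in fan power is what the fan affinity laws predict: airflow scales roughly with fan speed, while fan power scales roughly with the cube of fan speed. The Python sketch below illustrates this; the 1.75 speed ratio is an assumed value chosen to reproduce the 11 W to roughly 60 W increase described above, not a number taken from Figure A.3.

# Fan affinity laws (idealized): flow ~ speed, power ~ speed^3.
def fan_power_w(base_power_w, speed_ratio):
    """Estimated fan power after the fan speed changes by speed_ratio."""
    return base_power_w * speed_ratio ** 3

# If the fan must move about 75% more air, an 11 W fan draws roughly 59 W.
print(round(fan_power_w(11, 1.75)))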
Because of the derating of the maximum allowable temperature with altitude for
Classes 1 and 2, the recommended maximum temperature is derated by 1°C/300 m
(1.8°F/984 ft) above 1800 m (5906 ft).
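A minimal sketch of this derating rule follows; the 27°C (80.6°F) recommended maximum and the 3000 m example altitude are illustrative inputs.

def derated_max_temp_c(max_temp_c, altitude_m):
    """Recommended maximum dry-bulb temperature, derated 1°C per 300 m above 1800 m."""
    if altitude_m <= 1800:
        return max_temp_c
    return max_temp_c - (altitude_m - 1800) / 300.0

print(derated_max_temp_c(27.0, 3000))  # 23.0°C at 3000 m (27 - 4)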
MOISTURE LIMITS
High End
Based on extensive reliability testing of printed circuit board laminate materials, it
was shown that conductive anodic filament (CAF) growth is strongly related to RH
(Sauter 2001). As humidity increases, time to failure rapidly decreases. Extended
periods of RH exceeding 60% can result in failures, especially given the reduced
conductor-to-conductor spacing common in many designs today. The CAF mecha-
nism involves electrolytic migration after a path is created. Path formation could be
due to a breakdown of inner laminate bonds driven by moisture, which supports the
electrolytic migration and explains why moisture is so key to CAF formation. The
upper moisture region is also important for disk and tape drives. In disk drives, there
are head flyability and corrosion issues at high humidity. In tape drives, high humid-
ity can increase frictional characteristics of tape and increase head wear and head
corrosion. High RH, in combination with common atmospheric contaminants, is
required for atmospheric corrosion. The humidity forms monolayers of water on
surfaces, thereby providing the electrolyte for the corrosion process. Sixty percent
RH is associated with enough monolayer buildup for the adsorbed moisture to begin taking on fluid-like properties. When humidity levels also exceed the critical equilibrium humidity of a contaminant's saturated salt, a hygroscopic corrosion product forms, further enhancing the buildup of acid-electrolyte surface wetness and
greatly accelerating the corrosion process. Although disk drives do contain internal
means to control and neutralize pollutants, maintaining humidity levels below the
critical humidity levels of multiple monolayer formation retards initiation of the
corrosion process.
A maximum recommended dew point of 15°C (59°F) is specified to provide an
adequate guard band between the recommended and allowable envelopes.
Low End
The motivation for lowering the moisture limit is to allow a greater number of hours
per year where humidification (and its associated energy use) is not required. The
previous recommended lower limit was 40% RH. This correlates on a psychrometric
chart to 20°C (68°F) dry-bulb temperature and a 5.5°C (41.9°F) dew point (lower
left) and a 25°C (77°F) dry-bulb and a 10.5°C (50.9°F) dew point (lower right). The
drier the air is, the greater the risk of electrostatic discharge (ESD). The main
concern with decreased humidity is that the intensity of static electricity discharges
increases. These higher-voltage discharges tend to have a more severe impact on the
operation of electronic devices, causing error conditions requiring service calls and,
in some cases, physical damage. Static charges of thousands of volts can build up on
surfaces in very dry environments. When a discharge path is offered, such as during a maintenance activity, a discharge of this magnitude can damage sensitive electron-
ics. If the humidity level is reduced too far, static dissipative materials can lose their
ability to dissipate charge and then become insulators.
The mechanism of the static discharge and the impact of moisture in the air are
not widely understood. Montoya (2002) demonstrates, through a parametric study,
that ESD charge voltage level is a function of dew point or absolute humidity in the
air and not a function of RH. Simonic (1982) studied ESD events across various
temperature and moisture conditions over a period of a year and found significant
increases in the number of events (20×) depending on the level of moisture content
(winter vs. summer months). It was not clear whether the important parameter was
absolute humidity or RH.
Blinde and Lavoie (1981) studied electrostatic charge decay (vs. discharge) of
several materials and showed that it is not sufficient to specify environmental ESD
protection in terms of absolute humidity; nor is a RH specification sufficient, since
temperature affects ESD parameters other than atmospheric moisture content.
The 2004 recommended range includes a dew-point temperature as low as 5.5°C
(41.9°F). Discussions with ITE manufacturers indicated that there have been no
known reported ESD issues within the 2004 recommended environmental limits. In
addition, the referenced information on ESD mechanisms (Montoya 2002; Simonic
1982; Blinde and Lavoie 1981) does not suggest a direct RH correlation with ESD charge creation or discharge. Montoya (2002) does, however, demonstrate a strong correlation of dew point to charge creation, so a lower humidity limit based upon a minimum dew point (rather than a minimum RH) is proposed. Therefore, the 2008
recommended lower limit is a line from 18°C (64.4°F) dry-bulb temperature and
5.5°C (41.9°F) dew-point temperature to 27°C (80.6°F) dry-bulb temperature and a
5.5°C (41.9°F) dew-point temperature. Over this range of dry-bulb temperatures and
a 5.5°C (41.9°F) dew point, the RH varies from approximately 25% to 45%.
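These endpoints can be verified with a standard psychrometric approximation. The Python sketch below uses one common Magnus-type parameterization of the saturation vapor pressure curve; the coefficients are an assumption of this example rather than values taken from this book.

import math

def sat_vapor_pressure_hpa(t_c):
    """Saturation vapor pressure over water, hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_percent(dry_bulb_c, dew_point_c):
    """Relative humidity for a given dry-bulb and dew-point temperature."""
    return 100.0 * sat_vapor_pressure_hpa(dew_point_c) / sat_vapor_pressure_hpa(dry_bulb_c)

# A 5.5°C dew point gives ~44% RH at 18°C dry bulb and ~25% RH at 27°C dry bulb,
# matching the approximately 25% to 45% range stated above.
print(round(rh_percent(18.0, 5.5)), round(rh_percent(27.0, 5.5)))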
Another practical benefit of this change is that process changes in data centers
and their HVAC systems, in this area of the psychrometric chart, are generally sensi-
ble only (i.e., horizontal on the psychrometric chart). A lower limit expressed as an RH greatly complicates the control and operation of the cooling systems and could require added humidification, at a cost of increased energy, to maintain that RH even when the space is already above the needed dew-point temperature. To avoid
these complications, the hours of economizer operation available using the 2004
guidelines were often restricted.
ASHRAE has a research project to investigate moisture levels and ESD with the
hope of driving the recommended range to a lower moisture level in the future. Beyond ESD concerns, low moisture levels can result in the drying out of lubricants, which can adversely
affect some components. Possible examples include motors, disk drives, and tape
drives. While manufacturers indicated acceptance of the environmental extensions
documented here, some expressed concerns about further extensions. Another
concern for tape drives at low moisture content is the increased tendency to collect
debris on the tape and around the head and tape transport mechanism due to static
buildup.
Where workplace noise levels exceed 85 dB(A) (in Europe) or 90 dB(A) (in the U.S.), further action, such as mandatory hearing protec-
tion, rotation of employees, or engineering controls, must be taken. Data center
managers should consult with acoustical or industrial hygiene experts to determine
whether a noise exposure problem will result from increasing ambient temperatures
to the 2008 upper recommended limit.
Data Center Operation Scenarios for
ASHRAE’s 2008 Recommended Environmental Limits
The recommended ASHRAE guideline is meant to give guidance to IT data center
operators on the inlet air conditions to the ITE for the most reliable operation. Four
possible scenarios where data center operators may elect to operate at conditions
that lie outside the recommended environmental window are listed as follows.
1. Scenario #1: Expand economizer use for longer periods of the year where
hardware failures are not tolerated.
• For short periods of time, it is acceptable to operate outside this recom-
mended envelope and approach the allowable extremes. All manufactur-
ers perform tests to verify that the hardware functions at the allowable
limits. For example, if during the summer months it is desirable to oper-
ate for longer periods of time using an economizer rather than turning on
the chillers, this should be acceptable, as long as the period of warmer
inlet air temperatures to the datacom equipment does not exceed several
days each year; otherwise, the long-term reliability of the equipment
could be affected. Operation near the upper end of the allowable range
may result in temperature warnings from the ITE.
2. Scenario #2: Expand economizer use for longer periods of the year where
limited hardware failures are tolerated.
• All manufacturers perform tests to verify that the hardware functions at
the allowable limits. For example, if during the summer months it is
desirable to operate for longer periods of time using the economizer
rather than turning on the chillers, and if the data center operation is such
that periodic hardware failures are acceptable, then operating for extended
periods of time near or at the allowable limits may be acceptable. This, of
course, is a business decision of where to operate within the allowable
and recommended envelopes and for what periods of time. Operation
near the upper end of the allowable range may result in temperature
warnings from the ITE.
3. Scenario #3: Failure of cooling system or servicing cooling equipment.
• If the system was designed to perform within the recommended environ-
mental limits, it should be acceptable to operate outside the recom-
mended envelope and approach the extremes of the allowable envelope
during the failure. All manufacturers perform tests to verify that the hard-
ware functions at the allowable limits. For example, if a modular CRAC
unit fails in the data center, and the temperatures of the inlet air of the
nearby racks increase beyond the recommended limits but are still within
the allowable limits, this is acceptable for short periods of time until the
failed component is repaired. As long as the repairs are completed within
normal industry times for these types of failures, this operation should be
acceptable. Operation near the upper end of the allowable range may
result in temperature warnings from the ITE.
4. Scenario #4: Addition of new servers that push the environment beyond the
recommended envelope.
• For short periods of time, it should be acceptable to operate outside the
recommended envelope and approach the extremes of the allowable enve-
lope. All manufacturers perform tests to verify that the hardware func-
tions at the allowable limits. For example, if additional servers are added
to the data center in an area that would increase the inlet air temperatures
to the server racks above the recommended limits but adhere to the allow-
able limits, this should be acceptable for short periods of time until the
ventilation can be improved. The length of time operating outside the rec-
ommended envelope is somewhat arbitrary, but several days would be
acceptable. Operation near the upper end of the allowable range may
result in temperature warnings from the ITE.