The Data Center of the Future: Reaching Sustainability
The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Data Centers
of the Future, the cover image, and related trade dress are trademarks of O’Reilly
Media, Inc.
The views expressed in this work are those of the authors and do not represent the
publisher’s views. While the publisher and the authors have used good faith efforts
to ensure that the information and instructions contained in this work are accurate,
the publisher and the authors disclaim all responsibility for errors or omissions,
including without limitation responsibility for damages resulting from the use of
or reliance on this work. Use of the information and instructions contained in this
work is at your own risk. If any code samples or other technology this work contains
or describes is subject to open source licenses or the intellectual property rights of
others, it is your responsibility to ensure that your use thereof complies with such
licenses and/or rights.
This work is part of a collaboration between O’Reilly and Equinix. See our statement
of editorial independence.
978-1-492-09898-0
Data Centers of the Future
data, all customers want their data as soon as possible. After all,
you don’t want to wait a few seconds for that Netflix movie to
start. Automated stock-trading software needs to place bids in milliseconds, before competitors can bid. Minimizing latency must be
factored into the design and location of data centers, leading to the
deployment of thousands of new, “far edge” data centers.
A third trend: the world is getting warmer. Greenhouse gases have a
real effect on the climate. The goal of the Paris Agreement is to limit
global warming to no more than 2°C and ideally less than 1.5°C.
As temperatures climb and natural disasters become more common,
public pressure to “do something” about greenhouse gas emissions
increases. That public pressure has become customer and regulatory
pressure. And it should. Whatever the industry, companies need to
consider themselves stewards of the world we share. Data centers
are no exception. Environmental stewardship will be key to our
survival.
Data centers sit at the intersection of all these trend vectors. They
use a lot of power (currently estimated at over 1% of the world’s
electricity), and increased demand for data guarantees that power
usage will only increase, even as data centers become more efficient.
The US Energy Information Administration estimates that worldwide energy use will increase 50% by 2050, driven by economic growth worldwide rather than any specific industry. Latency considerations mean that data centers need to be located near population
centers where renewable energy may be unavailable; locating them
in areas with rich hydroelectric, wind, and solar power is costly,
and it’s rarely an option if latency is a consideration. Data center
customers—the businesses whose servers are located within the data
center—are applying pressure on data center operators to reduce
greenhouse gas emissions. Legislative pressure on data center operators is also increasing: Singapore enacted a moratorium on new data
center construction (recently lifted), and the European Union has
put in place regulations about server efficiency, codes of conduct for
data center operators, and more. The US Securities and Exchange
Commission (SEC) has proposed a rule requiring corporations to
disclose climate-related data, including greenhouse gas emissions, to
investors. The news cycle also applies pressure. You don’t have to
look far to read about people’s negative perceptions of data centers,
whether they’re big internet server farms, massive cryptocurrency
mining camps, or large colocation facilities with global reach.
3 Power transmission varies from country to country and from grid to grid. This discussion describes North American practices; practices in other countries are similar.
Figure 2. From the utility to the servers: how power gets to the racks
reliably
Air Cooling
Currently, data centers are predominantly air cooled. The data center industry is trending toward various forms of liquid cooling, but
we certainly wouldn’t say that a climate-neutral data center can’t
be air cooled. Air cooling means, more or less, what you think:
many fans circulating lots of air over the equipment that needs to
be cooled. Air-conditioning units guarantee that the incoming air
is at an appropriate temperature. A data center also needs to filter
and (in many locations) dehumidify the incoming air. After it has
passed through the data center, hot air may be exhausted back to the
environment, or it may go through the air-conditioning system to be
reused.
The hotter the air supply can be, the more efficient the cooling
system. Google's air-cooled data centers can operate with the incoming air as high as 80°F (26.6°C), which means that they can often
use outside air directly, without any additional cooling. Outside air
must be chilled during a hot spell, but allowing incoming air to
be at a higher temperature minimizes the demand for cooling. The
ASHRAE A1 Allowable recommendations for incoming air are a
temperature of 15° to 32°C (59° to 89.6°F), with a relative humidity
of 8% to 80% and a dew point of −12° to 17°C (10.4° to 62.6°F).
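
To make that envelope concrete, the short Python sketch below is our own illustration, not ASHRAE or vendor tooling; the dataclass and function names are assumptions. It shows how an operator might encode the A1 Allowable limits quoted above in monitoring logic.

    # Illustrative sketch: encodes the ASHRAE A1 Allowable intake-air envelope
    # quoted above (15-32 C, 8-80% relative humidity, dew point -12 to 17 C).
    # The dataclass and function names are hypothetical, not a real monitoring API.
    from dataclasses import dataclass

    @dataclass
    class IntakeAir:
        temp_c: float        # dry-bulb temperature at the server intake, in C
        rel_humidity: float  # relative humidity, in percent
        dew_point_c: float   # dew point, in C

    def within_ashrae_a1_allowable(air: IntakeAir) -> bool:
        """Return True if the intake air sits inside the A1 Allowable envelope."""
        return (
            15.0 <= air.temp_c <= 32.0
            and 8.0 <= air.rel_humidity <= 80.0
            and -12.0 <= air.dew_point_c <= 17.0
        )

    # Example: a warm intake reading that is still inside the envelope.
    print(within_ashrae_a1_allowable(IntakeAir(temp_c=28.0, rel_humidity=40.0, dew_point_c=13.0)))  # True

A reading outside any one of the three ranges would come back False, which is the point at which additional chilling or dehumidification is needed.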
The widespread practice for organizing server racks in an air-cooled
data center is to separate them into hot and cold aisles. Cool air
is supplied to the cold aisles. Hot air is exhausted to the hot aisles
from the other side of the racks. In practice, this means that the air
intakes from two adjacent rows of racks face each other; likewise,
the exhaust sides of two adjacent rows face each other. Preventing
the incoming cold air from mixing with the hot exhaust makes
the cooling system more efficient—as anyone who has ever stood
behind a rack can appreciate.
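
The alternating pattern is easier to see written out. The Python sketch below is our own illustration (the row and aisle labels are hypothetical): each row draws air from the cold aisle on one side and exhausts into the hot aisle on the other, so consecutive rows flip orientation and adjacent intakes, like adjacent exhausts, end up facing each other.

    # Illustrative sketch of a hot/cold aisle floor plan. Aisles alternate
    # cold/hot; every rack row pulls air from the cold aisle it borders and
    # exhausts into the hot aisle on its other side.
    def aisles_and_rows(num_rows: int) -> list[str]:
        """Build the floor sequence aisle, row, aisle, row, ... for num_rows rows."""
        layout: list[str] = []
        for i in range(num_rows):
            layout.append("COLD aisle" if i % 2 == 0 else "HOT aisle")
            # Orientation flips with the aisle pattern: intake toward cold, exhaust toward hot.
            intake, exhaust = ("left", "right") if i % 2 == 0 else ("right", "left")
            layout.append(f"rack row {i}: intake faces {intake}, exhaust faces {exhaust}")
        layout.append("COLD aisle" if num_rows % 2 == 0 else "HOT aisle")  # closing aisle
        return layout

    for line in aisles_and_rows(4):
        print(line)

Printed for four rows, the exhaust sides of rows 0 and 1 share a hot aisle and the intake sides of rows 1 and 2 share a cold aisle, matching the description above.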
Air cooling has the advantage that it can be used with almost any
hardware; remember that data centers offering colocation have limited control over their customers' servers. Liquid cooling requires
significantly more standardization. Another advantage of air cooling
is that, if the cooling fails, you have a few minutes to turn the servers
off before they overheat. With liquid cooling, thermal runaway happens much sooner; if the cooling system fails, servers need to be
shut down in seconds before they’re damaged. Finally, air cooling
can be used in almost any building, including older data centers
and older buildings being repurposed as data centers; liquid cooling
requires significantly more changes to the building’s infrastructure.
Liquid Cooling
Water cooling is one of the oldest technologies for cooling electronics. Water cooling was used for high-power radio transmitters as
early as the 1920s, significantly before air cooling was an option.
Water cooling was also common for large mainframes in the 1970s;
in the late ’70s, Bell Labs’ Holmdel site had a reflecting pond with
a fountain that provided evaporative cooling for its mainframe computer systems; in the winter, the hot water was used to heat the
building.