
Energy 258 (2022) 124787

Contents lists available at ScienceDirect

Energy
journal homepage: www.elsevier.com/locate/energy

Development of an independent modular air containment system for high-density data centers: Experimental investigation of row-based cooling performance and PUE
Jinkyun Cho*, Beungyong Park, Seungmin Jang
Department of Building and Plant Engineering, Hanbat National University, 34158, Daejeon, Republic of Korea

Article history: Received 3 February 2022; received in revised form 6 May 2022; accepted 8 July 2022; available online 13 July 2022.

Keywords: Data center; Row-based cooling; Air containment system; Cooling performance; In-situ measurement; PUE; Power usage effectiveness

Abstract: This study presents a prototype of an independent modular air containment (MAC) system that overcomes the limitations of existing room-based cooling and applies a row-based cooling system to a high-density data center that must meet energy efficiency requirements. Furthermore, to improve cooling efficiency and safety, we developed a novel in-row cooling package with multiple heat-transfer media (a sequential water-to-refrigerant-to-air heat exchange system). While complying with the standard test method, the objective cooling performance and PUE are derived through in-situ measurement in connection with the actual operation of a reference data center, and the energy-saving contribution is analyzed. As observed from the experimental measurements of the MAC with the novel in-row cooling package, thermal balance is maintained between the water-to-refrigerant primary cycle and the refrigerant-to-air secondary cycle, and the air temperature in the cold aisle converges to 23 °C, satisfying the ASHRAE thermal guidelines. The energy influence of the in-row cooling package is extremely low, with an average pPUE of 0.019. For the reference data center, the analyzed energy efficiency can be improved from a baseline PUE of 1.563 to a PUE of 1.361 through a 43.5% reduction in cooling system energy.
© 2022 Elsevier Ltd. All rights reserved.

1. Introduction

As of 2018, global data center energy consumption was estimated to be 205 TWh, accounting for approximately 1.0% of global electricity consumption [1]. This is a 6% increase over that in 2010, while global data center compute instances (cloud-based workstations) grew by 550% over the same period. User IT behavior will increase the energy needs of data centers by 353 TWh in 2030 [2]. Entering the era of the 4th industrial revolution, data is rapidly emerging as a key growth engine for future development. Because a large amount of data is generated owing to the expansion of new businesses such as cloud, big data, AI, and IoT, the importance of data centers for processing these data is increasing. At the global level, hyper-scale data centers are proliferating. Hyper-scale data centers are massive business-critical facilities designed to efficiently support robust, scalable applications and are associated with big-data-producing companies. In general, such a facility operates 100,000 IT servers and occupies more than 20,000 m2 with flexible expansion capacity [3]. The recorded number of hyper-scale data centers worldwide was 386 in 2017, an increase of 14.2% from the previous year. It was expected to increase to 448 in 2018 and to 628 in 2021, an increase of 10.17% compared with 2020 [3,4].

Data centers are energy-intensive facilities, with typical power densities of 540-2200 W/m2; recent deployments exceed 10 kW/m2 [5]. In current data centers, a configuration in which a rack server consumes an average of 10 kW or more is considered high density [6]. The basic function of data center cooling is to effectively remove heat from IT equipment by supplying and distributing cold air to each rack server. As shown in Fig. 1, the design approach is categorized into room-based, row-based, and rack-based cooling according to the configured rack density, to perform these functions efficiently [7].

Abbreviations: MAC, Modular Air Containment; PUE, Power Usage Effectiveness; CRAH, Computer Room Air Handling; SHI, Supply Heat Index; RHI, Return Heat Index; RTI, Return Temperature Index; SA, Supply Air; RA, Return Air; M&V, Measurement and Verification; ECM, Energy Conservation Measure; VFD, Voltage and Frequency Dependent; VI, Voltage Independent; VFI, Voltage and Frequency Independent; UPS, Uninterruptible Power Supply.
* Corresponding author.
E-mail addresses: jinkyun.cho@hanbat.ac.kr (J. Cho), bypark@hanbat.ac.kr (B. Park).

https://doi.org/10.1016/j.energy.2022.124787
0360-5442/© 2022 Elsevier Ltd. All rights reserved.

Nomenclature

q     Cooling capacity [W]
Q     (A measured value of) air flowrate [m3/h] or water flowrate [liter/min]
t     (A measured value of) temperature [°C]
x     Absolute humidity of air [kg/kg']
v     Specific volume of air [m3/kg]
Cr    Specific heat [Wh/kg·°C]
H     Pump water head [m(Aq)]
η     Equipment efficiency [%]
PF    Power factor [%]

Superscripts
S     Sensible heat
L     Latent heat

Subscripts
a     Air (air-side)
ch    Chilled water (water-side)
cw    Condenser water
i     Inlet to unit (return)
o     Outlet from unit (supply)
n     The nth measurement point
RM    IT room
CR    CRAH unit
CH    Chiller
CT    Cooling tower
F     Fan
P     Pump

In 2018, for the world's first commercialization of 5G mobile communication, all stakeholders in Korea, including telecom companies, manufacturers, and the government, made continuous efforts to introduce changes in data centers that can accommodate the new industrial environment. Considering that legacy data centers might move to hyper-scale data centers in the future, it is necessary to prepare for a different type of operation than the existing one. In other words, changes to the infrastructure in the non-IT sector that maintains the operating environment of the data center, such as cooling, power distribution, and storage for IT equipment, will inevitably occur. The cooling system is expected to change the most due to the operation of high-density IT equipment. The data center air conditioning and cooling market is expected to grow by 13.7% annually, from $8.4 billion in 2017 to $23.2 billion in 2025 [8]. In the hyper-scale data center environment, the average density of IT equipment (7.0 kW/rack or more) increases; thus, cooling strategies that can respond to changes in the IT industry are required. The conventional cooling strategy, room-based cooling, may no longer be effective. For new data centers, row- and rack-based cooling must be considered when applying high-density IT loads (10 kW/rack or more). In row-based cooling, the in-row computer room air handling (CRAH) units are placed between rack servers to provide efficient cooling closer to the IT equipment without heat loss, because the air distribution path is shorter and simpler than in room-based cooling. In addition, the airflow is predictable, and the units can operate close to their maximum rated capacity to serve the higher IT power density. Row-based cooling can provide higher IT power density and is suitable for independent modular air containment pods. For high-density IT power in excess of 10 kW/rack, row-based cooling can be implemented without a raised floor [9]. Rack-based cooling is typically used when cooling is required for rack servers up to 50 kW. In this case, the computer room air conditioning (CRAC) unit is integrated in a rack. The airflow path is much shorter than that of row-based cooling, and the airflow is not affected by the rack layout in the IT room [10]. In the past, the IT environment of commercial data centers was projected to grow to an average of 30 kW/rack. Across the ICT industry, these expectations have rarely been met, and the IT power density is typically 15 kW/rack [11]. Most rack servers in today's data centers operate at 5-10 kW/rack. Therefore, row-based cooling systems are undoubtedly the most suitable cooling solution for data centers.

In this study, in order to improve the cooling efficiency of high-density data centers, a prototype of an independent modular air containment (MAC) system applying row-based cooling has been developed. The first phase, a numerical study [12] on the evaluation of thermal performance and optimization of cooling provisioning for the MAC, had been completed. The second phase is this experimental study. Its main purpose is to evaluate the applicability and cooling performance of the independent MAC with row-based cooling, which can overcome the limitations of existing room-based cooling and meet the recent requirements of the IT environment and energy efficiency. In particular, the field application test of the novel in-row cooling package with multiple heat-transfer media, a sequential water-to-refrigerant-to-air heat exchange system, has been conducted for the first time. Furthermore, the most important difference from existing studies is that, while complying with the standard CRAH test method and procedures, the objective cooling performance of the row-based cooling system is derived through in-situ measurement in connection with the actual operation of a reference data center, and the energy contribution is analyzed. In addition, the applicability of the row-based cooling prototype is evaluated on the basis of power usage effectiveness (PUE). The validity of the PUE analysis has been secured by defining the key conditions to be considered when applying the row-based cooling system and by suggesting specific standards for the verification methodology.

2. Literature review

Until now, few studies have been conducted on the energy efficiency analysis and optimization of row-based cooling for data centers, and existing studies have focused mainly on cooling performance. Nada et al. [13] found that the row-based layout of CRAH units provides better thermal performance than room-based CRAH units, even if they have the same cooling capacity. Nada and Abbas [14] evaluated the thermal performance and energy efficiency of three configurations of the row-based cooling system. Shielding the top of the hot/cold aisle prevents air re-circulation, which can reduce the hot-spot air temperature rise by 5 °C. Installation of CRAHs at both ends of the aisle effectively prevents air by-pass. Shielding the top and sides of the hot/cold aisles improves the energy utilization index (η_r) from 1.25 to 1.95. Abbas et al. [15] compared the performance of room-based and row-based cooling strategies based on IT power density through numerical simulations. Their results showed that row-based cooling is preferred over room-based cooling for high power densities. Moazamigoodarzi et al. [16] found that row-based cooling reduces the cooling energy by ~29% relative to room-based cooling by comparing the cold air flow rate required at fixed supply air temperatures. Moazamigoodarzi et al. [17] also developed a model to predict the air

temperature inside IT servers and evaluated the influence factors of the cooling system on energy consumption. This methodology was verified experimentally and linked with the energy consumption calculation. The influence of various factors, such as IT server configuration, number of CRAHs, operating conditions, air temperature distribution, and total cooling energy consumption, was evaluated, and it can be used to optimize the design of row-based cooling. Jin et al. [18] also compared the energy performance of existing room-based cooling and row-based cooling. SHI improved from 0.79 to 0.31, and the difference in PUE before and after the change was 0.27, which was expected to save 16.9% of annual energy. Furthermore, Gupta et al. [19] compared different cooling strategies based on exergy and found that row-based cooling leads to lower exergy destruction than room-based cooling.

As previous research directly related to this study, Cho and Woo [20] evaluated the thermal performance of existing room-based cooling and row-based cooling through the remodeling project of an existing legacy data center. This pilot study was able to evaluate two different types of cooling systems under the same IT environment conditions. The supply heat index (SHI) and return heat index (RHI), evaluations of the air conditioning performance of data centers, improved by 37.1% and 20.0%, respectively, and the return temperature index (RTI) and beta index (β) improved by 73.2% and 44.3%, respectively. In another numerical study, Cho and Kim [12] developed the independent MAC prototype that is investigated in this experimental study. In consideration of in-row CRAHs and cold/hot aisle layouts, a total of 2592 different matrix combinations were implemented to optimize the row-based cooling.

The above literature review reveals that there is a gap in the research comparing room-based and row-based cooling strategies of data centers, and most studies have performed comparisons through numerical analyses. No experimental study on the cooling performance of row-based systems is available. Furthermore, the operational safety of the configuration conditions of row-based cooling has not been considered. In row-based cooling, there is theoretically no difficulty in moving the installation location of the CRAH from the perimeter (room-based cooling) into the white space, closer to the IT equipment. However, if a problem with the cooling system occurs in the white space where the IT equipment is installed, water damage may affect the IT equipment; thus, the benefit is small compared with the risk. The system configuration should be reviewed first, because the water pipe, the infrastructure physically accompanying the CRAH unit, can become a decisive obstacle to the operation of the data center if damaged. The primary contribution of this study is that, in addition to improving cooling performance and energy efficiency, the developed row-based cooling system considers the safety of the cooling system without adversely affecting data center operation in an emergency.

3. Independent MAC with row-based cooling

3.1. Test mock-up of MAC prototype

In order to maximize the advantages of row-based cooling, the developed independent MAC focused on minimizing heat loss by shortening the cold air distribution path. In addition, to prevent air re-circulation and air bypass, a complete containment structure has been implemented that shields both the cold aisle and the hot aisle. To further increase the airtightness of the server racks installed at the boundary between the cold and hot aisles, blanking panels that block the spaces and gaps between the cable ports and the mounted IT servers have been applied throughout.

Fig. 1. Floor plans illustrating the basic concept of room-, row-, and rack-based cooling.

The cooling strategy of this MAC can adopt either the cold-hot-cold (C-H-C) or the hot-cold-hot (H-C-H) direction to increase the air distribution efficiency. In terms of air distribution efficiency, the C-H-C arrangement is more efficient; nonetheless, the cold air distribution path is extremely short, and therefore the deviation is approximately 10% [12]. As shown in Fig. 2, the H-C-H arrangement has been applied to the MAC for this demonstration owing to restrictions in the installation conditions of the white space. The test results are considered sufficient to draw conclusions extending to the C-H-C arrangement.

Fig. 2. Test mock-up of independent modular air containment system (H-C-H layout).

3.2. Row-based cooling strategy

The row-based cooling system should be installed in the white space, where the CRAHs are placed between rack servers operating at higher power density, to cool the IT equipment efficiently. In general, the application of an air-cooled-type CRAC unit was first considered for this in-row cooler. The major reason is to fundamentally prevent water damage in the white space of the IT room.

As shown in Fig. 3a), the reason behind the difficulty of applying a central chilled-water-type CRAH unit is that the supplied chilled water pipe passes through the white space; therefore, if a problem occurs in the pipe, it directly affects the IT equipment. This is also indirectly related to the many situations in which personnel must enter the white space, which requires high security, to maintain or repair the pipe system. Cooling coil condensate is an important aspect of row-based cooling system design and should be carefully considered to avoid major issues in

Fig. 3. Mission critical operation in data centers: a) water damage risk of a conventional central chilled water system with an existing in-row CRAH and b) a novel in-row cooling package with a sequential water-to-refrigerant-to-air heat exchange system.


the future. An important detail to consider when installing a CRAC/H unit in the white space is the drainage of the condensate. The routing of the drain pipe inside the white space must also be considered. The air-cooled-type CRAC unit, which uses refrigerant-based cooling and cools indoor air using a condensed refrigerant liquid, requires a condensate drain and an available return plenum, and the fundamental risk of water damage cannot be resolved. Although row-based cooling is superior to the existing room-based cooling method in terms of air distribution and could improve efficiency, it cannot be universally applied owing to the aforementioned limitations.

In order to overcome this limitation, the in-row cooling package has been developed to include a water-to-refrigerant primary cycle and a refrigerant-to-air secondary cycle, such that the primary chilled water pipe does not directly pass through the white space (in the IT room). Because the secondary refrigerant liquid pipe in the white space uses only sensible heat without causing any phase change, a condensate drain is not necessary. As shown in Fig. 3b), the features of the row-based cooling system constituting this independent MAC are the efficient cooling distribution to each piece of IT equipment and the development of a new in-row cooling package with multiple heat-transfer media, a sequential water-to-refrigerant-to-air heat exchange system, for improving cooling efficiency and safety. Therefore, it is a row-based cooling system that is optimized to operate the refrigerant liquid at a temperature that does not cause condensate. Fig. 4 shows a test mock-up system of row-based cooling in the MAC prototype.

4. Methodology

In-situ measurement of the row-based cooling system in the independent MAC, that is, the cooling performance evaluation of the sequential multiple heat exchange system, is the main technical factor, because it is the most important part of predicting the target PUE of the data center. Because PUE, the data center energy efficiency metric, is based on annual cumulative power usage, utilizing short-term field test results for predicting PUE is the key. The first step, the performance evaluation of the row-based cooling, must be preceded by ensuring the reliability of the results obtained by the standard testing method. Therefore, the field testing method is divided into the performance test of the CRAH unit and the energy efficiency evaluation of the row-based cooling system of the independent MAC.

4.1. Method of testing for rating of CRAHs

In general, data centers (IT rooms) maintain constant air temperature and humidity to maintain an appropriate operating condition for protecting IT equipment. Therefore, it is necessary to first check the target IT environment condition and evaluate the appropriate size and accurate performance of the CRAH unit needed to maintain this condition. All IT equipment must operate within the recommended equipment environmental specifications for air cooling. Table 1 shows the equipment environmental specification of the data center according to the IT environmental classes presented by ASHRAE TC9.9 [21]. Because most high-density data centers fall under Class A1, and it is important to maintain the IT environment according to the recommended guideline, it is essential to comply with this standard for performance evaluation of the CRAH units.

For the testing methods of the data center cooling system, standards have been well formulated to evaluate the performance rating of CRAHs considering the IT environment. The ANSI/ASHRAE Standard 127 [22] is a representative standard for evaluating the performance rating of CRAHs; its application scope includes chilled-water, air-cooled (direct expansion), water-cooled, and glycol-cooled units.

Table 2 shows the testing conditions for rated cooling, reheating, humidification, and dehumidification of the CRAH unit. Classes 1-4 classify the application conditions of CRAHs that are presented

Fig. 4. Test mock-up system of row-based cooling: sequential multiple-heat exchange system.
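Since the methodology hinges on relating short-term field measurements to PUE, which is defined over annual cumulative power usage, a minimal numerical sketch of the two metrics may help. The function names and sample energies below are hypothetical, not values from the study; the partial PUE (pPUE) of a subsystem is expressed here as 1 plus that subsystem's share of IT energy, so the cooling package's reported influence of 0.019 corresponds to a pPUE contribution of that size:

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_kwh

def ppue(subsystem_kwh: float, it_kwh: float) -> float:
    """Partial PUE of one subsystem (e.g. the in-row cooling package):
    1 + the subsystem's energy divided by the IT energy it serves."""
    return 1.0 + subsystem_kwh / it_kwh

# Hypothetical annual energies [kWh], for illustration only.
it_energy = 1_000_000.0
cooling_energy = 350_000.0
power_dist_energy = 150_000.0
total = it_energy + cooling_energy + power_dist_energy

print(round(pue(total, it_energy), 3))      # 1.5
print(round(ppue(19_000.0, it_energy), 3))  # 1.019
```

A lower cooling share moves PUE toward its ideal value of 1.0, which is the mechanism behind the paper's baseline-to-improved comparison.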

Table 1
ASHRAE data center thermal guidelines for IT environment [21].
Equipment environmental specifications for air cooling (product operation):

Class | Dry-bulb temperature [°C] | Humidity range (non-condensing) | Max. dew point [°C] | Max. elevation [m] | Max. temperature change [°C/h]
Recommended (A1-A4) | 18-27 | -9 °C DP to 15 °C DP and 60% RH | - | - | -
Allowable A1 | 15-32 | -12 to 17 °C (DP) and 8-80% (RH) | 17 | 3050 | 5/20
Allowable A2 | 10-35 | -12 to 21 °C (DP) and 8-80% (RH) | 21 | 3050 | 5/20
Allowable A3 | 5-40 | -12 to 24 °C (DP) and 8-85% (RH) | 24 | 3050 | 5/20
Allowable A4 | 5-45 | -12 to 24 °C (DP) and 8-85% (RH) | 24 | 3050 | 5/20
Allowable B | 5-35 | Under 28 °C (DP) and 8-85% (RH) | 28 | 3050 | N/A
Allowable C | 5-40 | Under 28 °C (DP) and 8-85% (RH) | 28 | 3050 | N/A
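As an illustration of applying the recommended A1-A4 envelope of Table 1, a monitoring script can flag out-of-range cold-aisle states. This is a minimal sketch; the function name and the sample states are ours, not from ASHRAE or the paper:

```python
def in_recommended_envelope(dry_bulb_c: float, dew_point_c: float,
                            rh_percent: float) -> bool:
    """Check a measured state against the ASHRAE recommended A1-A4
    envelope of Table 1: 18-27 degC dry-bulb, -9 to 15 degC dew point,
    and relative humidity not above 60%."""
    return (18.0 <= dry_bulb_c <= 27.0
            and -9.0 <= dew_point_c <= 15.0
            and rh_percent <= 60.0)

# The cold-aisle temperature in this study converged to 23 degC; the
# humidity values below are illustrative assumptions.
print(in_recommended_envelope(23.0, 10.0, 45.0))  # True
print(in_recommended_envelope(30.0, 10.0, 45.0))  # False (too warm)
```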


Table 2
Standard rating conditions of CRAHs (ASHRAE) [22].

Rated cooling (Tests A-D):
  Air temperature surrounding CRAH (control on return temperature):
    Return dry-bulb temperature [°C]: Class 1: 23.9; Class 2: 29.4; Class 3: 35.0; Class 4: 40.5 (same for Tests A-D)
    Return dew-point temperature [°C]: 11.1 (all tests)
  Air-cooled units (temperature surrounding remote air-cooled condenser):
    Dry-bulb temperature [°C]: A: 35.0; B: 26.7; C: 18.3; D: 4.4
  Water-cooled units (connected to cooling tower):
    Entering water temperature [°C]: A: 28.3; B: 21.1; C: 12.8; D: 1.7
    Leaving water temperature [°C]: A: 35.0
    Fluid flow rate: Max = Test A
  Glycol-cooled units (connected to a common glycol loop with a solution of 40% propylene glycol by volume):
    Entering glycol temperature [°C]: A: 40.0; B: 29.4; C: 18.3; D: 1.7
    Leaving glycol temperature [°C]: A: 46.0
    Fluid flow rate: Max = Test A
  Chilled-water units:
    Entering water temperature [°C]: C: 10.0
    Leaving water temperature [°C]: C: 16.7

Reheating (base rating):
  All units: Return dry-bulb temperature [°C]: 23.9
  Steam reheat units: Entering steam supply conditions: 121 °C (100 kPag)
  Hot-water reheat units: Water temperature entering unit: 80.0 °C; leaving unit: 70.0 °C

Humidification / Dehumidification:
  All units: Return dry-bulb temperature [°C]: 29.4 / 29.4
  All units: Return dew-point temperature [°C]: 5.5 / 15.0
  All units: Return relative humidity [%]: - / 60
  Steam humidifier units: Entering steam supply conditions: 105 °C (19 kPag) / -

step-by-step to enable response to various air flowrates. The CRAH is applicable only at 289 m3/h per kW or less for Classes 1 and 2, and 221 m3/h per kW or less for Classes 3 and 4. Test A is the baseline, and Tests B to D are used to evaluate the rated performance using weights that reflect the local outdoor temperature. The performance test of the chilled-water CRAH unit is based on Test C.

Another performance testing standard, AHRI Standard 1361 [23], is shown in Table 3; the application target for each type of CRAH unit is similar to that of the ANSI/ASHRAE Standard 127. However, the CRAH units are classified by mounting location, and the return air temperature control conditions are detailed in consideration of the air supply direction.

4.2. In-situ measurement procedures and methods

An independent MAC prototype with a row-based cooling system was installed in a telecom data center in Korea. Cooling performance was evaluated before the rack servers were mounted, in consideration of the safety of the IT room during operation. Considering the high-density IT environment conditions, the testing methods and procedures for the cooling performance were confirmed, and the in-situ measurement of the in-row cooling package, comprising six indoor units and two refrigerant distribution units, was conducted under actual operating conditions in the reference data center. The field test measured the cooling performance (rated capacity) of the CRAH units and the refrigerant distribution units simultaneously. Each measurement item was measured simultaneously in real time: on the air side, the temperature and relative humidity of the supply air (SA) and return air (RA) and the air flowrate; on the water side, the chilled water temperatures and water flowrate; and the power consumption. Because the measurement points on the refrigerant side are not specified in the test standard, the refrigerant-side performance is estimated as the average of the cooling performance on the air side and the water side. To simulate a test environment reflecting the IT operating conditions, a total of 180 kW of heat load banks was installed to realize the power load of IT equipment (approximately 5.0 kW/rack). As shown in Fig. 5 a), the cooling capacity on the air side can be expressed as the amount of heat that can be removed from the IT equipment when the CRAH units are operated. An in-row CRAH unit is composed of four secondary refrigerant-to-air micro-pin-fin heat exchangers and four EC fans. The air temperature and humidity were measured at eight points on the SA and RA sides of the CRAH. The air-side cooling capacity of a CRAH was based only on the sensible heat, because there was no source of latent heat, and was calculated according to Equation (1), determined in the cold aisle. The water-side cooling capacity of a refrigerant distribution unit was calculated using Equation (2) as the amount of heat supplied through the chilled water.

q_{S,a} = \frac{Cr_{(a)} \cdot Q_{a} \cdot (t_{a(i)} - t_{a(o)})}{v_{n} \cdot (1 + x_{n})}    (1)

q_{cw} = Cr_{(cw)} \cdot Q_{cw} \cdot (t_{cw(i)} - t_{cw(o)})    (2)

As shown in Fig. 5 b), a refrigerant distribution unit comprises one primary water-to-refrigerant heat exchanger and one refrigerant liquid pump. The water temperature and flowrate were measured at the chilled water supply and return points. The final evaluation item is the total energy efficiency ratio (EER). Performance evaluation of the row-based cooling system was performed in accordance with the ANSI/ASHRAE Standard 127 and AHRI Standard 1361 testing procedures. The detailed setting criteria were rationally adjusted to reflect the operating conditions of the data center's cooling system. All testing procedures were officially conducted in the presence of a third-party accredited organization. Fig. 6 shows the in-situ measurements and experimental setup for the row-based cooling system in the MAC pod, used to measure the input power, the air temperatures in the hot/cold aisles, the air flowrate of the CRAHs, and the supply chilled water flowrate and temperature difference of the refrigerant distribution unit for the evaluation of the cooling performance described above. Basically, this refrigerant distribution unit is a chilled water unit that is composed of an inverter compressor, an electromagnetic expansion valve, a condenser using

Table 3
Standard rating conditions of CRAHs (AHRI) [23].

Indoor return air temperature (dry-bulb / dew-point [°C], cooling; dry-bulb / dew-point [°C], humidification):
  Ceiling mounted unit, ducted: 24.0 / 11.0; 24.0 / 5.6
  Ceiling mounted unit, non-ducted: 24.0 / 11.0; 24.0 / 5.6
  Floor mounted unit, up-flow (non-ducted): 29.5 / 11.0; 24.0 / 5.6
  Floor mounted unit, up-flow (ducted): 29.5 / 11.0; 24.0 / 5.6
  Floor mounted unit, down-flow: 29.5 / 11.0; 24.0 / 5.6
  Floor mounted unit, horizontal-flow: 35.0 / 11.0; 24.0 / 5.6

Heat rejection/cooling fluid (system type: fluid condition, test condition):
  Air-cooled units (DX): Entering outdoor ambient dry-bulb temperature: 35.0 °C
  Water-cooled units (with cooling tower): Entering water temperature: 28.5 °C; Leaving water temperature: 35.0 °C; Water flow rate: N/A
  Glycol-cooled units (glycol loop): Entering glycol temperature: 40.0 °C; Leaving glycol temperature: 46.0 °C; Water flow rate: N/A; Glycol solution concentration: 40% propylene glycol by volume
  Chilled-water units (chilled water loop): Entering water temperature: 10.0 °C; Leaving water temperature: 16.5 °C

Fig. 5. Measuring instrument points and sensor installation for cooling performance.
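The air-side and water-side capacity calculations of Equations (1) and (2) can be sketched directly in code. The snippet below is an illustrative sketch only: the constants, default psychrometric values, and the sample operating point are our assumptions rather than measured values from the study, and the liter/min to kg/h conversion for the chilled water (taking roughly 1 kg per liter) is made explicit here, a step the compact form of Eq. (2) leaves implicit.

```python
# Sketch of Equations (1) and (2); symbols follow the paper's Nomenclature.
# All numeric values here are illustrative assumptions, not measured data.

CP_AIR = 0.28     # Cr(a): specific heat of air [Wh/(kg*degC)], ~1005 J/(kg*K)
CP_WATER = 1.163  # Cr(cw): specific heat of water [Wh/(kg*degC)], ~4186 J/(kg*K)

def air_side_capacity_w(q_a_m3h: float, t_a_in_c: float, t_a_out_c: float,
                        v_m3kg: float = 0.84, x_kgkg: float = 0.008) -> float:
    """Eq. (1): q_S,a = Cr(a)*Qa*(ta(i)-ta(o)) / (vn*(1+xn)).
    Dividing the moist-air volume flow by vn*(1+xn) converts it to a
    dry-air mass flow in kg/h, so the result is in watts."""
    return CP_AIR * q_a_m3h * (t_a_in_c - t_a_out_c) / (v_m3kg * (1.0 + x_kgkg))

def water_side_capacity_w(q_cw_lmin: float, t_cw_in_c: float,
                          t_cw_out_c: float) -> float:
    """Eq. (2): q_cw = Cr(cw)*Qcw*(tcw(i)-tcw(o)). Per the Nomenclature,
    i is the inlet to the unit (warm return) and o the outlet (cold supply).
    The L/min flow is converted to kg/h (x60, assuming ~1 kg per liter)."""
    return CP_WATER * (q_cw_lmin * 60.0) * (t_cw_in_c - t_cw_out_c)

# Invented operating point: 12,000 m3/h of air cooled from 35 to 23 degC,
# and 300 L/min of chilled water returning at 14 degC, supplied at 8 degC.
q_air = air_side_capacity_w(12000.0, 35.0, 23.0)    # roughly 47.6 kW
q_water = water_side_capacity_w(300.0, 14.0, 8.0)   # roughly 125.6 kW
```

In the paper's procedure, the refrigerant-side performance is then taken as the average of two such air-side and water-side results.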

a plate heat exchanger, and an evaporator.

As shown in Table 4, the base testing conditions ensured operation of the system to maintain the air temperature in the cold aisle below 25.0 °C, with an inlet supply chilled water temperature of 8.0 °C. The testing conditions complied with the ANSI/ASHRAE standards, and the chilled water supply temperature was based on the cooling plant of the reference data center.

4.3. Securing reliability of in-situ measurement

The appropriateness of the cooling performance test of the row-based cooling system guarantees the reliability of the resulting values. In-situ measurement is subject to errors and uncertainties, whose causes pertain to the accuracy of the instruments, the stability of the measurement target, the rationality of the test methods and procedures, the proficiency of the operator, and the measurement environment. The causes of measurement uncertainty can generally be divided into a statistical approach (Type A) and an information-based systematic approach (Type B) [24]. Type A relates to the acquisition method of the measured data and can be sufficiently addressed through the in-situ measurement procedures and methods defined above. In addition, it is possible to secure the reliability of the acquired data and the calculated cooling performance values because they are derived through statistical processing of a series of measurements rather than a single measurement. Finally, it is important to conduct the test after the accuracy of the instruments related to Type B has been verified. All instruments were calibrated within six months, the measurement uncertainty was calculated by reflecting the error rate according to the calibration standard, and the acquired data were adjusted according to the procedure. Table 5 shows the detailed information on the equipment used for the in-situ measurement.

5. Experimental investigation

5.1. Measurement strategy and condition-setting

The experimental investigations were divided into the measurement of the cooling capacity of each component unit and the evaluation of the overall cooling performance of the row-based cooling system. As shown in Fig. 6, a row-based cooling system with three in-row CRAH units and one refrigerant distribution unit for row-A was evaluated; simultaneously, the cooling capacity test was performed on only one in-row CRAH unit (#2) and one refrigerant distribution unit (#1) in this component system. A preliminary test was performed for approximately 120 min to secure the operational stability of the IT environment and the cooling system. After the trial run, the main test was measured continuously every 1 min for 75 min. Because the standard of the accredited test for cooling performance


Fig. 6. In-situ measurements and experimental setup for row-based cooling system in the MAC pod.
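The Type-A (statistical) evaluation described in Section 4.3 reduces to computing the mean and the standard uncertainty of the mean over repeated readings; a minimal sketch in Python (the readings below are illustrative, not the paper's raw data):

```python
import math

def type_a_uncertainty(samples):
    """Type-A evaluation: mean, sample standard deviation,
    and standard uncertainty of the mean u = s / sqrt(n)."""
    n = len(samples)
    mean = sum(samples) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    return mean, s, s / math.sqrt(n)

# Illustrative cold-aisle temperature readings [deg C]
readings = [23.0, 23.1, 23.2, 23.1, 23.0, 23.2, 23.1]
mean, s, u = type_a_uncertainty(readings)
```

Averaging a series of readings in this way is what allows the continuous 75-min record to stand in for repeated spot measurements.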

Table 4
Test conditions of cooling performance for the row-based cooling system.

| Test condition of | Equipment | Unit capacity [kW] | Qty. [EA], Row-A | Qty. [EA], Row-B | Total system capacity [kW] | Type of refrigerant | Chilled water temperature [°C] | Set temperature in cold aisle [°C] |
|---|---|---|---|---|---|---|---|---|
| IT operations | Load bank | 11.25 | 8 | 8 | 180 | – | – | – |
| Row-based cooling | In-row CRAH unit | 45 | 3 | 3 | 270 | R-134a | – | 25.0 or less |
| Row-based cooling | Refrigerant distribution unit | 135 | 1 | 1 | 270 | R-134a | 8.0 | – |
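The Table 4 capacities can be cross-checked quickly; the per-row redundancy margin below is my own reading of the table, not a claim from the paper:

```python
# Cross-check of Table 4 totals (values taken from the table).
it_per_row = 11.25 * 8       # load banks per row -> 90 kW
crah_per_row = 45 * 3        # in-row CRAH units per row -> 135 kW

assert it_per_row * 2 == 180     # total IT load across rows A and B
assert crah_per_row * 2 == 270   # total CRAH capacity

# Even with one 45 kW CRAH unit out of service,
# the remaining two units still cover the row load.
assert crah_per_row - 45 >= it_per_row
```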

Table 5
Measuring instruments and measurement ranges.

| Measuring equipment | Items | Descriptions |
|---|---|---|
| Air-side: air flow capture hood, TESTO/420 (50607007) | Parameter | Air flow rate |
| | Accuracy | ±3% of m.v. (+7 CFM) |
| | Resolution | 1.0 CFM |
| | Response time | Flow velocity/volume flow: approximately 1 s |
| Air-side: data logger, T&D/TR-72WF | Parameter | Air temperature and humidity |
| | Accuracy | ±0.5% |
| | Resolution | 0.1 °C |
| | Responsiveness | Response time (90%): approximately 7 min |
| Water-side: ultrasonic water flow meter, GE/PT900 (M05180006) | Date of calibration | October 2020 |
| | Parameter | Water flow rate |
| | Accuracy | ±2% of reading |
| | Repeatability | ±0.2% of reading |
| | Response time | Up to 2 Hz |
| Water-side: multi data logger, GRAPHTEC/GL840 (C71018812) | Parameter | Water temperature |
| | Accuracy | ±0.1% of rdg. +0.5 °C (thermocouple, T type) |
| | Sampling interval | 10 ms to 1 h |
| Energy: power logger, HIOKI/PW3365-21 (180546416) | Parameter | Power consumption |
| | Accuracy | Active power: ±2.0% rdg. ±0.3% f.s. |
| | Display update rate | 0.5 s |
| | Data save interval | 1 s to 60 min (14 selections) |

is to measure three times at 10-min intervals in principle [24], the real-time continuous measurement over 75 min satisfied the relevant conditions, and no problem was encountered in data acquisition.

Fig. 7 shows that the air temperatures in the cold and hot aisles of row-A, and the supply and return temperatures of the chilled water for the refrigerant distribution unit (#1), remained constant during the measurement period. It can be seen that the experiment was performed in the stabilization stage under the conditions of Table 4.

5.2. Cooling capacity of each component unit

The rating test for the air-side cooling performance of an in-row CRAH unit and the water-side cooling performance of a refrigerant distribution unit was based on ANSI/ASHRAE Standard 127. Table 6 shows the measurement results of the individual cooling capacity of the CRAH unit (#2) and refrigerant distribution unit (#1).


However, because the chilled water supply of the central cooling system of the reference data center to be tested was already in cooling operation in connection with other IT rooms, the test was conducted under a fixed condition around 8.0 °C. In addition, according to the chilled water temperature testing condition, the return air dry-bulb temperature of the CRAH unit was 35 °C, but the dew-point temperature was inevitably adjusted downward, and the return air wet-bulb temperature condition was also slightly adjusted. The total IT power supplied to the independent MAC was approximately 130 kW (72% of the total heat bank capacity), and the IT load assigned to row-A was expected to be approximately 65 kW (heat load bank). The IT load assigned to the in-row CRAH unit (#2), which controlled 55% of the supply air flowrate considering the fan speed, was expected to be approximately 13 kW. In addition, the IT load allocated to the refrigerant distribution unit (#1) was set to 60 kW in consideration of the rated cooling capacity of the actual CRAH unit. As a result of the individual cooling rating test of the equipment, the in-row cooling package presented the output of the cooling performance corresponding to the assigned IT load, and the refrigerant distribution unit also demonstrated suitable cooling capacity and performance to remove IT heat through the chilled-water-to-refrigerant heat exchanger.

Fig. 7. Stabilization conditions of IT environment (cold/hot aisle) and chilled water supply during the measurements.

5.3. Overall cooling performance of an in-row cooling package

First, it is extremely important for the in-row cooling package to maintain the thermal equilibrium between the primary and secondary heat exchange cycles. As shown in Table 7 and Fig. 8, as a result of evaluating the cooling performance of the row-based cooling system based on one zone (row-A), it was possible to supply cooling that exceeded the given IT load (heat load). The water-side cooling capacity of heat removal may differ depending on the chilled water temperature, but it maintains the thermal balance between the water-to-refrigerant primary cycle and the refrigerant-to-air secondary cycle. In addition, the cold aisle in row-A satisfies the ASHRAE thermal guideline by converging the air temperature to 23.1 °C based on the supply chilled water temperature of 7.9 °C under the IT load of 61.1 kW, where the IT cooling load was assumed to be 95% of the IT load (power).

It is expected that the cooling performance will decrease with an increase in the supply chilled water temperature; nonetheless, no problem was encountered while maintaining the appropriate air temperature in the cold aisle up to 25 °C. The net power consumption of the CRAH fans and refrigerant liquid pump is an average of 1.17 kW, based on the IT power of 61.1 kW, excluding the energy of the central chilled water system including the chiller plant and chilled water circulation pump.

Cooling partial PUE (pPUE_Cooling) focuses on the energy efficiency of a particular portion of the cooling system. This may be needed because some measurements of the total data center energy are unobtainable; the cooling energy inside a boundary is divided by the IT equipment energy inside the boundary [25]. The cooling partial PUE, calculated using Equation (3), is almost insignificant, with an average of 0.019 (19.22 W/IT-kW). It has been analyzed that the in-row cooling package with the primary-secondary heat exchange system overcomes the most significant limitation of the existing chilled-water-type in-row cooling system, in which the row-based cooling system undertakes the risk of connecting the chilled water pipe to the white space of the IT room, and also does not have any significant effect on energy.

Table 6
Results of the cooling performance test of the component units.

Test conditions:

| Application | Items | Values |
|---|---|---|
| Water-side | Entering water temperature [°C] | 7.0 ± 1.0 (due to the reference central cooling system condition, CHW supply temperature 10.0 → 8.0 °C) |
| Air-side | Return dry-bulb temperature [°C] | 35 ± 1.0 |
| Air-side | Return wet-bulb temperature [°C] | 17.4 ± 1.0 (due to the CHW supply temperature condition, WB 19.8 °C → WB 17.4 °C) |
| System setting | Fan speed [%] | 35 |
| System setting | Refrigerant pump [Hz] | 28 |
| IT load | IT load in row-A [kW] | 65 |
| IT load | Assigned IT load for CRAH #2 [kW] | Approximately 13 |
| IT load | Assigned IT load for RD #1 [kW] | Approximately 60 |

Results:

| Cooling unit | Items | Results |
|---|---|---|
| In-row CRAH #2 (surrounding indoor part of unit) | Cooling capacity [W] | 12,083 |
| | Electrical energy [W] | 100.51 |
| | Air flow rate [m³/h] | 2052 |
| | Return dry-bulb temperature [°C] | 34.21 |
| | Return wet-bulb temperature [°C] | 17.57 |
| | Supply dry-bulb temperature [°C] | 16.67 |
| | Supply wet-bulb temperature [°C] | 10.92 |
| Refrigerant distribution unit #1 (heat rejection and cooling fluid) | Cooling capacity [W] | 59,248 |
| | Electrical energy [W] | 633.44 |
| | Water flow rate [L/min] | 161.4 |
| | Entering water temperature [°C] | 7.83 |
| | Leaving water temperature [°C] | 12.61 |
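As a sanity check, the CRAH #2 air-side capacity in Table 6 is consistent with a simple sensible-heat balance over the measured air flow and dry-bulb temperatures (assuming standard air density and specific heat, which the paper does not state):

```python
# Sensible-heat cross-check of the CRAH #2 air-side capacity in Table 6.
rho, cp = 1.2, 1.005           # kg/m3 and kJ/(kg*K), assumed standard values
flow = 2052 / 3600             # m3/s, measured air flow rate
dt = 34.21 - 16.67             # K, return minus supply dry-bulb temperature

q_kw = flow * rho * cp * dt    # ~12.06 kW vs the reported 12.083 kW
assert abs(q_kw - 12.083) < 0.2
```

The agreement within about 0.3% supports the rated figure.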

9
J. Cho, B. Park and S. Jang Energy 258 (2022) 124787

Table 7
Descriptive statistics of cooling performance; row-A with an in-row cooling package.

| Variable (row-A results) | N | Mean | StDev | Minimum | Q1 | Median | Q3 | Maximum |
|---|---|---|---|---|---|---|---|---|
| Air temperature in cold aisle [°C] | 75 | 23.081 | 0.155 | 22.708 | 22.975 | 23.158 | 23.200 | 23.450 |
| Air temperature in hot aisle [°C] | 75 | 34.645 | 0.231 | 34.100 | 34.450 | 34.750 | 34.800 | 34.925 |
| CRAH (#1) SA temperature [°C] | 75 | 19.663 | 0.110 | 19.400 | 19.633 | 19.700 | 19.700 | 20.300 |
| CRAH (#2) SA temperature [°C] | 75 | 19.596 | 0.144 | 19.340 | 19.417 | 19.640 | 19.700 | 20.000 |
| CRAH (#3) SA temperature [°C] | 75 | 20.626 | 0.117 | 20.283 | 20.500 | 20.700 | 20.700 | 20.767 |
| CRAH (#1) RA temperature [°C] | 75 | 36.495 | 0.165 | 36.117 | 36.350 | 36.533 | 36.600 | 36.800 |
| CRAH (#2) RA temperature [°C] | 75 | 34.713 | 0.237 | 34.117 | 34.500 | 34.800 | 34.880 | 35.000 |
| CRAH (#3) RA temperature [°C] | 75 | 35.552 | 0.239 | 35.000 | 35.400 | 35.583 | 35.800 | 35.800 |
| RDU (#1) CWS temperature [°C] | 75 | 7.9330 | 0.1106 | 7.7167 | 7.8500 | 7.9667 | 8.0000 | 8.1800 |
| RDU (#1) CWR temperature [°C] | 75 | 12.103 | 0.110 | 11.900 | 12.017 | 12.150 | 12.200 | 12.217 |
| CRAH (#1) fan power [kW] | 75 | 0.1946 | 0.000320 | 0.19376 | 0.19436 | 0.19459 | 0.19480 | 0.19521 |
| CRAH (#2) fan power [kW] | 75 | 0.1044 | 0.000195 | 0.10393 | 0.10424 | 0.10441 | 0.10452 | 0.10493 |
| CRAH (#3) fan power [kW] | 75 | 0.1914 | 0.000253 | 0.19078 | 0.19124 | 0.19145 | 0.19161 | 0.19188 |
| RDU (#1) pump power [kW] | 75 | 0.6828 | 0.00214 | 0.67683 | 0.68147 | 0.68243 | 0.68425 | 0.68742 |
| CRAH (#1) cooling capacity [kW] | 75 | 25.272 | 0.715 | 21.132 | 25.279 | 25.389 | 25.486 | 25.788 |
| CRAH (#2) cooling capacity [kW] | 75 | 12.408 | 0.288 | 10.381 | 12.397 | 12.453 | 12.537 | 12.610 |
| CRAH (#3) cooling capacity [kW] | 75 | 22.309 | 0.858 | 18.597 | 22.319 | 22.476 | 22.708 | 22.940 |
| RDU (#1) cooling capacity [kW] | 75 | 65.302 | 1.139 | 62.853 | 64.631 | 65.474 | 65.906 | 69.020 |
| IT power [kW] | 75 | 61.050 | – | 61.050 | 61.050 | 61.050 | 61.050 | 61.050 |
| IT cooling load [kW] | 75 | 57.998 | – | 57.998 | 57.998 | 57.998 | 57.998 | 57.998 |
| Cooling power [kW] | 75 | 1.1732 | 0.00222 | 1.1678 | 1.1719 | 1.1729 | 1.1747 | 1.1780 |
| pPUE_Cooling | 75 | 0.0192 | 0.000036 | 0.0191 | 0.0192 | 0.0192 | 0.0192 | 0.0193 |
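The average cooling pPUE reported in Table 7 follows directly from the measured component powers via Equation (3):

```python
# Reproducing the mean cooling pPUE in Table 7 from the measured powers.
fans = [0.1946, 0.1044, 0.1914]   # CRAH fan powers [kW], Table 7 means
pump = 0.6828                     # RDU liquid pump power [kW]
it_power = 61.050                 # IT power [kW]

cooling_power = sum(fans) + pump  # 1.1732 kW
ppue = cooling_power / it_power   # Eq. (3): cooling energy / IT equipment energy

assert abs(cooling_power - 1.1732) < 1e-6
```

This is the 19.22 W of cooling power per IT-kW quoted in the text.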

Fig. 8. Results of cooling performance evaluation for row-based cooling system (at row-A).

pPUE_Cooling = Cooling energy ÷ IT equipment energy   (3)

6. Data center energy efficiency analysis

It can be said that the measurement and verification (M&V) comprises a series of verification procedures for the measurement and evaluation of the energy performance and saving effect of a reference data center by considering the interaction of energy conservation measures (ECMs) applied to improve the cooling energy efficiency [26]. In this M&V procedure, the energy efficiency improvement by the row-based cooling system should be determined based on PUE. Because the annual energy efficiency cannot be directly measured, the estimated PUE can be derived by measuring and comparing the baseline power consumption data before and after the application of the row-based cooling system. In this process, both the validity of the baseline system model and the accuracy of the baseline energy consumption affect the calculation of PUE; thus, they are extremely important in terms of the reliability of M&V. For the baseline system model of the reference data center, three energy categories were set based on power usage, and the row-based cooling system model was evaluated. PUE is defined as the ratio of the total facility energy to the IT equipment energy, as expressed by Equation (4) [25]. It also refers to the degree of energy efficiency of the non-IT system relative to the IT system at a certain IT load. The energy baseline of a data center can be defined based on the baseline system model constituting the IT and non-IT systems.

PUE = Total facility energy ÷ IT equipment energy   (4)

6.1. Energy baseline for non-IT system

As shown in Fig. 9, the energy consumption structure of a data center is divided into three parts: (1) IT power, (2) cooling energy, and (3) power distribution loss [27,28]. The energy baseline for the non-

Fig. 9. Energy flow of a large data center and system configuration (modified from Ref. [27]).

IT sector of the data center for M&V is not yet clear. The Title 24 energy code [29] is the only standard that has been established as a baseline cooling system for data centers [30]. For the cooling energy based on these criteria, the process of developing an energy model is simplified. In addition, the baseline power distribution system adopted is in line with the US Energy Star [31] requirements.

6.1.1. Cooling system

Data center cooling systems are classified based on the total cooling load of 850 kW. Accordingly, depending on whether the total IT load is 850 kW or more, the baseline cooling system is categorized into air-cooled (DX) CRAC and central chilled water CRAH. The temperature difference between the SA and RA of room-based cooling is determined based on the design load at ΔT = 10 °C. All baseline cooling systems are specified with a minimum SA temperature of 15 °C. In addition, the baseline IT room is designed to maintain the indoor air temperature (of the cold aisle with containment) at 26 °C, which is the Title 24 energy modeling requirement. For data centers where the design temperature exceeds 26 °C, both the baseline system and the proposed system are modeled to operate at the design temperature of 26 °C. The CRAH cooling coil capacity is determined using Equation (5) to supply 120% of the design load:

q_CR = q_RM × 120%   (5)

Using Equation (6), the SA volume is sized to meet 120% of the design cooling load at ΔT = 10 °C between SA and RA. A fan static efficiency of 13.8 W/CMM is applied (Equation (7)).

Q_F = 49.4 · q_CR ÷ ΔT × 120%   (6)

q_F = 13.8 · Q_F   (7)

In the case of the constant air volume CRAH unit, the SA temperature adjusts between at least 15 and 26 °C to meet the IT load. The variable air volume CRAH linearly resets the air volume from 100% down to at least 50% of the design air volume. When the central chilled water system is applied, the chiller capacity is determined to meet 115% of the total CRAH cooling coils (Equation (8)).

q_CH = Q_CR × 115%   (8)

Screw chillers can lower the design capacity by 15%, and centrifugal chillers can lower the design capacity by 10%. Table 8 shows the baseline chiller efficiency requirements that vary with the chiller type and size. The chilled water supply temperature of the baseline chiller is 6.7 °C, and it can be changed from 6.7 °C to a maximum of 12.2 °C according to the cooling demand. Furthermore, the differential pressure is reset according to the cooling demand.

The baseline cooling tower is an open-circuit cooling tower. A single-cell cooling tower is connected 1:1 with each baseline chiller. Each cooling tower determines its size to meet the design cooling load at the design wet-bulb temperature. The condenser water temperature is set to 29.4 °C or the design wet-bulb temperature plus 10 °C, whichever is lower, and the cooling tower range is 10 °C. The cooling tower fan has a size of 0.75 kW based on a cooling water flow rate of 230 LPM. The cooling tower fan maintains the condenser water supply temperature with variable speed fan control (Equation (9)).

q_CT = 3.34 · Q_P(cw)   (9)

The chilled water pump is a variable primary flow pump system. The chilled water flowrate can be reduced to 30% of the design water flow. For the number of pumps, one chilled water pump is

applied to each baseline chiller, and the design flowrate (ΔT = 10 °C after passing through the evaporator) determines the size of the pump to supply 4.5 LPM/RT based on the chiller size. The pump head is 12 mAq + 9 mmAq/RT, and the upper limit is 30.5 mAq, which generally ensures 70% efficiency (Equation (10)).

q_P(ch) = (Q_P(ch) × H_P(ch) × 120%) ÷ (6120 · η_P(ch))   (10)

Condenser water pumps are constant-flow systems. One unit is applied to each baseline chiller and cooling tower. The design flow rate (ΔT = 7 °C after passing through the condenser) determines the size of the pump to supply 7.6 LPM/RT based on the size of the chiller. The pump head is 14 mAq, and the pump efficiency is generally 70% (Equation (11)).

q_P(cw) = (Q_P(cw) × H_P(cw) × 120%) ÷ (6120 · η_P(cw))   (11)

6.1.2. Power distribution system

The types of uninterruptible power supply (UPS) are divided into three categories: voltage and frequency dependent (VFD), voltage independent (VI), and voltage and frequency independent (VFI). The maximum average efficiency differs according to the UPS type, UPS size, and the communication capability. Table 9 shows the baseline efficiency for each category of UPS; the runtime fraction under each load condition used to define the average efficiency differs depending on the UPS.

Baseline UPS selection determines the number of UPSs required to meet the design IT load with a safety factor of 1.2 based on the maximum UPS capacity of 750 kVA, as in Equation (12). The same power factor is used to determine the capacity of the proposed UPS. In facilities where the protected load (including the safety factor) does not exceed 750 kVA, the UPS size is determined as equal to the protected load rounded to the nearest 50-kVA increment if the protected load exceeds 100 kVA, or to the nearest 10-kVA increment if the protected load is less than 100 kVA.

UPS_req = Round up [(IT load × 1.2) ÷ (PF × 750 kVA)]   (12)

The number of installed UPSs is adjusted according to the facility redundancy requirements. For example, in a 2-N facility, the baseline UPS quantity would be 2 × UPS_req. All baseline UPSs, including redundant units, are equally sized and run simultaneously. In general, as shown in Fig. 10, the amount of power loss is calculated by applying the UPS efficiency according to the IT load factor [32,33].

Fig. 10. Efficiency distribution for UPS and typical operation.

6.2. Reference data center

There is a limit to the data center energy efficiency analysis during operation. Because the configuration of IT equipment changes continuously on a row basis, it is impossible and meaningless to centrally evaluate the efficiency of the changed part in the non-IT system linked to the entire system. Therefore, it is important to select a row-based baseline system that can respond to the corresponding IT load. As a measure to achieve energy efficiency in the area being changed, it is possible to derive the annual PUE by implementing an energy flow structure and using statistical techniques to calculate the contribution of the cooling energy from the central chiller plant, and to derive the row-based IT power and energy consumption in the non-IT sector. A baseline energy system model was implemented based on the same reference data center as above. An IT room with a room-based cooling system is in operation with a total of four floors, and the actual measured PUE is 1.75 as of 2018. First, the PUE of the reference data center was estimated by the aforementioned method and compared with the actual PUE. Next, the PUE was compared considering the row-based cooling

Table 8
Baseline chiller efficiency for data centers.

| Chiller type | Size classification | Efficiency requirements |
|---|---|---|
| Water cooled, electrically operated positive displacement (screw/scroll) | < 75 RT | 0.800 kW/RT, 0.600 IPLV |
| | 75–150 RT | 0.790 kW/RT, 0.586 IPLV |
| | 150–300 RT | 0.718 kW/RT, 0.540 IPLV |
| | ≥ 300 RT | 0.639 kW/RT, 0.490 IPLV |
| Water cooled, centrifugal (turbo) | < 150 RT | 0.639 kW/RT, 0.450 IPLV |
| | 150–300 RT | 0.639 kW/RT, 0.450 IPLV |
| | 300–600 RT | 0.600 kW/RT, 0.400 IPLV |
| | ≥ 600 RT | 0.590 kW/RT, 0.400 IPLV |
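Equations (5) to (11) chain into a simple sizing routine. The sketch below starts from the paper's rounded 2100 kW coil figure; the variable names are mine, and reading Equation (9) as watts per LPM of condenser water is my interpretation of the text:

```python
# Baseline cooling-plant sizing chain, Equations (5)-(11).
q_cr = 2100.0                        # CRAH cooling coil capacity [kW]
dt = 10.0                            # SA/RA temperature difference [K]

q_flow = 49.4 * q_cr / dt * 1.2      # Eq. (6): supply air volume [m3/min]
fan_kw = 13.8 * q_flow / 1000        # Eq. (7): fan power at 13.8 W/CMM
chiller_kw = q_cr * 1.15             # Eq. (8): 2415 kW, i.e. about 700 RT
ct_fan_kw = 3.34 * 5320 / 1000       # Eq. (9) at the 5320 LPM condenser flow

def pump_kw(lpm, head_m, eff=0.7):
    """Eqs. (10)/(11): q_P = Q * H * 120% / (6120 * eta)."""
    return lpm * head_m * 1.2 / (6120 * eff)

chw_pump_kw = pump_kw(3150, 18.3)    # chilled water pump, ~16.2 kW
cw_pump_kw = pump_kw(5320, 14.0)     # condenser water pump, ~20.9 kW
```

The results land on the 173 kW fan, 2415 kW chiller, 16.2 kW and 20.9 kW pump, and 17.8 kW tower-fan figures used in Section 6.3.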

Table 9
Baseline efficiency for each category of UPS unit and the runtime fraction under each load condition.

| Rated output power P (W) | VFD | VI | VFI | 25% load | 50% load | 75% load | 100% load |
|---|---|---|---|---|---|---|---|
| 1500 < P ≤ 10,000, irrespective of communication capabilities | 0.970 | 0.967 | 0.0099 × ln(P) + 0.815 | 0.0 | 0.3 | 0.4 | 0.3 |
| P > 10,000, without communication capabilities | 0.970 | 0.950 | 0.0099 × ln(P) + 0.805 | 0.25 | 0.50 | 0.25 | 0.00 |
| P > 10,000, with communication capabilities | 0.960 | 0.940 | 0.0099 × ln(P) + 0.795 | | | | |
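Equation (12) and the VFI efficiency curves of Table 9 can be sketched as follows; the power factor of 0.9 is an assumed value for illustration, not a figure from the paper:

```python
import math

def ups_required(it_load_kw, pf=0.9, unit_kva=750):
    """Eq. (12): number of 750 kVA UPS units for the design IT load,
    with a 1.2 safety factor (pf is an assumed power factor)."""
    return math.ceil(it_load_kw * 1.2 / (pf * unit_kva))

def vfi_efficiency(p_watts):
    """Table 9 curve for VFI units, P > 10,000 W,
    without communication capabilities."""
    return 0.0099 * math.log(p_watts) + 0.805

n_ups = ups_required(3450)        # design IT load of the reference data center
eff = vfi_efficiency(750_000)     # efficiency of a fully loaded 750 kVA unit
```

With the assumed power factor, the 3.45 MW design load would require seven non-redundant units; a 2-N facility would then install fourteen.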


system in the baseline system. Table 10 shows the outline of the target data center.

6.3. Estimation of baseline PUE

Non-IT energy baseline identification consists of steps for the determination of the system performance, the operating factors affecting the energy consumption, and the annual energy consumption through measurement. Baseline conditions include physical, operational, and energy use data for facilities and systems. In order to assess the energy saving effect of a row-based cooling system, it would be efficient to evaluate it by comparison with the energy baseline. Energy consumption in the non-IT sector is the energy required to maintain an appropriate operating environment for IT equipment. This includes cooling systems and power distribution systems. The energy consumption of the component equipment in the non-IT sector defined above can be calculated based on the measured data of IT input power. The energy baseline of the reference data center was estimated using the energy data in 2018 (Table 11).

From the 2018 actual energy data, the rated power for the IT sector with 3.45 MW was based on 1690 kW with an average load factor of 49%. The rated power for the baseline non-IT facility was estimated using Equations (5)–(12). The cooling coil capacity of the CRAH was 2100 kW with 120% of the IT load applied, and the fan power corresponding to the total air volume of approximately 12,500 m³/min was 173 kW. For the chiller of the central chilled water system, a 700 usRT (2415 kW) turbo chiller with 115% of the total CRAH coil capacity was selected, and the power consumption was set to 413 kW based on the baseline chiller efficiency (0.590 kW/RT). The chilled water pump power of 16.2 kW was applied, corresponding to a water flow rate of 3150 L/min and a pump head of 18.3 mAq. Furthermore, the applied condenser water pump power was 20.9 kW, corresponding to a flow rate of 5320 L/min and a pump head of 14 mAq. Finally, the cooling tower fan power of 17.8 kW was reflected to calculate the energy required for the cooling system. For the cooling system, a 20% weight was added in consideration of equipment efficiency degradation and heat loss in pipes and ducts. The UPS power distribution efficiency (94%) according to the IT load condition was applied to determine the power loss of 107.9 kW, and the energy used in lighting was assumed and added up. Finally, for the power loss in the transformer and switchgear responsible for the total input power of the data center, the power distribution efficiency (98%) was applied to derive the total annual data center power consumption of 23,124,000 kWh.

Table 12 shows the energy baseline for each system in the reference data center. Estimating the PUE using 14,796,000 kWh of the total annual power used in the IT sector, it is 1.563. This PUE assumes that all equipment of the data center is operated all year round, and although it is lower than the actual PUE in 2018, it is not at a high energy-efficiency level. The ultimately derived baseline PUE is meaningful as a quantitatively comparable standard for the energy-saving effect of utilizing a row-based cooling system.

6.4. Row-based cooling efficiency evaluation

6.4.1. Comparison with existing row-based cooling

The following question can be raised: is the proposed MAC prototype more efficient than other similar available systems? As mentioned before, for conventional row-based cooling systems, CRAHs are installed between rack servers in the white space of IT rooms to efficiently cool the IT equipment. The main drawback of the existing system is water damage, which has serious effects on

Table 10
Overview of the operation conditions of a reference data center.

| Items | Descriptions | Items | Descriptions |
|---|---|---|---|
| Location (site) | Daejeon, Korea | Size | 10,129 m² (1BF/5F) |
| White space area | 9508 m² | White space ratio | 93.87% (white space area/total building area) |
| Number of IT servers | 7563 EA | Number of racks | 2257 racks |
| Number of IT routers | 172 EA | Occupied rack space | 66.7% |
| Number of IT storages | 86 EA | Occupied white space | 70.0% |
| Other IT equipment | 701 EA | Power density | 0.5 kW/m² (1.53 kW/rack) |
| Cooling system | Room-based cooling with central chilled water system | Hot aisle air temperature | Min 27.0 °C / Max 30.0 °C / Set 30.0 °C |
| | | Cold aisle air temperature | Min 23.0 °C / Max 27.0 °C / Set 25.0 °C |

Table 11
PUE of the reference data center in 2018.

| Items | Descriptions | Items | Descriptions |
|---|---|---|---|
| Year | 2018 | IT energy consumption | 14,796,093 kWh |
| Method | PUE Category 1 | Total energy consumption | 25,880,570 kWh |
| IT power measuring | UPS output | PUE | 1.75 |

Table 12
Estimation of the baseline PUE in the reference data center.

| Sector | System | Equipment | Energy estimations | Rated power [kW] | Annual energy [kWh] |
|---|---|---|---|---|---|
| IT | IT | IT equipment | Design IT load: 3.45 MW (average load ratio: 49%) | 1690 | 14,796,000 |
| Non-IT | Cooling | CRAH unit fan | Cooling coil: 1690 kW × 120% = 2100 kW; air flow rate: cooling coil (ΔT = 10 °C) × 120% = 12,500 m³/min; fan energy: 12,500 × 13.8 W/CMM = 173 kW | 173 | 1,516,000 |
| | | Chiller | Size: cooling coil 2100 kW × 115% = 2415 kW (700 RT); chiller energy: 700 RT × 0.590 kW/RT = 413 kW (COP 6.0) | 413 | 3,618,000 |
| | | Pump-1 | Chilled water pump: 4.5 LPM/RT × 700 RT = 3150 L/min; head: 12 mAq + 9 mmAq/RT × 700 RT = 12 + 6.3 = 18.3 mAq; pump energy: 16.2 kW (efficiency 70%, S.F. 20%) | 16.2 | 140,000 |
| | | Pump-2 | Condenser water pump: 7.6 LPM/RT × 700 RT = 5320 L/min; head: 14 mAq (fixed); pump energy: 20.9 kW (efficiency 70%, S.F. 20%) | 20.9 | 183,000 |
| | | Subtotal (1) | Cooling system loss (20%) | | 6,736,000 |
| | Power | UPS | Design IT load: 3.45 MW (average load ratio: 49%); operating IT load: 1690 ÷ 0.94 (UPS efficiency) = 1798 kW; power loss in UPS: 107.9 kW | 107.9 | 945,000 |
| | | Lighting | White space (9508 m²): 5 W/m² (8 h/day) | | 139,000 |
| | | | Office space (621 m²): 25 W/m² (8 h/day) | | 45,000 |
| | | Subtotal (2) | | | 184,000 |
| | | Subtotal (3) | | | 22,661,000 |
| Total energy consumption | | | Power loss in switchgear: 2% (efficiency 98%) | | 23,124,000 |
| PUE (power usage effectiveness) | | | Total energy (23,124,000 kWh) ÷ IT energy (14,796,000 kWh) | | 1.563 |
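The Table 12 roll-up can be reproduced from its component figures (including the 17.8 kW cooling tower fan mentioned in the text); a sketch in Python:

```python
# Re-deriving the Table 12 baseline PUE from its component figures.
HOURS = 8760
it_kwh = 14_796_000

# Cooling equipment rated powers [kW]: CRAH fans, chiller,
# chilled water pump, condenser pump, cooling tower fan.
cooling_kw = 173 + 413 + 16.2 + 20.9 + 17.8
cooling_kwh = cooling_kw * HOURS * 1.2    # 20% weight for losses and degradation

ups_loss_kwh = 107.9 * HOURS              # UPS loss at 94% efficiency
lighting_kwh = (9508 * 5 + 621 * 25) / 1000 * 8 * 365

subtotal = it_kwh + cooling_kwh + ups_loss_kwh + lighting_kwh
total_kwh = subtotal / 0.98               # 2% switchgear loss
pue = total_kwh / it_kwh                  # ~1.563
```

The re-derived annual total lands within rounding of the 23,124,000 kWh and PUE 1.563 stated in the table.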

the IT equipment. Another detail to consider when using low temperature chilled water (7 °C from the chiller plant) is the drainage of the condensate at the cooling coil (heat exchanger). Water damage can be minimized by using high temperature chilled water. As shown in Fig. 11, the conventional row-based cooling system should be implemented to include a low-temperature-to-high-temperature chilled water primary cycle and a high-temperature chilled water-to-air secondary cycle, such that the primary low temperature chilled water pipe does not directly supply water to the cooling coil of the CRAC/H unit, where condensation occurs. However, in this case, the primary and secondary chilled water pumps must be similar in size, which would inevitably lead to an increase in the cooling energy.

Table 13 shows the cooling partial PUE for the existing row-based cooling (ALT-1) and the new in-row cooling package (ALT-2). The two systems are the same row-based cooling systems, with the exception of the secondary cycle pump. Thus, the cooling power of ALT-1, which has a secondary chilled water pump, is 6.6% larger than that of ALT-2 with a refrigerant distribution unit.

Fig. 11. An existing row-based cooling solution with sequential water-to-water-to-air heat exchange system.

Table 13
Comparison of the cooling pPUE between the existing row-based cooling and the new in-row cooling package.

| Power [kW] | ALT-1: existing row-based cooling | ALT-2: new in-row cooling package | Remark |
|---|---|---|---|
| CRAH (#1) fan | 0.1946 | 0.1946 | Measurement results |
| CRAH (#2) fan | 0.1044 | 0.1044 | Measurement results |
| CRAH (#3) fan | 0.1914 | 0.1914 | Measurement results |
| Secondary chilled water pump | 0.7608 | – | Energy baseline |
| RDU (#1) pump | – | 0.6828 | Measurement results |
| IT power | 61.050 | 61.050 | Measurement results |
| Cooling power | 1.2512 | 1.1732 | +6.6% |
| pPUE_Cooling | 0.0205 | 0.0192 | |
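The Table 13 comparison can be verified directly from the listed powers:

```python
# Table 13 cross-check: ALT-1 vs ALT-2 cooling power and pPUE.
fans = 0.1946 + 0.1044 + 0.1914   # CRAH fan powers [kW], identical in both cases
it_kw = 61.050

alt1 = fans + 0.7608              # with secondary chilled water pump
alt2 = fans + 0.6828              # with refrigerant distribution unit pump

assert round(alt1 / it_kw, 4) == 0.0205
assert round(alt2 / it_kw, 4) == 0.0192
```

The relative penalty of the secondary chilled water pump, (ALT-1 − ALT-2)/ALT-2, is the 6.6% figure quoted in the text.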


Fig. 12. Influence of the row-based cooling on data center energy efficiency.
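The PUE scenarios of Fig. 12 can be reproduced from the Table 12 baseline; the sketch below assumes the cooling savings are applied before the 2% switchgear loss, which matches the reported values:

```python
# Reproducing the Fig. 12 PUE scenarios from the Table 12 baseline.
it_kwh = 14_796_000
subtotal_kwh = 22_661_000     # Table 12 subtotal before switchgear loss
cooling_kwh = 6_736_000       # Table 12 baseline cooling energy

def pue_with_cooling_saving(saving_fraction):
    """PUE when a fraction of the baseline cooling energy is saved."""
    total = (subtotal_kwh - cooling_kwh * saving_fraction) / 0.98
    return total / it_kwh

pue_alt2 = pue_with_cooling_saving(0.219)   # mechanical cooling only
pue_alt3 = pue_with_cooling_saving(0.435)   # with water-side economizer
```

The 21.9% and 43.5% cooling reductions reproduce the PUE 1.461 and 1.361 results discussed below Fig. 12.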

6.4.2. Cooling efficiency by using PUE

Based on the baseline PUE of the reference data center, the energy saving evaluation of the row-based cooling system is the final step in this study. According to the in-situ measurement results and the energy baseline, the energy contribution of the row-based cooling system per IT power is 20.49 W/IT-kW for ALT-1 and 19.22 W/IT-kW for ALT-2. Because the operating conditions of the central chilled water system of the reference data center are the same, even if a row-based cooling system is applied, it has a negligible effect on other systems. Therefore, the energy saving effect can be analyzed by changing only the CRAH unit part in the baseline energy consumption structure. The row-based cooling system can increase the performance of the overall equipment by raising the SA temperature more than the room-based cooling system, but in principle, the comparison is made under the same conditions and the efficiency of the equipment is not changed.

Fig. 12 shows the effect on energy consumption when a row-based cooling system is applied to the reference data center. In the case of ALT-2, if the same central chilled water system as the baseline is used and the new row-based cooling system with only mechanical cooling is applied, 21.9% of the energy of the cooling system alone can be saved by the fan power conservation of the CRAH units, and the PUE is improved to 1.461. ALT-1, the conventional row-based cooling solution, can also save cooling energy by 21.5%, with a PUE of 1.462. From an energy perspective, the energy efficiency was about the same for both ALT-1 and ALT-2. However, compared with ALT-1, ALT-2 must be considered the better cooling system in terms of data center stability: ALT-1 faces a low-probability but high-impact water damage risk during operation. In the case of ALT-3, if the new row-based cooling system with a water-side economizer is applied, 43.5% of the energy of the cooling system alone can be saved further by reducing the operating time of the central chiller plant, and the PUE is improved to 1.361. In contrast to the air-side economizer, which cannot be applied to data centers already in operation because it requires an additional outdoor air intake system and space, the water-side economizer can be applied immediately by installing heat exchangers on the coolant side of the existing central chilled water system, and should therefore be combined with a row-based cooling system. Based on the climate zone of Korea, the water-side economizer can be operated for 2930 h per year [34,35].

7. Conclusions

Following the previous numerical optimization study, in the current experimental study, a prototype of an independent MAC is developed that overcomes the limitations of the existing room-based and row-based cooling systems. The new in-row cooling package system can meet the requirements of energy efficiency without the risk of water damage. The main findings can be summarized as follows:

• In order to prevent air recirculation and bypass (chronic problems in data centers), a complete air containment prototype that shields both cold and hot aisles is presented for the first time, minimizing hot air mixing and heat loss, and maximizing airtightness.
• Efficient row-based cooling of an independent MAC is realized, and an in-row cooling package with multiple heat-transfer media, a sequential water-to-refrigerant-to-air heat exchange system, is developed for improving cooling efficiency and safety.
• In order to fundamentally prevent water damage in IT rooms, the in-row cooling package has been developed to include a water-refrigerant primary cycle and a refrigerant-air secondary cycle, such that the primary chilled water pipe does not directly pass through the IT room. Because the secondary refrigerant liquid pipe in the white space uses only sensible heat without causing any phase change, a condensate drain is not necessary.
• As a result of the rating test of cooling capacity for each component unit, the in-row cooling package provides the output of the cooling capacity corresponding to the assigned IT heat load, and the refrigerant distribution unit implements the cooling performance without encountering any problem while cooling the IT equipment.
• As a result of the total cooling performance of the in-row cooling package, thermal balance is maintained between the water-to-refrigerant primary cycle and the refrigerant-to-air secondary cycle. The row-based cooling system is analyzed, wherein the cold aisle in row-A satisfies the thermal guideline by converging the air temperature to 23.1 °C based on the supply chilled water temperature of 7.9 °C under the IT load of 61.1 kW.
• The effect of the calculated cooling partial PUE is almost insignificant, with an average of 0.019 (19.22 W/IT-kW), excluding the energy of the central chilled water system including the chiller plant and chilled water circulation pump.
• From the viewpoint of PUE, a data center energy efficiency index, the baseline non-IT system model selection has been proposed by integrating the criteria identified by the Title 24 energy code and the Energy Star program.
• The data center energy baseline can sequentially review the IT load of the IT system, derive the energy contribution of the cooling system, and estimate the non-IT system including the allocation of power loss.
• As a result of the PUE analysis including the chiller plant, to increase the usability of the in-row cooling package as a row-based cooling system, it is found that it is possible to achieve PUE 1.361 through a 43.5% reduction in the cooling system from the baseline PUE 1.563.
• The energy efficiency was approximately the same for both the new in-row cooling package and the conventional row-based cooling solution. However, the conventional system carries a low-probability but high-impact water damage risk during operation.
15
J. Cho, B. Park and S. Jang Energy 258 (2022) 124787

The new in-row cooling package can be considered as a superior evaporator for rack-backdoor cooling of data center. Appl Therm Eng
2020;164:114550.
cooling system from the perspective of data center stability.
[11] Kleyman B, Gilooly B. Keynote: state of the data centere 2020 vision. In:
Proceedings of the data center world conference; 2020. 15 april.
As confirmed through this experimental study and our previous [12] Cho J, Kim Y. Development of modular air containment system: thermal
numerical study, the novel in-row cooling package is able to secure performance optimization of row-based cooling for high-density data centers.
Energy 2021;231:120838.
both IT operational safety and economical energy efficiency. [13] Nada SA, Said MA, Rady MA. CFD investigations of data centers' thermal
Therefore, an independent MAC is necessary to expand the appli- performance for different configurations of CRACs units and aisles separation.
cation as a cooling solution for new high-density data centers. Alex Eng J 2016;55(2):959e71.
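The reported PUE figures can be cross-checked with a short back-of-the-envelope calculation. The sketch below is illustrative only: the split of the 0.563 baseline overhead into cooling (~0.464 per IT-kW) and other non-IT loads (~0.099 per IT-kW) is an assumption inferred from the reported 43.5% cooling reduction, not a breakdown taken from the measurements.

```python
# Back-of-the-envelope check of the reported PUE values,
# normalized to 1 kW of IT load. PUE = (IT + non-IT energy) / IT energy.
# ASSUMED split of the 0.563 baseline overhead: cooling ~0.464, other ~0.099.

IT_LOAD = 1.0          # everything expressed per IT-kW
COOLING = 0.464        # assumed baseline cooling overhead per IT-kW
OTHER_NON_IT = 0.099   # assumed power-distribution losses, lighting, etc.

def pue(cooling, other, it=IT_LOAD):
    """Power usage effectiveness for the given overhead components."""
    return (it + cooling + other) / it

baseline = pue(COOLING, OTHER_NON_IT)                # reported baseline
improved = pue(COOLING * (1 - 0.435), OTHER_NON_IT)  # 43.5% cooling reduction

# Partial PUE of the in-row cooling package alone:
# 19.22 W of fan/control power per IT-kW -> pPUE of ~0.019.
ppue_in_row = 19.22 / 1000.0

print(round(baseline, 3))     # 1.563
print(round(improved, 3))     # 1.361
print(round(ppue_in_row, 3))  # 0.019
```

Under this assumed split, a 43.5% cut in cooling energy reproduces the reported improvement from PUE 1.563 to 1.361, and the in-row package's 19.22 W/IT-kW corresponds to the stated 0.019 partial PUE.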
Authorship statement

Substantial contribution to conception and design: Jinkyun Cho. Substantial contribution to acquisition of data: Jinkyun Cho, Beungyong Park, Seungmin Jang. Substantial contribution to analysis and interpretation of data: Jinkyun Cho, Beungyong Park. Drafting the article: Jinkyun Cho. Critically revising the article for important intellectual content: Jinkyun Cho, Beungyong Park, Seungmin Jang. Final approval of the version to be published: Jinkyun Cho.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Ministry of Science and ICT (MSIT) (No. 2022R1F1A1068262). This work was also supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry and Energy (MOTIE) of the Republic of Korea (No. 20204030200080). The authors would like to thank Mr. Byoung-Nam Choi and Dr. Min-Geon Go from SAMWHAACE Co., Ltd. for the development of the in-row cooling package with the sequential water-to-refrigerant-to-air heat exchange system in this study.

References

[1] Masanet E, Shehabi A, Lei N, Smith S, Koomey J. Recalibrating global data center energy-use estimates. Science 2020;367(6481):984–6.
[2] Koot M, Wijnhoven F. Usage impact on data center electricity needs: a system dynamic forecasting model. Appl Energy 2021;291:116798.
[3] Cisco. Cisco global cloud index: forecast and methodology 2016–2021, Cisco white paper 1513879861264127. Cisco and/or its affiliates; 2018.
[4] O'Connell A. Forecast: data centers worldwide 2015–2022. Gartner Research G00360430. Gartner, Inc. and/or its affiliates; 2018.
[5] Beaty DL. Internal IT load profile variability. ASHRAE J 2013;55(2):72–4.
[6] Cho J. Comparing thermal performance for data center cooling. ASHRAE J 2021;63(8):44–56.
[7] Dunlap K, Rasmussen N. Choosing between room, row, and rack-based cooling for data centers, APC white paper 130 rev02. Schneider Electric – Data Center Science Center; 2012.
[8] Rake R, Baul S. Data center cooling market outlook: 2025. Allied Market Research; 2018.
[9] Abbas AM, Huzayyin AS, Mouneer TA, Nada SA. Thermal management and performance enhancement of data centers architectures using aligned/staggered in-row cooling arrangements. Case Stud Therm Eng 2021;24:100884.
[10] Zhan B, Shao S, Zhang H, Tian C. Simulation on vertical microchannel evaporator for rack-backdoor cooling of data center. Appl Therm Eng 2020;164:114550.
[11] Kleyman B, Gilooly B. Keynote: state of the data center – 2020 vision. In: Proceedings of the data center world conference; 2020 April 15.
[12] Cho J, Kim Y. Development of modular air containment system: thermal performance optimization of row-based cooling for high-density data centers. Energy 2021;231:120838.
[13] Nada SA, Said MA, Rady MA. CFD investigations of data centers' thermal performance for different configurations of CRACs units and aisles separation. Alex Eng J 2016;55(2):959–71.
[14] Nada SA, Abbas AM. Solutions of thermal management problems for terminal racks of in-row cooling architectures in data centers. Build Environ 2021;201:107991.
[15] Abbas AM, Huzayyin AS, Mouneer TA, Nada SA. Effect of data center servers' power density on the decision of using in-row cooling or perimeter cooling. Alex Eng J 2021;60(4):3855–67.
[16] Moazamigoodarzi H, Tsai PJ, Pal S, Ghosh S, Puri IK. Influence of cooling architecture on data center power consumption. Energy 2019;183(15):525–35.
[17] Moazamigoodarzi H, Gupta R, Pal S, Tsai PJ, Ghosh S, Puri IK. Modeling temperature distribution and power consumption in IT server enclosures with row-based cooling architectures. Appl Energy 2020;261:114355.
[18] Jin C, Bai X, An Y, Ni J, Shen J. Case study regarding the thermal environment and energy efficiency of raised-floor and row-based cooling. Build Environ 2020;182:107110.
[19] Gupta R, Asgari S, Moazamigoodarzi H, Pal S, Puri IK. Cooling architecture selection for air-cooled data centers by minimizing exergy destruction. Energy 2020;201:117625.
[20] Cho J, Woo J. Development and experimental study of an independent row-based cooling system for improving thermal performance of a data center. Appl Therm Eng 2020;169:114857.
[21] ASHRAE TC 9.9. Thermal guidelines for data processing environments. Atlanta, GA: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.; 2015.
[22] ASHRAE. ANSI/ASHRAE standard 127-2012: method of testing for rating computer and data processing room unitary air conditioners. Atlanta, GA: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc.; 2012.
[23] AHRI. AHRI standard 1361-2017: performance rating of computer and data processing room air conditioners. Arlington, VA: Air-Conditioning, Heating and Refrigeration Institute; 2017.
[24] ISO. ISO/IEC Guide 98-3:2008, Uncertainty of measurement – Part 3: guide to the expression of uncertainty in measurement (GUM:1995). Geneva: International Organization for Standardization; 2008.
[25] Avelar V, Azevedo D, French A. PUE™: a comprehensive examination of the metric, the Green Grid white paper #49. The Green Grid Technical Committee; 2012.
[26] Federal Energy Management Program. M&V guidelines: measurement and verification for performance-based contracts, version 4.0. U.S. Department of Energy; 2015.
[27] Díaz AJ, Cáceres R, Torres R, Cardemil JM, Silva-Llanca L. Effect of climate conditions on the thermodynamic performance of a data center cooling system under water-side economization. Energy Build 2020;208:109634.
[28] Cho J. An analysis of the data center energy consumption structure for efficient energy utilization. J Architect Inst Korea 2021;37(8):153–64.
[29] The California Energy Commission. Title 24: building energy efficiency standards. State of California; 2016.
[30] Stein J, Gill B. PG&E data center baseline and measurement and verification (M&V) guidelines. San Francisco, CA: Pacific Gas and Electric Company; 2016.
[31] ENERGY STAR. Program requirements for uninterruptible power supplies (UPSs): eligibility criteria (Rev. Jul-2012).
[32] Zhang Q, Shi W. Energy-efficient workload placement in enterprise datacenters. Computer 2016;49(2):46–52.
[33] AL-Hazemi F, Peng Y, Youn CH, Lorincz J, Li C, Song G, Boutaba R. Dynamic allocation of power delivery paths in consolidated data centers based on adaptive UPS switching. Comput Network 2018;144:254–70.
[34] Cho J, Lim T, Kim BS. Viability of data center cooling systems for energy efficiency in temperate or subtropical regions: case study. Energy Build 2012;55(12):189–97.
[35] Cho J, Kim Y. Improving energy efficiency of dedicated cooling system and its contribution towards meeting an energy-optimized data center. Appl Energy 2016;165:967–82.