
Renewable and Sustainable Energy Reviews 142 (2021) 110842


Benchmarking the practice of validation and uncertainty analysis of building energy models

K.E. Anders Ohlsson *, Thomas Olofsson

Department of Applied Physics and Electronics, Umeå University, Umeå, Sweden

Keywords: Model validation; Uncertainty analysis; Building energy performance; Building information modelling; Validation experiment; Building performance gap

Abstract

The practice of validation and uncertainty analysis (UA) of building energy models (BEM) is critically reviewed. The background for this review is the recognized need for improvement of the accuracy in prediction of building energy performance, e.g. as part of an efficient mitigation response to climate change. The review was performed using benchmark comparison to verification and validation (V&V) frameworks obtained from the field of scientific computing. First, the current practice of V&V of BEM was reviewed, with a special focus on UA, and on the existing validation experiments (VE) used to provide the measurement data required for validation of BEM. Second, the review included a case study on the V&V of the European and International standard BEMs, CEN ISO 13790 and 52016–1, for calculation of the hourly energy use for space heating and cooling. From the perspective of the benchmark V&V frameworks, the conclusion was that these standard models cannot be considered as validated. Based on the present review, suggestions are given on how to strengthen the Building Information Modelling (BIM) initiative in its support for the development of accurate BEM. Finally, scientific challenges in terms of V&V of BEM are identified, where the most important is to increase the degree of consensus among scientists on the procedure for V&V as a condition for creating scientifically based standard BEMs.

1. Introduction

1.1. Building energy performance (BEP) gap

The IPCC report Climate Change 2014 stated that there had been an observed global warming between the years 1951 and 2010 [1]. It concluded that it is extremely likely that anthropogenic greenhouse gas (GHG) emissions, together with other anthropogenic drivers, have been the dominant cause of this warming. The major part of anthropogenic GHG consists of carbon dioxide, mainly derived from the combustion of fossil fuels as an energy source. For this reason, energy efficiency and savings are regarded as one important mitigation response for reducing GHG emissions [1].

Worldwide, the building sector was estimated to use on average 24% of the supplied energy, while this fraction was 37% for the EU and 40% for the USA, by the year 2004 [2]. For buildings in the EU and USA, HVAC uses around 60% of supplied energy in the residential sector, and approximately 50% in the commercial sector [2]. For the residential sector, the forecast is that the global energy demand for air-conditioning will increase by 72% by the year 2100 due to climate change [3].

It was against this background that the EU Directive 2010/31/EU was adopted with the purpose of promoting the improvement of the BEP within the European Union [4]. The directive includes the application of minimum requirements to the BEP of new and existing buildings, and also of parts of the building envelope subject to retrofitting or replacement. EU member states shall apply a methodology for calculating the energy performance of buildings, and in doing so take into account relevant European (CEN) standards. The methodology shall include consideration of the following actual building thermal characteristics: thermal capacity, insulation, and thermal bridges [4].

Concurrently with increasing demands for energy-efficient buildings, a substantial gap between calculated and measured BEP has sometimes been recognized [5–7]. In a recent review by de Wilde [5], the possible causes of the BEP gap were identified. For example, the thermal models could be inaccurate, weather models used in the simulation may not correspond to real on-site weather conditions, and occupancy patterns and HVAC operation performance may differ from the assumptions made at the design stage.

* Corresponding author. Department of Applied Physics and Electronics, Umeå University, S-90187, Umeå, Sweden.
E-mail address: anders.ohlsson@umu.se (K.E.A. Ohlsson).

https://doi.org/10.1016/j.rser.2021.110842
Received 18 February 2020; Received in revised form 15 December 2020; Accepted 15 February 2021
Available online 4 March 2021
1364-0321/© 2021 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

Abbreviations

AgV Aggregated Variable
BBM Black-Box Model
BEM Building Energy Model/Modelling
BEP Building Energy Performance
BIM Building Information Modelling
C Thermal Capacity
CEN European Committee for Standardization
CS Coleman and Stern
CV Cross-Validation Index
DP Double Pane
DSA Differential Sensitivity Analysis
FSTB Full Scale Test Building
GBM Grey-Box Model
GHG Green-House Gas
IFC Industry Foundation Classes
MCA Monte Carlo Analysis
OO Object-Oriented
OTC Outdoor Test Cell
PBA Probability Bounds Analysis
PCA Principal Component Analysis
QA Quality Assurance
QOI Quantity of Interest
R Thermal Resistance
ResA Residual Analysis
SA Sensitivity Analysis
SC Scientific Community
TC Technical Committee
UA Uncertainty Analysis
VD Validation Domain
VE Validation Experiment
V&V Verification and Validation
WBM White-Box Model

1.2. Building Information Modelling (BIM) for building energy models (BEMs)

Tuohy and Murphy [6] reviewed current initiatives to improve the BEP and to reduce the gap between predicted and actual performance. They especially emphasized the Building Information Modelling (BIM) and management initiatives. In general, the BIM concept facilitates the exchange of information between software and partners within an industrial sector. It includes a digital representation of the building structure (or more generally, the product structure), e.g. the 3D building geometry and information on thermal properties of building materials, to be used throughout the building life-cycle [8–10]. Initially, BIM was mainly applied at the design and construction stages of new buildings, but is now increasingly used for existing buildings, in maintenance, refurbishment, and de-construction [11–13]. The BIM concept (product information modelling) has previously been successfully applied in other industries, e.g. the electronics and automotive industries, to enhance productivity and competitiveness [6]. Key factors in the success of these BIM benchmark industries have been the customer and media focus on actual performance, and the quality assurance (QA) systems approach to the manufacturing process. Presently, with regard to BEP, the building industry's main focus is on predicted performance, not actual performance, which could be one important reason why the observed performance gap has not yet been adequately addressed [6].

Rees [7] pointed out that currently used building energy models were developed in the 1980s, and some simplifications of the model form, made at that time, still persist. For example, heat conduction through the building envelope is often modeled assuming heat transfer to be a one-dimensional (1D) process, while 2D and 3D heat transfer is neglected. The reason for making this assumption was probably the limited computational capacity available 30–40 years ago, which restricted the complexity of the model to be simulated, both in terms of dimensionality and spatial detail. Today, with high-performance computing and high-resolution 3D geometry, BIM models should allow the further development of detailed 3D building energy models, which more accurately predict the BEP of real buildings [7].

Based on the aforementioned reasons, the successful implementation of the BIM concept for improved BEP is strongly connected to the accuracy of BEM predictions. Accuracy in model predictions is achieved by model validation. Validation is the process where it is determined how well the model prediction of the energy performance Quantity of Interest (QOI) agrees with real-world measurements of the same quantity acquired in a Validation Experiment (VE). Validation is always performed from the perspective of the intended use of the model. For instance, different usages might imply different levels of required accuracy in the predictions.

In model validation, the comparison of simulated and measured values requires that uncertainties in these data have been estimated and analyzed. UA includes the estimation of simulation and measurement uncertainties, and the identification and quantification of their uncertainty components. For example, one source of uncertainty in the model simulation result could be erroneous or too simplified model equations. Errors in model input variables and errors originating from the numerical solving of model equations may also cause uncertainty in the simulation result.

The BIM vision is that builders should have a common standardized language for digital representation of the building and exchange of information between partners and software. This aims at facilitating cost-effective and high-quality production of buildings, which is a highly complex process where building systems are designed, constructed and operated by several different organizations in cooperation. For BIM 3D geometry models, with accompanying information on building elements, a common language has been realized by standardization of the data format for 3D geometry (Industry Foundation Classes (IFC); ISO 16739:2013; [14]).

There are several important reasons to apply the BIM vision of standardized language and procedures to support the development of accurate building energy model (BEM) simulations, for example: (i) contribute to increased energy efficiency as a mitigation response for reducing GHG emissions; (ii) support validation and UA of BEM; (iii) support regulatory compliance evaluation; (iv) support QA and marketing of high BEP; (v) support and increase the speed of the BEP design process (cf. reviews in Refs. [15,16]); (vi) support decisions on the economy of different design or operation alternatives for BEP (e.g. Ref. [17]). Although BEM, and BEM based on BIM, have been previously reviewed [15,18–23], there is still a need for a critical review on the validation of BEM. The present work will provide this, using, as in Ref. [6], the benchmarking approach.

1.3. Benchmarking approach

Benchmarking usually involves the comparison of an industrial production process against the perceived best practice within the same or a similar industrial sector. For the purpose of the present work, the benchmark reference practice of validation and UA for fluid mechanics and heat transfer models was selected from the field of scientific computing. The reasons for this choice were that this benchmark practice has several important advantages, as described in section 2, and that it has been applied to the modelling of several high-risk processes, within e.g. the shipbuilding, aerospace and nuclear industries, where accuracy in model predictions is of importance for maintaining safety.


In this work, there were two different approaches to benchmarking comparisons: First, a review of the validation practice for BEM, especially including the existing VEs, provided the comparison in a broad perspective. Second, a case study was performed to give a more detailed perspective. For this case study, two European and International standard BEM models were selected because of their conceptual and mathematical simplicity, and because they formally fulfill the requirements of the EU Directive 2010/31/EU, and therefore their use potentially may become mandatory within some EU member states.

1.4. Objectives

The main objective of the present work was to critically review the state of the art of validation and UA of BEM, using benchmarking comparisons: (1) Review of the practice of validation of BEM, including a comprehensive review of existing VEs. (2) Case study, which involves the hourly BEM models of the CEN ISO standards 13790 and 52016–1, both for calculation of energy needs for heating or cooling of building spaces.

Based on these benchmarking comparisons, the intention was to identify important challenges with regard to validation and UA for BEM, and to suggest how the BIM initiative could be extended to strengthen its support for the development of accurate BEMs.

2. Benchmarking frameworks

In scientific computing, there are two fundamental terms that need to be clearly defined from the beginning: Verification and Validation (V&V). These terms will here be defined as within the fields of solid mechanics (ASME, 2006; [24]) and computational fluid dynamics and heat transfer (AIAA, 1998, [25]; ASME, 2009, [26]) (compiled from Refs. [27–29]):

• Verification: The process of determining that the numerical solutions to model equations are correct.
• Validation: The process of determining the degree to which model equations accurately represent the real world, described by measured data, from the perspective of the intended use of the model.

The latter definition implies that accuracy requirements in the validation process are determined by the intended use of the simulation QOI result. The evaluation of accuracy is based on a comparison between simulated and experimental data, and on UA. With a change of the intended use, the accuracy requirements on the value of the simulated QOI may change, and therefore renewed model validation might be necessary.

In the following two subsections, two different benchmark V&V frameworks are presented: (1) the CS-V&V framework by Coleman and Stern (CS) (ship hydrodynamics research, 1997, [28,29]; ASME, 2009, [26]), and (2) the Sandia-V&V framework by Oberkampf et al. at Sandia National Laboratories (nuclear energy and nuclear weapons research, 2004–2011, [27,30–33]).

Fig. 1. Verification and validation of a model of a physical object (exemplified by a building), where QOI is the quantity of interest, determined either as S, the simulation result, or as D, the measurement result. Results from the validation experiment (lower part): E is the validation comparison error (filled dots). The vertical bars show the confidence interval ±k·u_val, where u_val is the validation uncertainty of the validation experiment and k is the coverage factor. Case 1: |E| ≫ k·u_val; Case 2: |E| ≤ k·u_val.


2.1. The CS-V&V framework

The CS-V&V framework applied to a model of a physical object is illustrated in Fig. 1. Validation requires that the simulated QOI, obtained by numerical solving of the model equations, is compared to the QOI measured on the real physical object in a validation experiment (VE). The mean values and uncertainties of the model input variables during the VE should be measured or otherwise estimated. In the verification step, the numerical uncertainty due to solving the model equations is estimated. In the validation step, the model solution (S) for the QOI is compared with the measured data (D) for the QOI, using the validation metric E. In the CS-V&V framework [28,29], E is the validation comparison error, defined as:

E = S − D   (1)

Assuming that T is the true value of the QOI, the error in the solution value is the difference between S and T,

δ_S = S − T   (2)

and the error in the measured value is given by

δ_D = D − T   (3)

Rearrangement of Eqs. (1)–(3) yields

E = S − D = δ_S − δ_D   (4)

which shows that E is a combination of all errors in simulation and measurement. The simulation error δ_S could be expressed as:

δ_S = δ_model + δ_num + δ_input   (5)

where δ_model is the error due to selection of model equations, δ_num is the error caused by numerical solution of model equations, and δ_input is the error in the simulated QOI result due to errors in model input parameters, e.g. property values and boundary conditions.

By inserting the solution error components into Eq. (4) and rearranging, we can estimate δ_model by

δ_model = E − (δ_num + δ_input − δ_D)   (6)

The error combination δ_val = (δ_num + δ_input − δ_D) has a distribution whose standard deviation equals the validation uncertainty u_val, defined as:

u_val = √(u_num² + u_input² + u_D²)   (7)

where u_num, u_input, and u_D are the standard uncertainties of the distributions of the corresponding errors. In the CS-V&V framework, u_D is estimated in agreement with the ISO standard "Guide to the Expression of Uncertainty in Measurement" (GUM; [34]). The u_input is estimated by the propagation of the input uncertainties through the model to the simulated QOI value. The estimate of the numerical uncertainty is given by

u_num = √(Σ_i u_n,i²)   (8)

where the uncertainty components u_n,i derive from various numerical error sources, for example the iterative solution error, and errors due to spatial and temporal discretization [29].

Finally, the validation results are interpreted ([28], sections 6–7; [35]) to mean that E is an estimate of δ_model and u_val is the standard deviation of the distribution of δ_model. Therefore δ_model falls within the interval E ± k·u_val with a high degree of confidence, where k is the coverage factor (for normal, triangular, or uniform distributions, k is between 2 and 3). The validation results are interpreted as follows [28,29,35]:

Case 1. If |E| ≫ k·u_val, then δ_model is estimated to approximately equal E.

Case 2. If |E| ≤ k·u_val, then probably δ_model ≲ δ_val.

In Case 1, the model is significantly in error and we obtain an estimate of the magnitude and sign of this error, i.e. δ_model ≈ E. However, in Case 2, the result tells us that δ_model is small compared to other errors, but it also tells us that the quality of the VE is insufficient for evaluation of δ_model, since u_val is too large [35]. We note that, in both Case 1 and Case 2, the model might still be considered as validated for its intended purpose, if the required maximum value of δ_S is > E in Case 1 and > k·u_val in Case 2. The validation has failed if this validation requirement is not fulfilled. Then, there are two ways to proceed.

First, assume, as in Case 1 (see Fig. 1), that there is a significant model error δ_model ≈ E, and that validation has failed because this error is larger than what is required for the intended use of the model. Then there are two options for the continuation of the validation process:

1. Based on an understanding of the physics behind the model, the model equations are changed with the purpose of reducing δ_model, and a new VE is performed and evaluated. If all necessary model inputs for the revised model were measured in the previous VE, then these data can be reused.
2. "Calibration" of the model, which is the process where the values of model parameters, i.e. model inputs, are changed, so that model predictions make a better fit with existing experimental data for the QOI. This approach is controversial since: (i) it does not improve the mathematical structure of the model; (ii) there is often a multitude of model parameter combinations that fit equally well to existing data, and it is unclear which combination to choose; (iii) the new set of parameter values may not correspond to physically realistic values, and therefore the model may lose its capacity to support understanding of the physics behind the model; (iv) a new set of experimental data is needed for validation of the calibrated model, since data from the original VE were used in the revision of the original model, and are thereby incorporated in the revised model. However, for some intended uses of the model predictions, model calibration might be justified.

Second, assume instead, as in Case 2 (see Fig. 1), that the model error could not be estimated, since it is not discernible due to the large value of u_val (= √(u_num² + u_input² + u_D²)) obtained for this VE, and assume that validation has failed. In this case the validation process may proceed after reduction of the major u_val component(s):

1. Reduction of u_num by improving the numerical methods used to calculate the solution.
2. Reduction of u_input by performing a new VE, where the effects of model input uncertainties are smaller. This usually requires a sensitivity analysis (SA) to identify and quantify the major individual effects on the QOI from the uncertainty in each of the model input variables. Input variables that have major effects on the QOI should be more accurately measured in the new VE. For example: (i) occupancy influences the energy balance of a building in several ways, and u_input may be reduced by more accurate measurement of e.g. the number of occupants over time and their release of metabolic heat; (ii) outdoor climate is another important model input variable, where e.g. measurement data on the local building climate may reduce u_input, instead of using data from regional weather models.
3. Reduction of u_D by reducing the measurement uncertainty in the measurement of the QOI, possibly by using a more accurate measurement methodology, and then performing a new VE.

If successful, some of these alternatives for Case 2 achieve the required reduction of u_val down to levels where the outcome of the (possibly new) VE is given by Case 1.

In summary, the procedure for evaluation of a VE within the CS-V&V framework consists of the following steps:


1. Estimation of the numerical uncertainty u_num in the model verification stage.
2. Estimation of the measurement uncertainty u_D in the VE.
3. Estimation of the model input uncertainty u_input in the VE.
4. Calculation of the validation uncertainty u_val = √(u_num² + u_input² + u_D²).
5. Simulation (S) and measurement (D) of the value of the QOI for the conditions of the VE.
6. Calculation of the validation comparison error E = S − D.
7. Comparison of |E| against k·u_val.
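Steps 4–7 of this procedure can be sketched in a few lines of Python. This is a minimal illustration, not part of any cited standard: the function name is invented here, and the sharp threshold used for Case 1 is a simplification of the framework's |E| ≫ k·u_val condition.

```python
import math

def evaluate_validation(S, D, u_num, u_input, u_D, k=2.0):
    """Steps 4-7 of the CS-V&V evaluation procedure (illustrative sketch).

    S, D: simulated and measured QOI values.
    u_num, u_input, u_D: standard uncertainties from steps 1-3.
    k: coverage factor (between 2 and 3 for normal, triangular,
       or uniform distributions).
    """
    u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)  # Eq. (7)
    E = S - D                                          # Eq. (1)
    if abs(E) > k * u_val:
        case = 1   # |E| >> k*u_val: E estimates the model error delta_model
    else:
        case = 2   # |E| <= k*u_val: delta_model not discernible; the VE
                   # quality (u_val) limits the evaluation
    return E, u_val, case
```

For example, with S = 105, D = 100 and unit component uncertainties, E = 5 and u_val = √3 ≈ 1.73, so with k = 2 the comparison falls under Case 1 and E estimates δ_model.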

2.2. The Sandia-V&V framework

The Sandia-V&V framework differs from the CS-V&V framework in mainly the following aspects [30,31]:

• Validation consists of three stages: (1) performance of the VE and evaluation using the validation metrics; (2) prediction in the application domain where no experimental data exist; (3) the decision on the adequacy of the model for its intended use.
• Non-deterministic simulations, where predictions are probability distribution functions. The rationale for this is that deterministic simulations, where mean input values are used, neglect much information inherent in the probability distributions. This might be of importance for the intended use of the model. With predictions expressed as a probability distribution, the validation metric has to be defined differently compared to Eq. (1) [31,33].
• A distinction is made between aleatory (variable; random) uncertainty and epistemic uncertainty (caused by lack of knowledge; systematic) [30,31,36]. The propagation of these two kinds of uncertainties through the model differs, since the aleatory uncertainties are expressed using probability distributions, while the epistemic uncertainties are described using interval-valued quantities and probability bounds analysis (PBA). For example, it has been proposed [30] that numerical errors are non-random, but with unknown sign, and therefore the combined numerical uncertainty should be estimated as u_num = Σ_i |u_n,i| (instead of using Eq. (8)).
• The validation domain is defined in the model parameter space by the series of existing VEs. Each VE is represented by one point in model parameter space, and the UA yields the error structure for that point, i.e. values for E, u_val, and its components. The error structure of the validation domain could then be modeled, for example by regression fit of uncertainty estimates across one model parameter interval spanned by the VEs.
• The concept of prediction is defined as [30,31]: prediction is the use of a model to foretell the state of a system under conditions for which the model has not been validated, i.e. where no experimental data are available. The uncertainty in the prediction, u_pred, is based on the model of the error structure established for the validation domain. If the point of model application is located outside the validation domain, then u_pred is estimated by extrapolation of the error structure model (see Fig. 2).

Roache [35] discussed some current issues in computational V&V, and also compared the Sandia-V&V framework to the CS-V&V framework, as standardized by ASME in 2009 [26]. The main conclusions were that: (i) due to the complexity of PBA, it is defensible to use the Sandia-V&V framework mostly in high-risk applications, where low-probability safety-critical events must be avoided; (ii) the weakest link in V&V is the widespread problem of inadequate measurements and documentation of experimental conditions in VEs.

We believe that the most important contribution by the Sandia-V&V framework to BEM simulations is the concept of the validation domain (VD), and that prediction involves the extrapolation of the error structure to the point of application. This is illustrated in Fig. 2, where, for simplicity, the model has only two input variables. The prediction of the QOI, by definition, is performed at application points where experimental data are not available. For this reason, there is a prediction error in the simulated result at the point of application, which could be estimated using the model of the error structure for the VD [31].

Fig. 2. Validation domain (grey area), spanned by existing validation experiments (crosses), for a model with two model input variables X1 and X2. Points of application of the model for prediction are shown inside (filled circle) or outside (open circle) the validation domain.

The concept of a VD is useful both at the validation and application stages. The modeler could evaluate the distance between the application point and the existing VEs, and thereby estimate the degree of trust in the model predictions obtained. If the VD needs to be extended to allow the inclusion of intended application points, then the description of the VD, e.g. as in Fig. 2, could give an important indication of where, in the domain geometry, additional VEs should be positioned.
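The contrast between aleatory and epistemic propagation, and between the two combinations of numerical error components, can be sketched as follows. The toy heat-loss model and all numeric values here are illustrative assumptions, not taken from the cited frameworks; evaluating the epistemic interval at its endpoints presumes a monotone model and is far simpler than full PBA.

```python
import math
import random
import statistics

# Toy steady-state heat-loss model: Q = U * A * (Ti - Te), with overall
# U-value U, envelope area A, and indoor/outdoor temperatures Ti, Te.
def model(U, A, Ti, Te):
    return U * A * (Ti - Te)

# Aleatory uncertainty (random variability): propagate by sampling the
# input probability distributions through the model (Monte Carlo).
rng = random.Random(0)
samples = [
    model(rng.gauss(0.3, 0.03), 100.0, rng.gauss(21.0, 0.2), rng.gauss(0.0, 0.5))
    for _ in range(20_000)
]
u_aleatory = statistics.stdev(samples)

# Epistemic uncertainty (lack of knowledge): carry an interval for U
# through the model; for this monotone model the output bounds are
# obtained by evaluating at the interval endpoints.
U_lo, U_hi = 0.25, 0.35
Q_bounds = (model(U_lo, 100.0, 21.0, 0.0), model(U_hi, 100.0, 21.0, 0.0))

# Numerical error components of non-random but unknown sign: combine by
# the absolute sum proposed in the Sandia framework, instead of Eq. (8).
u_n = [0.8, 0.5, 0.3]
u_num_abs = sum(abs(u) for u in u_n)           # Sandia proposal
u_num_rss = math.sqrt(sum(u**2 for u in u_n))  # Eq. (8), for comparison
```

Since Σ|u_n,i| ≥ √(Σ u_n,i²) always holds, the absolute-sum combination is the more conservative estimate of u_num, consistent with the high-risk applications the Sandia framework targets.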


“Comparative testing”, where simulations using several different BEM software are applied to the same artificial test case, where one software is regarded as a reference, which is supposed to deliver solutions of superior accuracy. (iii) “Empirical validation”, where model solutions are compared to experimental data. Within the benchmark V&V frameworks of Sandia and CS, as well as in the V&V standards of ASME and AIAA [24–26], the activities (i) and (ii) should be classified as verification activities, although they rarely deliver estimates of u_num.

The method used for the literature search is based on the fact that each individual scientific journal article is connected, by science citations, to a number of previous scientific articles, at least one. No scientific article, as far as we know, stands by itself, without reference to other scientific works. For searching the scientific literature on a specific subject, for example “validation of BEM”, we therefore started with a few known articles on this subject, and then climbed the network of articles connected by science citations, backward and forward in time, searching for new articles relevant to the search subject. This process was continued until it ceased to produce new search results. Within a scientific domain, e.g. building physics, we believe this method should yield a rather complete overview of the scientific literature on a specific topic. We noted, however, that on the subject of V&V of computer models, there was very little connection between building physics and other scientific domains, such as those associated with the Sandia- and CS-V&V frameworks.

3.1. Verification of numerical solution

Table 1 summarizes the literature search on important verification tests for BEM models. Two of these form the basis for standards on “validation” (= verification, using the present definition) procedures: (i) the American ASHRAE standard 140 [42], and (ii) the European standard EN 15265 issued by CEN [43]. The works given in Table 1 make a distinction, as mentioned above, between analytical validation (here called: analytical testing) and comparative testing:

Analytical testing of BEM models involves the comparison of model solutions to exact and known mathematical solutions to test cases [38,44,45]. Exact analytical solutions are available for simple cases, i.e. where geometry is one-dimensional and only one single mechanism of heat transfer is involved. This holds for heat conduction through a plane [44].

Comparative testing of BEM models usually consists of modelling ordered series of constructed test cases. The test series starts with a simple reference case. It is followed by cases of increasing complexity, obtained by adding new features to the reference case. Often several different BEM software are tested simultaneously. Then the comparison is between software, which presumably use different numerical algorithms and different codes, and where model inputs are provided by different individual modelers. The comparison can also be between different algorithms or modelling approaches within the same BEM software, with model input kept constant.

The EU Directive 2010/31/EU on BEP [4] requires that any relevant CEN standard should be taken into account. In the following, we will therefore critically examine the CEN standard EN 15265, from the year 2007, for “validation” of BEM for prediction of energy needs for space heating and cooling. The EN 15265 standard relies solely on comparative testing using a series of constructed computational model test cases. The EN 15265 test cases involve the thermal modelling of a box-shaped single-room building, with a large window installed in the 10 m² front side wall, which is directed westwards (see Fig. 3). The 7.0 m² window has double-pane (DP) glazing, with or without an external shading device. The front wall is exposed to the external climate of Paris (Trappes) in France, with hourly weather data given for one year. Table 2 summarizes the 12 test cases for EN 15265. In tests 1–8 (Table 2), the ceiling, the floor, and all walls, except the front wall, are adiabatic, i.e. there is no heat flux across their surfaces, while in tests 9–12 there is an external non-adiabatic roof present. The ventilation rate is 1 air change per hour during office hours (8 h–18 h, Monday to Friday), with inlet air temperature continuously controlled all days of the week.

In tests 5–12 (see Table 2), intermittent heating and cooling are performed in order to maintain the indoor air temperature between the set points at 20 °C and 26 °C, respectively. Annual heating and cooling energy needs (= QOI) are evaluated as relative differences against reference results given in the standard. Level A, B, and C accuracy requires less than 5%, 10%, and 15% difference, respectively. The standard specifies the composition and thermo-physical properties of the building envelope elements. For the glazing system, surface and cavity resistances are given, together with values of the thermal transmittance and solar energy transmittance. Thermal bridges are neglected.

In the perspective of the benchmark Sandia- and CS-V&V frame-
homogeneous slab wall, where the steady-state solution as well as the works, the examination of the EN 15265 could be summarized by the
dynamic solution, caused by cyclic excitation of its surface temperature, following critical viewpoints:
could be expressed exactly using mathematical elementary functions
1. The term “validation” is used, without an explicit definition. The
Table 1 “validation” procedure seems to qualify more as a verification pro­
Verification tests for building energy models. cedure, as defined within the benchmark CS- and Sandia-V&V
Source Year Type QOI Comments frameworks. The reason is that it involves the comparison of solu­
tions against “truth” solutions to the test cases, calculated using a
NREL [38] 1983 A T, U Heating of plane slab/Parameters:
window, solar radiation, infiltration, software with presumably superior accuracy, but without disclosing
thermal mass
BRE [44] 1992 A T Heating of plane slab/Steady-state and
dynamic tests
IEA/NREL 1995, C T, IEA BESTEST series of test cases,
[42,46] 2013 HC included in ASHRAE standard 140.
Single room building with windows
ASHRAE 2002 A* T, 16 test cases/Cubic single room
[45] HC building with window
EN 15265 2007 C HC 4 initial tests + 8 validation tests/Single
[43] room building with window

Sources: National Renewable Energy Laboratory (NREL) of the U.S. Department


of Energy, Building Research Establishment (BRE; UK), International Energy
Agency (IEA), American Society of Heating Refrigerating and Air-Conditioning
Engineers (ASHRAE), and European Standard (EN; CEN, Brussels). Type of
test: A is analytical, and C is comparative testing. Quantities of interest (QOI) are
the model outputs: T is the temperature, U is the thermal transmittance, and HC
is the energy need for space heating or cooling to a constant temperature.(*)
Contains also non-analytical (empirical) model components, i.e. for convection Fig. 3. Building construction used in the EN 15265 test suite. The building
coefficients and diffuse solar radiation. front side has a window (grey area) inserted.

6
K.E.A. Ohlsson and T. Olofsson Renewable and Sustainable Energy Reviews 142 (2021) 110842

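To make the EN 15265 acceptance criteria described above concrete, the following sketch classifies a simulated annual energy need into accuracy level A, B, or C from its relative difference against the reference result. The function, its name, and the example numbers are our own illustration, not part of the standard text:

```python
def en15265_accuracy_class(q_sim, q_ref):
    """Classify a simulated annual energy need against an EN 15265
    reference result, using the relative-difference thresholds for
    accuracy levels A (<5%), B (<10%), and C (<15%)."""
    rel_diff = abs(q_sim - q_ref) / q_ref
    for level, limit in (("A", 0.05), ("B", 0.10), ("C", 0.15)):
        if rel_diff < limit:
            return level, rel_diff
    return "fail", rel_diff

# Hypothetical test-case result: simulated 103 kWh vs reference 100 kWh
level, rd = en15265_accuracy_class(103.0, 100.0)
print(level, round(rd, 3))  # → A 0.03
```

Note that a result is accepted at the weakest level it satisfies; as the critical viewpoints below point out, nothing in this check separates numerical solution error from model input or programming error.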
the identity of this software. From Strachan et al. [41], it seems likely that the “true” solutions were obtained using the ESP-r software.
2. According to the benchmark frameworks, validation is always performed from the perspective of the intended use of the model. In EN 15265, the testing objectives are not explicitly given.
3. The sequence of 12 test cases is described using 6 model parameters, each varied between two different values (cf. Table 2). The reasons for selecting this design are not provided. Within the Sandia V&V framework, it would probably have been more suitable to select a design that supports modelling of the error structure across the VD, e.g. a fractional factorial design.
4. The sequence of test cases implicitly defines the VD, expressed in terms of 6 model parameters and their ranges of variation. However, EN 15265 does not give the reasons for this definition of the VD, which reduces the confidence in applying the model outside the domain borders.
5. The accuracy requirements for the solutions range from 5% up to 15%, depending upon the accuracy class. Even the lower limit of this range (5%) seems to be a weak requirement if only the numerical accuracy is included in the accuracy estimate. Ideally, one wants the numerical error to be negligible in comparison to other error sources, e.g. experimental error, and therefore the target level for the numerical error is often set at below 1%, perhaps even at 0.1% [29,30]. One reason for allowing 5%–15% accuracy might be that model input and programming errors, by human mistakes, are included.
6. In EN 15265 it is not clear what kind of errors are included in the accuracy requirements. Within the benchmark V&V frameworks, the numerical errors, e.g. the iterative and discretization errors, are identified and quantified [29,30].

Table 2
EN 15265 test cases.

Test | Heating | Glazing | Internal gain (W m−2) | External roof | Thermal mass (ceiling/floor)
1 | 0 | Sh DP | 20 | No | H/H
2 | 0 | Sh DP | 20 | No | L/L
3 | 0 | Sh DP | 0 | No | H/H
4 | 0 | DP | 20 | No | H/H
5 | HC | Sh DP | 20 | No | H/H
6 | HC | Sh DP | 20 | No | L/L
7 | HC | Sh DP | 0 | No | H/H
8 | HC | DP | 20 | No | H/H
9 | HC | Sh DP | 20 | Yes | R/H
10 | HC | Sh DP | 20 | Yes | R/L
11 | HC | Sh DP | 0 | Yes | R/H
12 | HC | DP | 20 | Yes | R/H

HC is heating and cooling with unlimited power. Sh is shading device and DP is double pane. H and L are high and low thermal mass, respectively. R is a non-adiabatic roof. Initial tests 1–4 are informative, while tests 5–12 are normative.

Computational physics models in general, and BEM in particular, are traditionally composed of a hierarchical system of sub-models operating on multiple physics and with different levels of detail (scales) [47–50]. For example, the BEM envelope model may contain modules for walls, roof and windows, and modules for handling different physics, i.e. conduction or radiation heat transfer. In traditional BEM software (e.g. ESP-r, [48]), each sub-model contains its own data structure, model equations and an optimized numerical solver.

The verification of hierarchical system models must include separate verification of their sub-models, but also verification at the system level [47]. The level of detail, i.e. resolution, may differ between two sub-models in terms of geometric detail (spatial resolution) or time step (temporal resolution). Numerical models use a geometrical mesh. The mesh size and the location of mesh nodes should ideally match at the interface between two coupled models. Similarly, it is preferred that the time step sequences, i.e. the temporal meshes, match at the interface between models. When spatial or temporal mismatch occurs at the interface, this may lead to numerical errors, which may propagate through the model hierarchy to increase model predictive uncertainty [47]. For this reason, spatial alignment and temporal synchronization are important issues. The coupling between sub-models could be either unidirectional or bidirectional. With bidirectional coupling, the model solution is obtained by convergence of the iterative solution process. The presence of systematic bias and uncertainties in model output may lead to failure of the iterative process to converge, or to convergence to inaccurate solutions [47]. To account for this kind of numerical issue, there must be a distinction made in the treatment of systematic and random uncertainties, as in the Sandia V&V framework (although their terms are epistemic and aleatory uncertainties, respectively) [30,31].

In the past two decades, object-oriented (OO) programming, where data is structured as interconnected objects and combined with equation-based general-purpose solvers, has been applied to BEM, for example in the IDA ICE and Modelica software [23,49,51,52]. The separation of the model data structure from the numerical solver, provided by the OO approach, can facilitate: (i) flexible development of the model, e.g. by easily adding objects to the data structure, or changing the data topology (how objects are interconnected); (ii) application of domain-independent solvers to the whole BEM system [49]. The OO data structure for buildings forms the basis of current BIM software (mainly for building geometry), defined in the IFC standard [14]. For this reason, the coupling of BIM geometry data to OO-based BEM software is more convenient compared to coupling to traditional BEM software [23,52].

Regardless of the selection of data structure and solver(s), the BEM must pass verification tests in order to provide evidence that the model equations are correctly solved. The verification tests shown in Table 1 were developed using traditional BEM software, where the BEM data structure and model inputs have to be transferred by the modeler from the test specification (e.g. from EN 15265) to the BEM software. Possibly, the verification test could be specified using the IFC standard data format for the building model and its inputs. This would allow testing of various solvers, and solving strategies, starting from exactly the same building model. As far as we have seen, this possibility has not yet been explored for verification of BEMs.

3.2. Comparison to validation experiments

In this subsection, the VEs used for validation of BEM are compiled in Table 3, and then compared against the benchmark CS- and Sandia V&V frameworks.

The VEs included in Table 3 were selected according to the following criteria: (i) the VE was performed within the past 30 years; (ii) the BEM was a white-box model (WBM) or a grey-box model (GBM), both based on physics theory, but where the GBM has its parameters (often an RC-network) estimated from experimental data by calibration [18–23,53]. This excludes black-box models (BBM), which are empirical models based solely on experimental data time-series; (iii) the QOI was either the building indoor temperature (Ti), varying in response to outdoor climate conditions (free-running system), the energy need for heating or cooling (HC) of indoor space to a preset level of temperature, the cooling power (P), or the RC-parameters for a GBM; (iv) the VE was performed outdoors, using either outdoor test cells (OTC) or full-scale test buildings (FSTB). The OTC usually consists of a small (ca 10 m2) single room building with only one of its walls exposed to the exterior climate, while the remaining parts of the OTC envelope were close to adiabatic [54]. All test buildings were unoccupied; climate conditions could be estimated based on the coordinates given for the test locations; data on geometry and thermal properties of the building are usually given in the references.

Despite the importance of the concept of VD in the Sandia V&V framework, the latter provides very little guidance on how to select the 2–3 model input variables needed for visualization of the validation domain in a 2D or 3D diagram. For BEM, this selection is problematic because it commonly contains a large number of model input variables
Table 3
Validation Experiments for the time period 1988–2018, and in the order of increasing latitude.
Project Location Reference Year Test object QOI UA type Estimates/Comments

Univ. La Réunion Island France, 21S/55E Mara et al. [71] 2001 OTC Ti DSA, ResA uD, E, statistical testing
City univ Hong Kong China, 22 N/114E Chan et al. [72] 2009 OTC Ti S(t) and D(t) /DSF
Univ. Kingsville USA, 27 N/98W Hernandez et al. [62] 2003 OTC HC E
Univ. Lisbon Portugal, 39 N/9W Mateus et al. [73,74] 2014/16 OTC Ti E /DSF
Princeton university USA, 40 N/74W Rabl [19] 1988 FSTB HC, RC GBM (RC-net), BBM (ARMA)
Univ. Genua Italy, 44 N/9E Bagnasco et al. [65] 2018 FSTB Ti, HC E, S(t) and D(t)
NRC, Ottawa Canada, 45 N/75W Laouadi et al. [75] 1999 FSTB Ti, Sol S(t) and D(t) /Atrium
EDF France, 45 N/1E Déqué et al. [53] 2000 FSTB Ti, HC E, WBM and GBM
ANR/BAYREB, INES France, 45 N/6E Rouchier et al. [76] 2018 FSTB Ti, RC CV GBM
IEA Task 34 Switzerland, 47 N/9E Manz et al. [69] 2006 2 OTC Ti MCA |E| is compared to uD + uinput
/annex 43 /EMPA Loutzenhiser et al. [67,68] 2007/09 OTC P MCA |E| is compared to uD + uinput
FFG /MPC-boxes Austria, 47 N/15E Nageler et al. [77] 2018 2 OTC; Ti E
CEC /PASSYS Germany, 48 N/9E Jensen [78,79], 1993/95 OTC Ti DSA, ResA overview of methodology
ETNA /EDF France, 48 N/2E Aude et al. [60] 2000 OTC Ti Adjoint SA Uncertainty limits given
ETNA /EDF France, 48 N/2E del Barrio et al. [55,56] 2003/04 2 OTC Ti DSA, ResA D(t), uD, uS, diagnosis
ETNA /EDF France, 48 N/2E Neymark et al. [63] 2005 2 OTC HC S(t) and D(t)
ETNA /EDF France, 48 N/2E Ramdani et al. [80] 2006 OTC Ti DSA System. & random UA
BESTLab /EDF France, 48 N/3E Bontemps et al. [81] 2013 OTC Ti DSA S(t) and D(t)
IEA ECB annex 58 Germany, 48 N/12E Rehab et al. [82] 2015 2 FSTB Ti ResA
IEA ECB annex 58 Germany, 48 N/12E Strachan et al. [64] 2016 FSTB Ti, HC E, uD/D = ±1.5% (for H)
IEA/SHC Task 12 UK, 52 N/0W Lomas et al. [66] 1997 3 OTC Ti, HC MCA, DSA uD, uinput, E
/ECBCS annex 21
CEC /PASSYS UK, 55 N/5W Palomo et al. [58] 1991 OTC Ti DSA, ResA E
CEC /PASSYS UK, 56 N/4W Strachan [40], 1993 12 sites [83] OTC Ti DSA overview
DTU/Denmark Lyngby, 56 N/13E Madsen et al. [84,85] 1995/00 FSTB Ti, RC ResA GBM
DTU/Denmark Risö, 56 N/12E Bacher et al. [86] 2011 FSTB Ti, RC ResA GBM
Aalborg university Denmark, 57 N/10E Dama et al. [87] 2017 OTC Ti, ϑ E, S(t) and D(t) /DSF
Umeå university Sweden, 63 N/20E Allard et al. [70] 2018 FSTB HC DSA E, systematic errors, GBM
DTU/Denmark Greenland, 67 N/54W Andersen et al. [88,89] 2013/14 FSTB Ti, RC ResA GBM

Outdoor test cell (OTC) facilities and full-scale test buildings (FSTB) were used in the VEs. Type of uncertainty analysis (UA): Differential sensitivity analysis (DSA),
Monte Carlo analysis (MCA), residual analysis (ResA), and cross-validation index (CV). Quantity of Interest (QOI) for prediction: Ti (◦ C) is the indoor temperature, HC
(MJ) is the energy need for heating or cooling of building space to a preset temperature, P (W) is the cooling power, Sol is the solar radiation, ϑ is the volumetric flow
rate, and RC is the BEM parameters R (resistance) and C (thermal capacity). GBM is grey-box model, and BBM is black-box model. DSF is double-skin façade.

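The UA-type column of Table 3 is dominated by DSA and MCA. To make the difference concrete, the sketch below applies both techniques to a toy two-input model: DSA perturbs one input at a time by its uncertainty and combines the individual sensitivities by root-sum-of-squares, while MCA varies all inputs randomly and simultaneously between runs and takes the standard deviation of the outputs. The toy model and the uncertainty values are our own illustration:

```python
import math
import random

def model(x):
    # Toy BEM surrogate: heating need as a mildly nonlinear function of
    # two inputs (e.g. a U-value and an air-change rate); illustration only.
    return 100.0 * x[0] + 20.0 * x[0] * x[1]

x0 = [0.3, 0.5]   # nominal input values
u = [0.03, 0.10]  # standard uncertainties of the inputs

# DSA: one-at-a-time perturbation of each input by its uncertainty
u_in_i = []
for i in range(len(x0)):
    xp = list(x0)
    xp[i] += u[i]
    u_in_i.append(model(xp) - model(x0))  # individual sensitivity u_in,i
u_in = math.sqrt(sum(s**2 for s in u_in_i))  # root-sum-of-squares total

# MCA: all inputs varied randomly and simultaneously between runs
random.seed(1)
runs = [model([random.gauss(m, s) for m, s in zip(x0, u)])
        for _ in range(20000)]
mean = sum(runs) / len(runs)
u_mca = math.sqrt(sum((r - mean)**2 for r in runs) / (len(runs) - 1))

# u_in ≈ 3.35; u_mca agrees closely here because the toy model is
# nearly linear and the inputs are sampled independently
print(round(u_in, 2), round(u_mca, 2))
```

For strongly correlated or strongly nonlinear inputs the two estimates would diverge, which is exactly the trade-off discussed in the text below the table: DSA resolves the individual contributions but neglects correlated effects, while MCA captures them but does not resolve the individual sources.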
(in the order of between 10² and 10⁴), including e.g. building geometry, building material properties, and outdoor climate boundary conditions. Most of those variables have very small individual effects upon the model prediction of QOI, while an aggregated variable (AgV) may have a combined major effect upon the QOI prediction. If 2–3 dominant AgV could be identified, then these could be used for visualization of the VD. For BEM, the dominant AgVs could potentially be identified based on insight into building physics. For example, the wall-to-window area ratio potentially has a major effect on the prediction of building HC.

For the VEs listed in Table 3, the concept of VD has not been applied. However, in Refs. [55,56] the dominant AgVs were identified using principal component analysis (PCA), combined with insights from building physics. Because of its correlation with climate, the latitude of the VE is a potential candidate for an AgV to describe the VD. We note from Table 3 that a majority of the VEs were located between latitude 40 N and 57 N.

In the benchmark CS- and Sandia V&V frameworks, UA includes quantification of E, uD, uinput and unum. For the VEs listed in Table 3, estimates of unum are missing (cf. also section 3.1), estimates of uD are sometimes provided, while estimates of E and uinput are often, but not always, given.

For most VEs in Table 3, uinput was estimated based on two different SA techniques: differential sensitivity analysis (DSA) or Monte Carlo analysis (MCA) [57–59]. In this terminology, SA includes both parametric studies, where input variables vary over larger ranges, and error analysis, where the changes have magnitudes corresponding to input uncertainties. DSA yields the individual sensitivity uin,i, which is the effect on the model prediction S of a change in the i:th input variable equal to its uncertainty ui. The total sensitivity, i.e. the effect on S from changes in all input variables, is then obtained by

uin = √(Σi uin,i²)    (9)

Correlated effects between input variables are neglected by using Eq. (9). Despite this, uin is often assumed to be a reasonable estimate of uinput. In contrast, MCA directly yields an estimate of uinput including correlated effects. However, MCA does not resolve the individual effects (uin,i), and therefore does not allow the identification of significant sources to uinput. MCA requires the estimation of the probability distribution for all individual input variables, and a large number of simulations. It requires that all inputs are varied randomly and simultaneously between simulations. In addition, there is the stochastic SA technique, which involves the simultaneous variation of all input variables for each time-step of one single simulation [57]. This yields all individual sensitivities as well as auto-covariance and cross-covariance functions between input variables. There is also the adjoint-code method of Ref. [60], which provides all input sensitivities in one single simulation, but this comes at the cost of a difficult implementation, which requires duplicated coding. For evaluation of model inadequacy, residual analysis (ResA) has been applied for the analysis of the statistical properties of residual (= E) time series data [58].

In the final part of this section, the VEs listed in Table 3 where QOI = HC are reviewed in the perspective of the benchmark CS- and Sandia-V&V frameworks as follows:

• There are several VEs (Moinard et al. [61], 1999; Déqué et al. [53], 2000; Hernandez et al. [62], 2003; Neymark et al. [63], 2005; Strachan et al. [64], 2016; Bagnasco et al. [65], 2018) where only the validation comparison error E is given, without any estimation of uinput or unum, and often without estimation of uD. One conclusion from these studies is that the relative magnitude of E commonly is between 10 and 30%.
• Rabl [19], in 1988, reviewed BBM and GBM (RC-network models) for prediction of energy use in buildings, and validated a BBM (ARMA type) using experimental data from a FSTB. There was a brief discussion of UA, pointing to difficulties of estimation of systematic
errors, which cause a bias that remains hidden. The issue of correct treatment of systematic errors is further treated in the Sandia V&V framework.
• Lomas et al. [66], in 1997, performed VEs using 24 different BEM software/modeler combinations. Two OTCs were used, with their south-facing façade window opening covered with either double-glazing or with an opaque layer. The OTCs were heated intermittently for a 10-day period. In terms of insulation, thermal mass, and window-to-floor area ratio, the test rooms were representative of typical lightweight rooms in UK houses. The uncertainty band (UB = k⋅√(uD² + uinput²), with k selected to give 99% confidence level) was estimated from the measurement uncertainty uD, and with uinput obtained from both MCA and DSA using one single BEM software. For example, for the OTC with double glazed window modeled using the SERI-RES v1.2 BEM software, the simulated heating demand was S = 85 MJ, and the measured data D = 90 MJ. The 99% uncertainty band UB = 14 MJ was asymmetrically positioned around D: [79 MJ, 93 MJ], as a result of MCA. In the CS-V&V framework, this means that E = S − D = −5 MJ, which lies within UB. However, in this study [66], there is no estimation of the numerical uncertainty unum, which is necessary for the calculation of uval, as required by CS-V&V. This means that in cases where E falls outside the uncertainty band, which was observed in 11 out of 24 simulation results, E is caused by both numerical solution and model errors, whose relative magnitudes are unknown. Finally, the average of the 24 simulation results was 75 MJ (±22 MJ, 95% confidence interval), which indicates the level of accuracy to be expected from BEM simulations in the 1990s.
• In two studies, Loutzenhiser et al. ([67], 2007; [68], 2009) performed VEs on twin OTCs for prediction of the cooling power P (W) required to maintain a constant interior temperature. Their evaluation methodology [69] included first the estimation of E, uD, and uinput, the latter quantity obtained by MCA. Second, the uncertainty ratio UR was calculated as:

UR = |E| / (k⋅uD + k⋅uinput)    (10)

where k = 1.96 for 95% confidence level. If UR ≤ 1, then the prediction S is within the uncertainty band of the VE. This methodology differed in two procedural steps from that given in the CS-V&V framework: (i) the numerical solution uncertainty unum was not included, and (ii) uncertainties are directly summed, not combined by the root-sum-of-squares method. The omission of unum in Eq. (10) leaves it unresolved to what extent E is caused by numerical solution error or by model error. In these two studies, the relative magnitude of E was between 4 and 14%, using four different BEM software. However, we note that these results cannot be directly compared to those of Lomas et al. [66], since the QOIs are different, P and HC, respectively.
• In a recent study by Allard et al. [70], the energy signature GBM was applied to predict the energy use for space heating of a full-scale residential building. The causes of the BEP gap were investigated using SA to estimate the systematic errors due to various sources. Similar to the Sandia V&V framework, the Allard study recognizes the need for separate treatment of systematic errors, while their general treatment of uncertainties deviates from that of both the CS- and Sandia V&V frameworks.

4. Case study on CEN ISO 13790 and 52016-1

In this case study, two BEM standard models were critically reviewed with regards to their validation and UA, and from the perspective of the benchmark CS- and Sandia- V&V frameworks.

In the EU Directive 2010/31/EU [4], on BEP, it is required that relevant European (CEN) standards are applied. Therefore, the BEM specified by the CEN ISO 52016-1 standard [90], issued in the year 2017, could potentially be applied for the prediction of the energy needed for heating or cooling of buildings within some EU member states. The CEN ISO 52016-1 standard replaces the previous CEN ISO 13790 standard [91] from the year 2008. These two standards include various types of models, among which we selected the hourly models of CEN ISO 52016–1 and CEN ISO 13790 for the case study, because of their conceptual and mathematical simplicity.

The “simple hourly method” of ISO 13790 describes a 5R1C thermal network model for a ventilated single thermal zone of the building, where solar radiation is admitted into the building through a glazed window [91,92]. This dynamic 5R1C network model, shown in Fig. 4, consists of five thermal resistances (5R) and one thermal capacitance (1C), the latter located in the opaque envelope element of the zone. The simplicity of the 5R1C model enables fast computation of the heating or cooling load, with an hourly resolution, over long periods of time (e.g. over several heating seasons).

Fig. 4. Simple hourly 5R1C single thermal zone model of ISO 13790. The temperature (T) node subscripts denote supplied air (sup), exterior air (e), interior air (i), interior surface (s), and opaque envelope mass (m). QHC is the supplied power for heating or cooling, and Qis is the flux from internal and solar heat sources, where the latter is split amongst three nodes.

The BEM of the “hourly calculation procedure” in ISO 52016–1 is also an RC thermal network model, but in this case each zone construction element (roof, floor, wall, window) is modeled separately, with 5 temperature nodes for opaque elements and 2 nodes for windows and doors. This adds significantly to model complexity when compared to the previous 5R1C model, but still allows high-speed computation in comparison to existing more detailed simulation models.

Table 4 compiles V&V studies performed on the hourly models of ISO 13790 and ISO 52016–1. It was concluded that these models had not been validated against measurement data, except in one case [93] for ISO 13790. In this latter case, the building parameters (R and C values) were not calculated as specified in the ISO 13790 standard (from building geometry and thermal properties), but instead estimated by calibration using experimental time series data. Within the benchmark CS- and Sandia- V&V frameworks, validation requires, by definition, comparison of simulated and measured QOI results, and also an UA where uD, uinput, and unum are estimated. These elements are largely absent in the works listed in Table 4.

The content of Table 4 indicates two additional issues, one terminological and one cultural, which have relevance for the development of standards for validation of BEM, in alignment with the benchmark V&V frameworks, and as part of the BIM initiative.

The terminological issue is about the definitions of the terms
Table 4 referenced back to previous scientific knowledge by using science


Verification and validation of CEN ISO standards for calculation of heating and citations.
cooling energy needs. We believe that the cultural differences between TC and SC are re­
ISO 13790: 2008 ISO 52016–1: 2017 flected by two features of the content of Table 4: First, the absence of
Simple hourly method Hourly calculation procedure validation studies for the hourly models of ISO 13790 and ISO 52016–1
Reference Comment Reference Comment (except for one study [93], where the model had been modified by
calibration) possibly indicate that there is no scientific consensus on
Verification
how to validate these models for prediction of the energy need for
ISO 13790, EN 15265 test ISO 52016–1, Based on BESTEST heating or cooling. Without scientific consensus, TC should be less likely
Appendix H.4 cases [43]; section 7.2.2.1. 600 and 900 test
[91], verification test case series, ANSI/
to reach consensus since academic professionals are often members of
cases [90]; ASHRAE 140 [96], TC. One possible reason for the absence of a consensus within the SC was
Kokogiannakis Comparison to Lundström, Comparison to IDA given by Ghiaus [95], who questioned that this kind of BEM could be
et al., 2008, ESP-r and Energy- 2019 [98], ICE. Modified RC- used for accurate predictions of the need for heating energy. The argu­
[97]. Plus. Building: 3 model, based on
ment was that this type of model does not respect the causality of the
stories; 336 m2. ISO 52016–1.
Millet, 2007, [99]. EN 15265 test Zakula, 2019 Comparison to physical system, and that this could lead to inaccurate predictions.
Science citations cases; [100], TRNSYS. 10 Second, the substitution of ISO 52016–1 for the previous ISO 13790
missing. Comparison to 5 reference was decided by the TC, with the claim that the new model provides a
software. buildings; Heating “more robust, transparent, easier and reliable treatment of dynamic
Atmaca et al., Comparison to load differed 11%
2011, [101]. Energy-Plus. on average; main
interactions” (ref. [94], §5.2.3). There were no data or no science cita­
Science citations Building: 65 m2. cause was window tions given to support this claim by the TC. Within the SC the model
missing. modelling substitution should have been motivated by giving experimental evi­
Kristensen et al., Sensitivity dence, which shows the inadequacy of the previous model. Connection
2016 [102], analysis
to previous scientific knowledge, by using science citations, is also
Michalak, 2017, Comparison to
[103]. Energy-Plus. mandatory within SC, while it is absent in these standards.
Building: 156 m2;
20 European 5. Conclusions
locations.

Validation Based on the present review, in this section conclusions are drawn,
Burhenne, 2008 Building: 436 m2 that have relevance for both the BIM initiative and for the SC, in their
[93], Model modified efforts to reduce the BEP gap.
by calibration
using measured
data.
5.1. Strengthening the BIM initiative

“verification” and “validation”. These terms are defined differently Based on the findings of the present review, three suggestions are
within the building sector compared to within the benchmark CS- and given in the following:
Sandia- V&V frameworks, and in current ASME and AIAA standards (see
section 2). The ISO 52016–1:2017 standard states that the model has 1. Standards on BEM should always include science citations referring
been “verified” (against theoretical test cases), while the previous ISO 13790:2008 claims that the model has been “validated” (against theoretical test cases). In the latter case, we would define this as a verification activity. In the former case, it may appear that the authors have changed their definitions of the V&V terms. However, the reason for the substitution of “verification” for “validation” is that the calculation procedures are fully described in ISO 52016-1, while in ISO 13790 there are different calculation options (explicitly explained in Ref. [94]). Obviously, in ISO 13790, the model is still considered to have been validated (although VEs are absent).

The cultural issue stems from the fact that two different communities are involved in V&V of BEMs: (i) The industrial standardization community, which forms a technical committee (TC) for the development of each new standard. The TC members are technical experts from industry and other stakeholders, often including academia. Standards are developed to facilitate industrial production and the exchange of goods and services on the market. The decision to develop a new standard is based on a need for standardization recognized by the actors on this market. The approval of a standard for publication requires consensus among the members of the TC. The strength of this procedure is that the approved standard rests on the full agreement of all partners participating in the TC. (ii) The scientific community (SC), whose mode of operation is characterized by: (1) obtaining empirical evidence to either support or falsify a theoretical model, i.e. to perform model validation; (2) consensus, e.g. on the validity of a BEM, is rarely an explicit objective, although it may emerge as the final result of efforts by individual members of the SC; and (3) newly published scientific results are always […]

[…] to works where the model V&V procedure is documented, including a description of relevant VEs. Science citations are necessary if the standard BEM is to be recognized as based on science. Science citations also enable scientific peer-review as a method for quality assurance. In addition, the inclusion of science citations may increase the interest of members of the SC in participating in the development of standards, since a high citation rate is one important benchmark in the assessment of scientific performance.

2. A standard vocabulary for V&V of BEM should be created. Within the area of measurement science (metrology), there exists a standard vocabulary [104], which plays an important role for the consistency of this domain of science. It is particularly important to clearly distinguish between verification and validation, since within building science there is considerable confusion and misconception about these terms. For example, if results from the simple hourly ISO 13790 model are compared to simulation results obtained from a reference simulation using a more detailed model, and this activity is classified as “validation”, then the user of the standard is misled into believing in something not in agreement with accepted practice in other important scientific domains. Another example is the misconception that BEM software tools could be validated once and for all. The BEM model of a particular building, including boundary conditions, can be the subject of validation. However, the BEM software only provides an environment for the design of BEMs, and cannot be the subject of a general validation of all BEMs that could potentially be created within this environment. Again, the BEM modeler can be misled into trusting the BEM software to deliver accurate model predictions, while in fact there may be no experimental grounds for this trust.
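The distinction drawn above can be made concrete with the uncertainty bookkeeping of the ASME V&V 20 framework [26]: validation compares a simulation value S against a measured value D, and the comparison error E = S − D is judged against a validation uncertainty that combines numerical, input, and measurement uncertainties. The sketch below is illustrative only; the function name and all numerical values are ours, not part of any standard.

```python
import math

def validation_comparison(S, D, u_num, u_input, u_D):
    """Compare a simulated value S with a measured value D in the style of
    the ASME V&V 20 framework: the comparison error E = S - D is judged
    against the validation uncertainty u_val, which combines the numerical,
    input, and measurement standard uncertainties in quadrature."""
    E = S - D
    u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)
    return E, u_val

# Invented example values: annual heating need in kWh/m2 from a BEM (S)
# versus a hypothetical validation experiment (D).
E, u_val = validation_comparison(S=55.0, D=50.0, u_num=0.5, u_input=2.0, u_D=1.5)
print(round(E, 2), round(u_val, 2))  # prints: 5.0 2.55
```

Because |E| exceeds u_val in this example, the discrepancy cannot be explained by the stated uncertainties alone and points to model form error. Note that no measured value D (nor its uncertainty u_D) enters a code-to-code comparison, which is one way to see why comparing a simplified model against a reference simulation is not, by itself, validation.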


3. The IFC standard should be applied to existing VEs, to document the 3D building geometry in terms of a unique IFC data structure. The BIM initiative has brought forward a vision of a common language for data sharing in the construction and facility management industries, realized in the IFC of the ISO 16739 standard. By documenting VEs using the IFC standard, potential errors in the coding of building geometry for V&V of BEM are eliminated. With the addition of supplementary information, such as data on thermal properties and boundary conditions, the VE documentation could further facilitate the validation of BEMs.

5.2. Scientific challenges

Based on conclusions from the present review, the following scientific challenges were identified:

1. There is no consensus within the SC on the procedure for V&V of BEM. This is indicated for BEM in general, but in particular for models intended for the prediction of the energy need for heating or cooling of the building space. This conclusion is based (i) on the review of existing validation studies (cf. Table 3), where several different validation methodologies were applied, and (ii) on the absence of validation studies for the hourly models of the ISO 13790 and ISO 52016-1 standards (cf. Table 4). Items are included in CEN or ISO standards based on a consensus vote. Therefore, the absence of an item may indicate that consensus was not reached on the inclusion of that item, although the item could have been omitted for other reasons. To achieve a higher degree of consensus on V&V procedures for BEM, one way forward would be to adopt parts of the CS- and Sandia V&V frameworks. Another approach towards consensus is to resolve issues where conflicting views hinder progress. For example, the following conflict issues exist for V&V of BEM: (i) the procedure for quantification of u_num, (ii) the procedure for propagation of systematic and random errors, (iii) the selection of a suitable validation metric, and (iv) the criteria for evaluation of VE results.

2. The concept of a VD has not yet been applied to BEM. Mapping of the VD is useful both in the planning of a VE, and later when judging if the model application point lies sufficiently close to existing VEs. The size of the prediction error could be estimated by regression methods applied on existing VEs forming the VD. Here, the important future research questions are: (i) How to identify the AgVs that span the VD? (ii) How do the QOI and the uncertainties change with distance from the VEs forming the VD?

3. Procedures for model order reduction should be optimized based on UA, where the uncertainty of the simulation is resolved into its components, as in the CS-V&V framework. Transfer of data from a 3D model of building geometry into a BEM often involves the reduction of the spatial complexity of the model, i.e. reduction of model order. The simplification of BEM by model order reduction techniques could potentially affect the magnitudes of all component errors of the simulation error δ_S, i.e. δ_model, δ_num, and δ_input, and thereby affect the accuracy of the model prediction.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

Funding: This work was supported by the program 2014–2020 INTERREG V-A Sweden-Finland-Norway (Nord) project ICNB (Increasing Competence in Northern Building), under workgroup 4: Energy efficiency management in model-based building information, and by the Kolarctic CBC project KO1089 Green Arctic Building (GrAB), under section A2 – Comparative analysis.

References

[1] IPCC. Climate change 2014. In: Pachauri RK, Meyer LA, editors. Synthesis report. Contribution of working groups I, II and III to the fifth assessment report of the intergovernmental panel on climate change. Geneva, Switzerland; 2014.
[2] Pérez-Lombard L, Ortiz J, Pout C. A review on buildings energy consumption information. Energy Build 2008;40:394–8.
[3] Isaac M, van Vuuren DP. Modeling global residential sector energy demand for heating and air conditioning in the context of climate change. Energy Pol 2009;37:507–21.
[4] Directive 2010/31/EU on the energy performance of buildings (recast). European Parliament and Council of the European Union; 2010.
[5] De Wilde P. The gap between predicted and measured energy performance of buildings: a framework for investigation. Autom ConStruct 2014;41:40–9.
[6] Tuohy PG, Murphy GB. Closing the gap in building performance: learning from BIM benchmark industries. Architect Sci Rev 2015;58:47–56.
[7] Rees SJ. Closing the performance gap through better building physics. Build Serv Eng Res Technol 2017;38:125–32.
[8] Kensek KM, Noble D. Building information modeling: BIM in current and future practice. Hoboken, NJ: Wiley; 2014.
[9] Eastman C, Teicholz P, Sacks R, Liston K. BIM handbook: a guide to building information modeling for owners, managers, designers, engineers, and contractors. Hoboken, NJ: Wiley; 2011.
[10] Miettinen R, Paavola S. Beyond the BIM utopia: approaches to the development and implementation of building information modeling. Autom ConStruct 2014;43:84–91.
[11] Volk R, Stengel J, Schultmann F. Building Information Modeling (BIM) for existing buildings - literature review and future needs. Autom ConStruct 2014;38:109–27.
[12] Menezes AC, Cripps A, Bouchlaghem D, Buswell R. Predicted vs. actual energy performance of non-domestic buildings: using post-occupancy evaluation data to reduce the performance gap. Appl Energy 2012;97:355–64.
[13] Johnston D, Miles-Shenton D, Farmer D. Quantifying the domestic building fabric ‘performance gap’. Build Serv Eng Res Technol 2015;36:614–27.
[14] ISO 16739. Industry Foundation Classes (IFC) for data sharing in the construction and the facilities management industries. Brussels: CEN; 2013.
[15] Gao H, Koch C, Wu Y. Building information modelling based building energy modelling: a review. Appl Energy 2019;238:320–43.
[16] Østergård T, Jensen RL, Maagaard SE. Building simulations supporting decision making in early design - a review. Renew Sustain Energy Rev 2016;61:187–201.
[17] Burhenne S, Tsvetkova O, Jacob D, Henze GP. Uncertainty quantification for combined building performance and cost-benefit analyses. Build Environ 2013;62:143–54.
[18] Foucquier A, Robert S, Suard F, Stéphan L, Jay A. State of the art in building modelling and energy performances prediction: a review. Renew Sustain Energy Rev 2013;23:272–88.
[19] Rabl A. Parameter estimation in buildings: methods for dynamic analysis of measured energy use. J Sol Energy Eng 1988;110:52–66.
[20] Coakley D, Raftery P, Keane M. A review of methods to match building energy simulation models to measured data. Renew Sustain Energy Rev 2014;37:123–41.
[21] Fumo N. A review on the basics of building energy estimation. Renew Sustain Energy Rev 2014;31:53–60.
[22] Harish VSKV, Kumar A. A review on modeling and simulation of building energy systems. Renew Sustain Energy Rev 2016;56:1272–92.
[23] Andriamamonjy A, Klein R, Saelens D. Automated grey box model implementation using BIM and Modelica. Energy Build 2019;188–189:209–25.
[24] ASME. Guide for the verification and validation in computational solid mechanics. American Society of Mechanical Engineers; 2006. ASME V&V 10-2006.
[25] AIAA. Guide for the verification and validation of computational fluid dynamics simulations. American Institute of Aeronautics and Astronautics; 1998. AIAA-G-077-1998.
[26] ASME. Standard for verification and validation in computational fluid dynamics and heat transfer. American Society of Mechanical Engineers; 2009. ASME V&V 20-2009.
[27] Oberkampf WL, Trucano TG. Verification and validation benchmarks. Nucl Eng Des 2008;238:716–43.
[28] Coleman HW, Steele WG. Experimentation, validation, and uncertainty analysis for engineers. third ed. Wiley; 2009.
[29] Coleman HW, Stern F. Uncertainties and CFD code validation. J Fluid Eng 1997;119:795–803.
[30] Oberkampf WL, Roy CJ. Verification and validation in scientific computing. Cambridge: Cambridge University Press; 2010.
[31] Roy CJ, Oberkampf WL. A comprehensive framework for verification, validation, and uncertainty quantification in scientific computing. Comput Methods Appl Mech Eng 2011;200:2131–44.
[32] Oberkampf WL, Trucano TG, Hirsch C. Verification, validation, and predictive capability in computational engineering and physics. Appl Mech Rev 2004;57:345–84.
[33] Oberkampf WL, Barone MF. Measures of agreement between computation and experiment: validation metrics. J Comput Phys 2006;217:5–36.


[34] JCGM 100. Evaluation of measurement data - guide to the expression of uncertainty in measurement (GUM 1995 with minor corrections). Geneva: BIPM, ISO; 2008.
[35] Roache PJ. Verification and validation in fluids engineering: some current issues. J Fluid Eng 2016;138:101205.
[36] Ferson S, Ginzburg LR. Different methods are needed to propagate ignorance and variability. Reliab Eng Syst Saf 1996;54:133–44.
[37] Crawley DB, Hand JW, Kummert M, Griffith BT. Contrasting the capabilities of building energy performance simulation programs. Build Environ 2008;43:661–73.
[38] Judkoff R, Wortman D, O’Doherty B, Burch J. A methodology for validating building energy analysis simulations. 2008. NREL/TP-550-42059. Originally published in 1983.
[39] Judkoff R, Neymark J. Model validation and testing: the methodological foundation of ASHRAE Standard 140. 2006. NREL/CP-550-40360.
[40] Strachan P. Model validation using the PASSYS test cells. Build Environ 1993;28:153–65.
[41] Strachan P, Kokogiannakis G, Macdonald IA. History and development of validation with the ESP-r simulation program. Build Environ 2008;43:601–9.
[42] Judkoff R, Neymark J. Twenty years on!: updating the IEA BESTEST building thermal fabric test cases for ASHRAE standard 140. 2013. NREL/CP-5500-58487.
[43] EN 15265. Energy performance of buildings - calculation of energy needs for space heating and cooling using dynamic methods - general criteria and validation procedures. Brussels: CEN; 2007.
[44] Bland BH. Conduction in dynamic thermal models: analytical tests for validation. Build Serv Eng Res Technol 1992;13:197–208.
[45] Rees SJ, Xiao D, Spitler JD. An analytical verification test suite for building fabric models in whole building energy simulation programs. ASHRAE Trans 2002;108(1):30–42.
[46] Judkoff R, Neymark J. International Energy Agency building energy simulation test (BESTEST) and diagnostic method. NREL/TP-472-6231. NREL; 1995.
[47] Stevens G, Atamturktur S. Mitigating error and uncertainty in partitioned analysis: a review of verification, calibration and validation methods for coupled simulations. Arch Comput Methods Eng 2017;24:557–71.
[48] Clarke JA. Domain integration in building simulation. Energy Build 2001;33:303–8.
[49] Sahlin P, Eriksson L, Grozman P, Johnsson H, Shapovalov A, Voulle M. Whole-building simulation with symbolic DAE equations and general purpose solvers. Build Environ 2004;39:949–58.
[50] Wetter M. Modelica-based modelling and simulation to support research and development in building energy and control systems. J Build Perform Simul 2009;2:143–61.
[51] Sodja A, Zupancic B. Modelling thermal processes in buildings using an object-oriented approach and Modelica. Simulat Model Pract Theor 2009;17:1143–59.
[52] Wetter M, Zuo W, Nouidui TS, Pang X. Modelica buildings library. J Build Perform Simul 2014;7:253–70.
[53] Déqué F, Ollivier F, Poblador A. Grey boxes used to represent buildings with a minimum number of geometric and thermal parameters. Energy Build 2000;31:29–35.
[54] Cattarin G, Causone F, Kindinis A, Pagliano L. Outdoor test cells for building envelope experimental characterisation - a literature review. Renew Sustain Energy Rev 2016;54:606–25.
[55] Palomo Del Barrio E, Guyon G. Theoretical basis for empirical model validation using parameters space analysis tools. Energy Build 2003;35:985–96.
[56] Palomo Del Barrio E, Guyon G. Application of parameters space analysis tools for empirical model validation. Energy Build 2004;36:23–33.
[57] Lomas KJ, Eppel H. Sensitivity analysis techniques for building thermal simulation programs. Energy Build 1992;19:21–44.
[58] Palomo E, Marco J, Madsen H. Methods to compare measurements and simulations. In: Conference on Building Simulation, IBPSA. Nice, France; 1991. p. 570–7.
[59] Tian W, Heo Y, de Wilde P, Li Z, Yan D, Park CS, et al. A review of uncertainty analysis in building energy assessment. Renew Sustain Energy Rev 2018;93:285–301.
[60] Aude P, Tabary L, Depecker P. Sensitivity analysis and validation of buildings’ thermal models using adjoint-code method. Energy Build 2000;31:267–83.
[61] Moinard S, Guyon G. Empirical validation of EDF ETNA and GENEC test-cell models. Report of Task 22. Building energy analysis tools. International Energy Agency; 1999.
[62] Hernandez M, Medina MA, Schruben DL. Verification of an energy balance approach to estimate indoor wall heat fluxes using transfer functions and simplified solar heat gain calculations. Math Comput Model 2003;37:235–43.
[63] Neymark J, Girault P, Guyon R, Judkoff R, LeBerre R, Ojalvo J, et al. The "ETNA BESTEST" empirical validation data set. In: Building Simulation 2005, Ninth International IBPSA Conference. Montréal, Canada; 2005. p. 839–46.
[64] Strachan P, Svehla K, Heusler I, Kersken M. Whole model empirical validation on a full-scale building. J Build Perform Simul 2016;9:331–50.
[65] Bagnasco A, Massucco S, Saviozzi M, Silvestro F, Vinci A. Design and validation of a detailed building thermal model considering occupancy and temperature sensors. In: IEEE 4th Int Forum on Res and Techn for Society and Industry (RTSI). Palermo, Italy; 2018. p. 8548424.
[66] Lomas KJ, Eppel H, Martin CJ, Bloomfield DP. Empirical validation of building energy simulation programs. Energy Build 1997;26:253–75.
[67] Loutzenhiser PG, Manz H, Felsmann C, Strachan PA, Maxwell GM. An empirical validation of modeling solar gain through a glazing unit with external and internal shading screens. Appl Therm Eng 2007;27:528–38.
[68] Loutzenhiser PG, Manz H, Moosberger S, Maxwell GM. An empirical validation of window solar gain models and the associated interactions. Int J Therm Sci 2009;48:85–95.
[69] Manz H, Loutzenhiser PG, Frank T, Strachan PA, Bundi R, Maxwell G. Series of experiments for empirical validation of solar gain modeling in building energy simulation codes - experimental setup, test cell characterization, specifications and uncertainty analysis. Build Environ 2006;41:1784–97.
[70] Allard I, Olofsson T, Nair G. Energy evaluation of residential buildings: performance gap analysis incorporating uncertainties in the evaluation methods. Build Simul 2018;11:725–37.
[71] Mara TA, Garde F, Boyer H, Mamode M. Empirical validation of the thermal model of a passive solar cell test. Energy Build 2001;33:589–99.
[72] Chan ALS, Chow TT, Fong KF, Lin Z. Investigation on energy performance of double skin facade in Hong Kong. Energy Build 2009;41:1135–42.
[73] Mateus NM, Pinto A, Carrilho da Graça G. Validation of EnergyPlus thermal simulation of a double skin naturally and mechanically ventilated test cell. Energy Build 2014;75:511–22.
[74] Oliveira Panão MJN, Santos CAP, Mateus NM, Carrilho da Graça G. Validation of a lumped RC model for thermal simulation of a double skin natural and mechanical ventilated test cell. Energy Build 2016;121:92–103.
[75] Laouadi A, Atif MR. Comparison between computed and field measured thermal parameters in an atrium building. Build Environ 1999;34:129–38.
[76] Rouchier S, Rabouille M, Oberlé P. Calibration of simplified building energy models for parameter estimation and forecasting: stochastic versus deterministic modelling. Build Environ 2018;134:181–90.
[77] Nageler P, Schweiger G, Pichler M, Brandl D, Mach T, Heimrath R, et al. Validation of dynamic building energy simulation tools based on a real test-box with thermally activated building systems (TABS). Energy Build 2018;168:42–55.
[78] Jensen SO. Empirical whole model validation case study: the PASSYS reference wall. In: Building Simulation 1993. IBPSA; 1993. p. 335–41.
[79] Jensen SO. Validation of building energy simulation programs: a methodology. Energy Build 1995;22:133–44.
[80] Ramdani N, Candau Y, Guyon G, Dalibart C. Sensitivity analysis of dynamic models to uncertainties in inputs data with time-varying variances. Technometrics 2006;48:74–87.
[81] Bontemps S, Kaemmerlen A, Blatman G, Mora L. Reliability of dynamic simulation models for building energy in the context of low-energy buildings. In: BS2013, 13th Conference of International Building Performance Simulation Association. Chambéry, France; 2013. p. 1952–9.
[82] Rehab I, André P. Energy performance characterization of the test case "twin house", Holzkirchen, based on TRNSYS simulation and grey box model. In: Building Simulation 2015, 14th Conference of International Building Performance Simulation Association. Hyderabad, India; 2015. p. 2401–8.
[83] Wouters P, Vandaele L, Voit P, Fisch N. The use of outdoor test cells for thermal and solar building research within the PASSYS project. Build Environ 1993;28:107–13.
[84] Madsen H, Holst J. Estimation of continuous-time models for the heat dynamics of a building. Energy Build 1995;22:67–79.
[85] Andersen KK, Madsen H, Hansen LH. Modelling the heat dynamics of a building using stochastic differential equations. Energy Build 2000;31:13–24.
[86] Bacher P, Madsen H. Identifying suitable models for the heat dynamics of buildings. Energy Build 2011;43:1511–22.
[87] Dama A, Angeli D, Kalyanova Larsen O. Naturally ventilated double-skin facade in modeling and experiments. Energy Build 2017;144:17–29.
[88] Andersen PD, Rode C, Madsen H. An arctic low-energy house as experimental setup for studies of heat dynamics of buildings. Frontiers Architect Res 2013;2:488–99.
[89] Andersen PD, Jiménez MJ, Madsen H, Rode C. Characterization of heat dynamics of an arctic low-energy house with floor heating. Build Simul 2014;7:595–614.
[90] ISO 52016-1. Energy performance of buildings - energy needs for heating and cooling, internal temperatures and sensible and latent heat loads - Part 1: calculation procedures. Brussels: CEN; 2017.
[91] ISO 13790. Energy performance of buildings - calculation of energy use for space heating and cooling. Brussels: CEN; 2008.
[92] Michalak P. The simple hourly method of EN ISO 13790 standard in Matlab/Simulink: a comparative study for the climatic conditions of Poland. Energy 2014;75:568–78.
[93] Burhenne S, Jacob D. Simulation models to optimize the energy consumption of buildings. In: Eighth International Conference for Enhanced Building Operations. Berlin, Germany; 2008.
[94] ISO/TR 52016-2. Energy performance of buildings - energy needs for heating and cooling, internal temperatures and sensible and latent heat loads - Part 2: explanation and justification of ISO 52016-1 and ISO 52017-1. Geneva: ISO; 2017.
[95] Ghiaus C. Causality issue in the heat balance method for calculating the design heating and cooling load. Energy 2013;50:292–301.
[96] ANSI/ASHRAE Standard 140. Standard method of test for the evaluation of building energy analysis computer programs. 2014.
[97] Kokogiannakis G, Strachan P, Clarke J. Comparison of the simplified methods of the ISO 13790 standard and detailed modelling programs in a regulatory context. J Build Perform Simul 2008;1:209–19.
[98] Lundström L, Akander J, Zambrano J. Development of a space heating model suitable for the automated model generation of existing multifamily buildings - a case study in Nordic climate. Energies 2019;12:485.
[99] Millet J-R. The simple hourly method of prEN 13790: a dynamic method for the future. In: Clima 2007 WellBeing Indoors. Helsinki, Finland; 2007.


[100] Zakula T, Bagaric M, Ferdelji N, Milovanovic B, Mudrinic S, Ritosa K. Comparison of dynamic simulations and the ISO 52016 standard for the assessment of building energy performance. Appl Energy 2019;254:113553.
[101] Atmaca M, Kalaycioglu E, Yilmaz Z. Evaluation of the heating & cooling energy demand of a case residential building by comparing the national calculation methodology of Turkey and EnergyPlus through thermal capacity calculations. Energy Systems Laboratory, Texas A&M University; 2011.
[102] Kristensen MH, Petersen S. Choosing the appropriate sensitivity analysis method for building energy model-based investigations. Energy Build 2016;130:166–76.
[103] Michalak P. The development and validation of the linear time varying Simulink-based model for the dynamic simulation of the thermal performance of buildings. Energy Build 2017;141:333–40.
[104] JCGM 200. International vocabulary of metrology - basic and general concepts and associated terms (VIM). 2012.
