
Rak-43.3520 Fire simulation
Lecture 9, 16.3.2016
Simo Hostikka
Contents

• Exercise 7 feedback
• Exercise 8 feedback
• Verification and validation (V&V)
• Exercise 9
Verification and validation (V&V)

• Convergence and validity
• Quality assurance
• Definition of verification
• Verification process and repository
• Definition of validation
• Validation process and repository
• Simulation uncertainty
• Experimental uncertainty
• Estimation of model uncertainty
• Application of model uncertainty

3/16/2016
3
Convergence and validity

A numerical method should converge toward the "exact solution" as
the numerical discretization is refined. The fundamental theorem
of numerical analysis says that convergence requires
1. consistency (the numerical equation approaches the original
   differential equation as Δx → 0 and Δt → 0), and
2. stability (the numerical solution remains bounded at all times).

The validity of a simulation code consists of many other aspects
besides convergence, such as:
data, default values, documentation, coding errors,
constitutive models, …
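The two requirements can be illustrated with a minimal sketch (not part of FDS): forward Euler applied to du/dt = -u is consistent and, for small enough Δt, stable, so the error at t = 1 shrinks as the time step is refined.

```python
import math

def euler_error(dt):
    """Integrate du/dt = -u, u(0) = 1, up to t = 1 with forward Euler
    and return the absolute error against the exact solution exp(-1)."""
    u = 1.0
    for _ in range(round(1.0 / dt)):
        u += dt * (-u)
    return abs(u - math.exp(-1.0))

# Refining the time step: a consistent, stable scheme converges,
# here at first order (the error roughly halves with the step).
errors = [euler_error(dt) for dt in (0.1, 0.05, 0.025)]
print(errors)
```

For a first-order scheme the error ratio between successive refinements approaches 2; an unstable scheme (e.g. Δt too large for a stiff problem) would instead blow up no matter how consistent it is.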

FDS Quality Assurance Procedure

• Quality assurance includes
  • Description of the development process, incl. decision making
  • Version management of the source code (Git)
  • Continuous control of errors (verification)
  • Model uncertainty estimation (validation)
  • Reporting and tracking of code errors and feature requests
    (issue tracker)
  • Version control of documentation

The FDS quality assurance procedure is described in
FDS Tech. Ref. Guide, Vol. 4: Configuration Management.

Verification and validation definitions

Available in several sources, including:
• ISO 16730:2008: Fire safety engineering -- Assessment,
  verification and validation of calculation methods
• ASTM E1355-12: Standard Guide for Evaluating the Predictive
  Capabilities of Deterministic Fire Models, American Society for
  Testing and Materials, 2012.

Verification definition

• ASTM E1355: "The process of determining that the implementation
  of a calculation method accurately represents the developer's
  conceptual description of the calculation method and the solution
  to the calculation method."
• Making sure that the equations and calculation procedures are
  implemented as they are documented.
• Are we solving the problem right?
• Is our math correct?
• Avoiding coding errors

Verification process

Program development stages:
• Local cycle of {code change, compile, test}
  • Development of new models may require new tests.
• Sending (committing) the change to the public repository (GitHub)
• Firebot runs on the NIST server:
  • Compilation
  • Verification tests
  • Generation of figures
  • Compilation of documentation
  • Loop of error reporting and fixing

Verification suite

• Verification tests are fast (1 s … 1 h CPU time) cases that have
  some sort of "exact solution":
  • Analytical solution
  • Result from another method
  • Trivial result from e.g. mass or energy conservation
• Currently about 650 cases.
• These cases do not represent typical applications, and they
  should not be used to learn the correct use of FDS.

Types of verification tests

• The Basic Flow Solver


• Thermal Radiation
• Species and Combustion
• Heat Conduction
• Pyrolysis
• Lagrangian Particles
• HVAC
• Unstructured Geometry
• Outputs

Example

Definition of validation

• ASTM E1355: "The process of determining the degree to which a
  calculation method is an accurate representation of the real world
  from the perspective of the intended uses of the calculation
  method."
• A process for determining how well the model reproduces the
  phenomena we want to simulate.
• Are we solving the right problem?
• Is the physics right?
• Usually, validation is carried out by comparing against
  experimental results.

Levels of validation

Blind calculation: Only the basic information is provided.
Corresponds to the real application situation. Provides
information about the accuracy of the simulation process,
including the competence of the user (user effect). Model
uncertainty cannot be distinguished.

Specified calculation: Initial and boundary conditions are given,
but not the results.

Open calculation: Both input data and results are provided. Can
most effectively separate the model and user contributions.
Requires transparency and ethics so that the model is not
'fine-tuned' to the results.

Validation repository

• 51 experimental campaigns from various fire laboratories.
• About 1000 individual simulations, always run before the release
  of a new FDS version.
• The experimental data constitutes the world's largest collection
  of fire test data.

Sources of simulation uncertainty
Input uncertainty / parametric uncertainty
• Important
• Can be handled by statistical methods.
• Uncertainty type: aleatory.

Modelling uncertainty
• User effect

Model uncertainty
• Can be estimated through validation.
• Uncertainty type: epistemic.

Model uncertainty and experimental
uncertainty
The figure shows a comparison of experimental and simulated data.

Model uncertainty must be estimated in relation to the
experimental uncertainty.

Experimental uncertainty 1

Experimental uncertainty consists of measurement uncertainty and
the associated input uncertainty.

Example:
In a room fire, the gas temperature depends on the HRR through
the MQH correlation, ΔT ∝ Q̇^(2/3).

⇒ σ_T/ΔT ≈ (2/3) · σ_Q/Q̇

Thus, a typical HRR uncertainty (7.5 %) leads to about 5 % input
uncertainty in temperature predictions.
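The 7.5 % → 5 % step follows directly from the 2/3 exponent of the MQH correlation; an illustrative one-line check:

```python
# MQH correlation: hot gas layer temperature rise scales as Q**(2/3),
# so relative uncertainties propagate with the exponent:
#   sigma_T / dT = (2/3) * sigma_Q / Q
hrr_rel_uncertainty = 0.075                    # typical HRR uncertainty, 7.5 %
temp_rel_uncertainty = (2.0 / 3.0) * hrr_rel_uncertainty
print(round(temp_rel_uncertainty, 3))          # 0.05
```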

Experimental uncertainty 2

Metrics of error
Single point measure: e.g. the relative difference of the peak
values of the measured and predicted quantity.

Integral measures: e.g. the relative difference of the time
integrals (or norms) of the measured and predicted curves.

Audouin et al., Nuclear Eng. Design 241 (2011), 18-31.
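As a sketch of the two families of metrics (the exact definitions used by Audouin et al. differ in detail), one can compare a single-point peak value and a time integral; the temperature data below is hypothetical:

```python
def trapz(y, t):
    """Trapezoidal integral of samples y over times t."""
    return sum((t[i + 1] - t[i]) * (y[i + 1] + y[i]) / 2.0
               for i in range(len(t) - 1))

def peak_relative_error(model, exp):
    """Single-point measure: relative difference of the peak values."""
    return (max(model) - max(exp)) / max(exp)

def integral_relative_error(t, model, exp):
    """Integral measure: relative difference of the time integrals."""
    return (trapz(model, t) - trapz(exp, t)) / trapz(exp, t)

t     = [0, 1, 2, 3, 4]                  # time (min), hypothetical
exp_T = [20, 100, 180, 150, 120]         # measured temperature (deg C)
mod_T = [20, 110, 200, 160, 125]         # simulated temperature (deg C)
print(round(peak_relative_error(mod_T, exp_T), 3))        # 0.111
print(round(integral_relative_error(t, mod_T, exp_T), 3)) # 0.085
```

A peak measure is sensitive to a single moment in time, while an integral measure averages over the whole curve; a model can do well on one and poorly on the other.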

Estimation of model uncertainty
Assume that the experimental value E has a Gaussian distribution
around the true value q:

  E = q (1 + ε_E),  ε_E ~ N(0, σ̃_E²)

where σ̃_E is the relative standard deviation of the experiment.

The model result M is written as

  M = δ q (1 + ε_M),  ε_M ~ N(0, σ̃_M²)

where δ is the model bias and σ̃_M is the relative standard
deviation of the model.

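Given paired experimental and model values, δ and σ̃_M can be estimated roughly as follows. This is a simplified sketch of the log-normal treatment used in the FDS Validation Guide (second-order correction terms are dropped); the data and the σ̃_E value are hypothetical:

```python
import math

def model_uncertainty(measured, predicted, sigma_E):
    """Estimate model bias delta and relative std. dev. sigma_M from
    paired peak values. Simplified treatment: ln(M/E) is taken to have
    mean ln(delta) and variance sigma_M**2 + sigma_E**2."""
    logs = [math.log(m / e) for e, m in zip(measured, predicted)]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((x - mean) ** 2 for x in logs) / (n - 1)
    delta = math.exp(mean)                           # model bias
    sigma_M = math.sqrt(max(var - sigma_E**2, 0.0))  # model rel. std. dev.
    return delta, sigma_M

# Hypothetical peak temperatures (deg C) from three experiments:
measured  = [310.0, 480.0, 255.0]
predicted = [330.0, 500.0, 280.0]
delta, sigma_M = model_uncertainty(measured, predicted, sigma_E=0.02)
```

Note that σ̃_M can only be resolved down to the experimental uncertainty: if the scatter of M/E is smaller than σ̃_E, the model uncertainty is indistinguishable from zero.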
Example: Cable temperature 1

Example: Cable temperature 2

Critical temperature T_cr = 200 °C
Simulation: T = 175 °C
Q: What is the probability that T ≥ 200 °C?

We know the bias δ = 1.06 and the relative standard deviation
σ̃_M = 0.16.

The real cable temperature has a Gaussian distribution
(ambient 20 °C, simulated rise M = 175 − 20 = 155 °C):

  μ = 20 + M/δ = 20 + 155/1.06 = 166 °C
  σ = σ̃_M · M/δ = 0.16 × 155/1.06 = 23.4 °C

  ⇒ P(T > T_cr) = ½ erfc((T_cr − μ)/(σ√2))
                = ½ erfc((200 − 166)/(23.4·√2)) = 0.07
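The slide's arithmetic can be scripted with the standard library's math.erfc; a small sketch, with the bias correction applied to the temperature rise above ambient as above:

```python
import math

def p_exceed(T_sim, T_cr, delta=1.06, sigma_M=0.16, T_amb=20.0):
    """Probability that the true temperature exceeds T_cr, given the
    simulated value T_sim, model bias delta, and relative standard
    deviation sigma_M of the model."""
    rise = (T_sim - T_amb) / delta        # bias-corrected temperature rise
    mu = T_amb + rise                     # mean of the true temperature
    sigma = sigma_M * rise                # std. dev. of the true temperature
    return 0.5 * math.erfc((T_cr - mu) / (sigma * math.sqrt(2.0)))

print(round(p_exceed(175.0, 200.0), 2))   # 0.07
```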
Exercise 9: Using the validation metric

1. Choose one of the exercises we have had in the course that
   has a quantitative simulation result, such as a DEVC.
2. Find the model uncertainty parameters δ and σ̃_M from the
   validation guide that correspond to your choice.
3. Choose a fictitious critical value for the quantity you
   predicted.
4. Use the method shown on the previous slide to estimate
   the probability that your critical value is exceeded.

Report your choices and calculations.
