
School of Industrial Engineering & Management - International University – VNU-HCM SIMULATION

Chapter 10

Verification and Validation of Simulation Models

Discrete-Event System Simulation


Purpose & Overview
• The goal of the validation process is:
– To produce a model that represents true system behavior closely enough for decision-making purposes
– To increase the model’s credibility to an acceptable level
• Validation is an integral part of model development:
– Verification: building the model correctly (correctly implemented, with good input and structure)
– Validation: building the correct model (an accurate representation of the real system)
• Most methods are informal, subjective comparisons, while a few are formal statistical procedures
Model Building, Verification & Validation

Fig. 10.1 Model Building, Verification and Validation
I. Model Building
1. Observe the real system and the interactions among its various components, and collect data on its behavior
(the components may include operators, technicians, engineers, supervisors, and managers)

2. Construct a conceptual model: a collection of assumptions about the components and structure of the system, plus hypotheses about the values of the model input parameters (illustrated in Fig. 10.1)

3. Translate the conceptual model into an operational model in a computer-recognizable form (the computerized model), e.g.:
https://caic-rt.shinyapps.io/CAIC-RT/?fbclid=IwAR029Ij4VTD8oZgLyLbnrPgzXgxzhjeMx5ATKBO3NaXuG5-ZJuJ4gZ7seNY
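
To make step 3 concrete, here is a minimal sketch (not from the slides) of an operational model: a single-server FIFO queue coded directly in Python. All names and parameter values are illustrative assumptions.

import random

def single_server_queue(mean_interarrival=1.0, mean_service=0.8,
                        n_customers=1000, seed=42):
    """Hand-coded operational model of a single-server FIFO queue."""
    rng = random.Random(seed)
    arrival = busy_until = total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(1.0 / mean_interarrival)  # next arrival time
        start = max(arrival, busy_until)         # wait if the server is busy
        service = rng.expovariate(1.0 / mean_service)
        busy_until = start + service             # server is tied up until here
        total_wait += start - arrival            # time this customer queued
    return total_wait / n_customers

print("average wait per customer:", single_server_queue())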

II. Verification of Simulation Models
• Purpose: ensure that the conceptual model is reflected accurately in the operational model (the computerized representation)
• The conceptual model quite often involves some degree of abstraction about system operations
• Many common-sense suggestions, for example:
– Have the operational model (the computerized representation) checked by someone other than its developer
– Make a flow diagram that includes each logically possible action a system can take when an event occurs
– Closely examine the model output for reasonableness under a variety of input-parameter settings (often overlooked!)
– Print the input parameters at the end of the simulation and make sure they have not been changed inadvertently (see the sketch after this list)
– …
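
As a concrete illustration of the parameter-echo suggestion above, here is a minimal sketch, assuming the inputs live in a plain dict; run_simulation is a hypothetical stand-in for the actual model.

import copy

params = {"mean_interarrival": 1.0, "mean_service": 0.8, "n_servers": 2}
snapshot = copy.deepcopy(params)      # record the values the run started with

def run_simulation(p):                # hypothetical stand-in for the model
    pass

run_simulation(params)
print("Input parameters at end of run:", params)
assert params == snapshot, "input parameters were changed during the run!"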

II. Verification of Simulation Models
• Many common-sense suggestions, for example: (Cont.)
– Make the computerized representation self-documenting
– If the computerized representation is animated, verify that what
is seen in the animation imitates the real system
– Graphical interfaces are recommended for accomplishing V&V
– …

II. Verification of Simulation Models
• Many common-sense suggestions, for example: (Cont.)
– Even the best simulation analysts make mistakes or commit logical errors when building a model, so an Interactive Run Controller (IRC) or debugger, an essential component of successful simulation model building, can be used
The IRC assists in finding and correcting those errors in the following ways (a minimal sketch follows the list):
a) The simulation can be monitored as it progresses
b) Attention can be focused on a particular line of logic, or on multiple lines of logic that constitute a procedure or a particular entity
c) Values of selected model components can be observed; when the simulation has paused, the current value or status of variables, attributes, queues, resources, counters, etc., can be observed
d) The simulation can be temporarily suspended, or paused, not only to view information but also to reassign values or redirect entities
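
The following is a minimal sketch, not a real IRC, of the pause-and-inspect idea in points a) to d): the model calls a controller after every event, and the controller can pause at a chosen time to display (and, in principle, reassign) model state. All names and values are illustrative.

class RunController:
    def __init__(self, pause_at):
        self.pause_at = pause_at

    def after_event(self, clock, state):
        if clock >= self.pause_at:
            # c) observe values of selected model components while paused
            print(f"[paused at t={clock:.2f}] queue={state['queue']} busy={state['busy']}")
            # d) values could be reassigned here before resuming
            self.pause_at = float("inf")   # resume without further pauses

state = {"queue": 0, "busy": False}
irc = RunController(pause_at=5.0)
for clock in [1.0, 3.5, 6.2, 8.0]:         # stand-in event times
    state["queue"] += 1                    # stand-in event logic
    irc.after_event(clock, state)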

Examination of Model Output for Reasonableness [Verification]
• Example: a model of a complex network of queues consisting of many service centers
– Response time is of primary interest; however, it is important to collect and print out many statistics in addition to response time
• Two statistics that give a quick indication of model reasonableness are current contents and total count, for example:
– If the current contents grow in a more or less linear fashion as the simulation run time increases, it is likely that a queue is unstable
– If the total count for some subsystem is zero, no items entered that subsystem, a highly suspect occurrence
– If the total count and the current count are both equal to one, this can indicate that an entity has captured a resource but never freed it
• Compute certain long-run measures of performance, e.g. compute the long-run server utilization and compare it to the simulation results (see the sketch below)
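
For instance, for an M/M/1 queue the long-run server utilization is rho = lambda/mu, which can be checked against the utilization observed in a run. A minimal sketch, reusing the single-server logic from earlier (parameter values are illustrative):

import random

def simulated_utilization(lam=1.0, mu=1.25, n_customers=100_000, seed=1):
    rng = random.Random(seed)
    arrival = busy_until = busy_time = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(lam)
        start = max(arrival, busy_until)
        service = rng.expovariate(mu)
        busy_until = start + service
        busy_time += service
    return busy_time / busy_until   # fraction of elapsed time the server was busy

lam, mu = 1.0, 1.25
print("analytic  rho:", lam / mu)                        # 0.8
print("simulated rho:", simulated_utilization(lam, mu))  # should be close to 0.8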

Other Important Tools [Verification]

• Documentation
– A means of clarifying the logic of a model and
verifying its completeness

• Use of a trace
– A detailed printout of the state of the simulation
model over time.
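
A minimal sketch of what a trace might look like; the event sequence and values are hypothetical, for illustration only:

TRACE = True

def trace(clock, event, queue_len, server_busy):
    if TRACE:
        print(f"t={clock:6.2f}  event={event:<9}  queue={queue_len}  busy={server_busy}")

trace(0.00, "arrival", 0, True)    # first customer goes straight into service
trace(1.73, "arrival", 1, True)    # second customer must queue
trace(2.10, "departure", 0, True)  # first service completes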

III. Calibration & Validation of Models
• Verification and validation, although conceptually distinct, are usually conducted simultaneously by the modeler
[Figure: the relationship of model calibration to the overall validation process]
Calibration and Validation of Models
• Validation: the overall process of comparing the model and its behavior to the real system (and its behavior)
• Calibration: the iterative process of comparing the model to the real system and making adjustments to the model, comparing again, and so on (see the sketch after Fig. 10.3)

Fig. 10.3 Iterative process of calibrating a model
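
A minimal sketch of the iterative loop in Fig. 10.3: compare a model output with the value observed on the real system and adjust an input parameter until they agree within a tolerance. model() and the observed value are hypothetical stand-ins for a full simulation run and real data.

def model(mean_service):
    return 10.0 * mean_service    # stand-in for a full simulation run

observed_response_time = 8.0      # hypothetical measurement from the real system
mean_service, step, tol = 1.0, 0.05, 0.01

for iteration in range(100):
    gap = model(mean_service) - observed_response_time
    if abs(gap) <= tol:
        break
    mean_service -= step if gap > 0 else -step   # nudge the parameter
print(f"calibrated mean_service = {mean_service:.2f} "
      f"after {iteration + 1} iterations")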
III. Calibration & Validation of Models
The comparison of the model to reality is carried out through a variety of tests:
• Subjective tests usually involve people who are knowledgeable about one or more aspects of the system making judgments about the model and its outputs
• Objective tests always require data on the system’s behavior plus the corresponding data produced by the model (see the statistical-test sketch below)
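
As one example of an objective test, a two-sample t test can compare response times measured on the real system with those produced by the model. A minimal sketch using scipy; the data arrays are hypothetical placeholders:

from scipy import stats

system_data = [4.2, 3.8, 5.1, 4.6, 4.9, 4.4]   # observed on the real system
model_data  = [4.5, 4.1, 4.8, 5.0, 4.3, 4.7]   # produced by the simulation

t_stat, p_value = stats.ttest_ind(system_data, model_data, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Reject H0: model and system means differ (model may be invalid)")
else:
    print("Fail to reject H0: no significant difference detected")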

Operational Validity Classification

                      Observable System                  Non-observable System
Subjective Approach   • Comparison Using Graphical       • Explore Model Behavior
                        Displays                         • Comparison to Other Models
                      • Explore Model Behavior
Objective Approach    • Comparison Using Statistical     • Comparison to Other Models
                        Tests and Procedures               Using Statistical Tests
HW8
1. Two bearing-replacement policies were compared; estimate the difference in the mean cost per bearing replacement between the two policies at the 95% confidence level (a sketch of the computation follows)
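
A minimal sketch of the kind of computation question 1 asks for, assuming n = 10 paired replications (common random numbers) and hypothetical placeholder costs; t_{0.025,9} = 2.262:

import statistics

cost_policy1 = [16.8, 17.2, 16.5, 17.0, 16.9, 17.1, 16.7, 17.3, 16.6, 17.0]
cost_policy2 = [18.1, 18.4, 17.9, 18.2, 18.0, 18.3, 17.8, 18.5, 18.1, 18.2]

diffs = [a - b for a, b in zip(cost_policy1, cost_policy2)]
n = len(diffs)
mean_d = statistics.mean(diffs)     # point estimate of the mean cost difference
sd_d = statistics.stdev(diffs)      # sample std. dev. of the paired differences
t_crit = 2.262                      # t_{0.025, n-1} for n = 10
half_width = t_crit * sd_d / n ** 0.5
print(f"95% CI for mean cost difference: {mean_d:.3f} +/- {half_width:.3f}")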

HW8
2. In simulation, what is:
• a Conceptual Model
• Calibration
• Verification
– Give one example of each
– List the steps involved
