WK2 - Steps of Simulation Study


SSCM4833
Discrete Event Simulation

STEPS IN SIMULATION STUDY
GROUP ASSESSMENT

• Refer to Figure 1 in the article “Introduction to Simulation” by Jerry Banks.
• Define and describe each step involved in a simulation study.
• Share your findings with the class.
STEPS IN SIMULATION STUDY
STEP 1: PROBLEM FORMULATION

▪ The problem must be clearly stated and understood by both sides: the simulation analyst and the policy maker.
▪ The policy maker must understand and agree with the formulation.
STEP 2: SETTING OF OBJECTIVES & PROJECT PLAN

▪ The objectives indicate the questions that are to be answered by the simulation study.
▪ The objectives will determine:
• The best system among several alternatives
• Investigation of the system's behaviour and performance
▪ Common objectives or goals:
• Increasing productivity in existing production systems
• Assisting decision-making on new facility investments
• Sizing process inventories
• Analysing material flow
• Sizing manpower
• Continuous improvement of the production process
STEP 2: SETTING OF OBJECTIVES & PROJECT PLAN

▪ The project plan will identify the scenarios to be investigated.
▪ The plan should indicate:
• The time requirements
• The personnel that will be used
• The stages in the investigation and the output at each stage
• Hardware and software requirements
• The cost of the study
STEP 3: MODEL CONCEPTUALIZATION

▪ Start with a simple model and build toward greater complexity.
▪ The ability to invent or formulate an idea/concept.
▪ Occurs at the beginning of a design activity.
▪ Draft the project with the listed features and requirements.
▪ It is not necessary to have a one-to-one mapping between the model and the real system; only the essence of the real system is needed.
▪ It is advisable to involve the model user, to enhance the quality of the model and to increase the user's confidence in it.
STEP 4: DATA COLLECTION - DEFINITION

• A process of collecting data from different sources.
• Gathering appropriate information related to the case study/project.
STEP 4: DATA COLLECTION - STEPS IN COLLECTING DATA

1. Clarify data collection goals.
2. Decide how to collect the data.
3. Collect the data.
4. Record and organize the data.
5. Analyze the data.
STEP 4: DATA COLLECTION - CLARIFY DATA COLLECTION GOALS

What problem are we trying to solve by collecting this data?

• Independent variables: the condition implemented to see if it will create change - a program, method, system, or other action.
• Dependent variables: what changes as a result of the independent variable - a behavior, outcome, or other condition.
STEP 4: DATA COLLECTION - TYPES OF DATA

• Primary: questionnaires, interviews, observation, experiments.
• Secondary: books, journals, newspapers; Internet, websites, online materials; CD/DVD.
STEP 4: DATA COLLECTION - CRITERIA IN COLLECTING DATA

• Pilot test
• Reliability of data
• Sample size
• Duration of sample
• Protocol needs
STEP 4: DATA COLLECTION - ORGANIZING THE DATA

• Graphs
• Statistics
• Visual inspection
• Qualitative summaries
STEP 4: DATA COLLECTION - ANALYZING THE DATA

• Quantitative: numbers analysed mathematically - rates/durations, test scores, numbers or percentages, satisfaction ratings.
• Qualitative: descriptions/quotes, interpretation of physical evidence, opinions/suggestions.
STEP 4: DATA COLLECTION - WHY ANALYZE?

• To show any significant changes in the variables.
• To show connections between factors that might affect the results.
• To provide credible evidence that the program is successful.
• Evaluation helps to improve the work.
• To implement successful methods and approaches.
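A minimal sketch of the quantitative side of this step, assuming the collected data are service times in minutes (the values below are illustrative placeholders, not measurements from the slides):

# Quantitative summary of collected data (illustrative values only).
import statistics

service_times = [4.2, 3.8, 5.1, 4.7, 6.0, 3.9, 4.4, 5.6, 4.1, 4.9]  # minutes, hypothetical

mean = statistics.mean(service_times)
stdev = statistics.stdev(service_times)    # sample standard deviation
median = statistics.median(service_times)

print(f"n = {len(service_times)}")
print(f"mean = {mean:.2f} min, stdev = {stdev:.2f} min, median = {median:.2f} min")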
STEP 5: MODEL TRANSLATION

▪ Translate the system into a computer model that can be used to generate experimental data (a minimal sketch follows the package list below).
▪ Criteria in choosing a simulation program:
• Preference - programming language
• Advances in computer hardware - graphical simulation packages
• Software cost
▪ Examples of simulation packages:
• Arena
• Java
• Simpak
• Simban
• C++
• Simul8
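As a hedged illustration of model translation, independent of the packages listed above, the sketch below hand-codes a next-event-time-advance model of a single-server queue in Python; the arrival rate, service rate, and run length are assumed for illustration only.

# Minimal next-event-time-advance model of a single-server queue (assumed parameters).
import random

random.seed(42)
ARRIVAL_RATE, SERVICE_RATE, HORIZON = 1.0, 1.25, 1000.0  # illustrative values

clock, queue_len, busy = 0.0, 0, False
next_arrival = random.expovariate(ARRIVAL_RATE)
next_departure = float("inf")
served, queue_area = 0, 0.0

while clock < HORIZON:
    t_next = min(next_arrival, next_departure)
    queue_area += queue_len * (t_next - clock)    # time-weighted queue length
    clock = t_next
    if next_arrival <= next_departure:            # arrival event
        if busy:
            queue_len += 1
        else:
            busy = True
            next_departure = clock + random.expovariate(SERVICE_RATE)
        next_arrival = clock + random.expovariate(ARRIVAL_RATE)
    else:                                         # departure event
        served += 1
        if queue_len > 0:
            queue_len -= 1
            next_departure = clock + random.expovariate(SERVICE_RATE)
        else:
            busy = False
            next_departure = float("inf")

print(f"customers served: {served}")
print(f"time-average queue length: {queue_area / clock:.2f}")

In practice the same logic would be expressed in one of the packages or languages listed above; the point is only that the conceptual model becomes executable code that generates experimental data.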
STEP 6: VERIFICATION

▪ The process of ensuring that the model design (conceptual model) has been transformed into a computer model with sufficient accuracy.
▪ Building the model right.
▪ Answers the questions:
• Is the model implemented correctly in the simulation software?
• Are the input parameters and logical structure of the model represented correctly?
STEP 6: VERIFICATION TECHNIQUES

Anti-bugging
• Consists of including additional checks and outputs in the model that may be used to capture bugs if they exist.
• Used to check the behaviour of the model.

Structured walk-through / one-step analysis
• Explaining the model to others makes the modeller focus on different aspects of the model, and can therefore uncover problems with its current implementation.

Simplified models
• Sometimes it is possible to reduce the model to its minimal behaviour.

Deterministic models
• The presence of random variables makes it hard for the modeller to reason about the behaviour of the model.
• Try replacing them with deterministic values (a sketch follows this list).
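A hedged sketch combining the anti-bugging and deterministic-model techniques on a hypothetical queue model (simulate_queue is an assumed toy function, not from the slides): with constant inter-arrival and service times the correct behaviour is known in advance, so assertions can flag an incorrect implementation.

# Verification sketch: deterministic inputs plus anti-bugging assertions.

def simulate_queue(interarrival, service, n_customers):
    """Toy single-server queue: returns each customer's waiting time."""
    waits, server_free_at, arrival = [], 0.0, 0.0
    for _ in range(n_customers):
        start = max(arrival, server_free_at)
        waits.append(start - arrival)
        # anti-bugging: a negative waiting time can only come from a bug
        assert waits[-1] >= 0, "negative waiting time indicates a bug"
        server_free_at = start + service
        arrival += interarrival
    return waits

# Deterministic check: if arrivals (every 10 time units) are slower than
# service (6 time units), the server is always free, so every wait must be zero.
waits = simulate_queue(interarrival=10.0, service=6.0, n_customers=100)
assert all(w == 0.0 for w in waits), "model fails the deterministic check"
print("deterministic verification check passed")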
STEP 7: VALIDATION

▪ The process of ensuring that the model is sufficiently accurate for the purpose at hand.
▪ Building the right model.
▪ Involves an iterative process.

Steps in validation:
1. Build a model that has high face validity - the created model is reasonable and understandable to the policy makers.
2. Validate the model assumptions - how the system works; data assumptions.
3. Compare the model's input-output transformation with that of the real system - hypothesis testing (a sketch follows).
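As a hedged sketch of the input-output comparison, a two-sample t-test can compare a performance measure observed on the real system with the same measure produced by the model; the throughput figures below are illustrative placeholders, and SciPy is assumed to be available.

# Validation sketch: hypothesis test on the model's input-output transformation.
from scipy import stats

real_system_throughput = [48, 52, 47, 50, 51, 49, 53, 48]   # observed daily output (placeholder)
model_throughput       = [50, 49, 51, 52, 47, 50, 48, 51]   # model replications (placeholder)

t_stat, p_value = stats.ttest_ind(real_system_throughput, model_throughput)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# Failing to reject H0 supports, but does not prove, the model's validity.
if p_value < 0.05:
    print("significant difference: revisit the model assumptions")
else:
    print("no significant difference detected at the 5% level")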
STEP 9: EXPERIMENTAL DESIGN

▪ Determine the alternatives that are to be simulated.
▪ The goal: to get maximum information with a minimum number of experiments.
▪ Common mistakes in experimentation (a factorial-design sketch follows this list):
• The variation due to experimental error is ignored
• Important parameters are not controlled
• Effects of different factors are not isolated
• Simple one-factor-at-a-time designs are used
• Interactions are ignored
• Too many experiments are conducted
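A hedged sketch of a small full factorial design that avoids the one-factor-at-a-time trap: every combination of two hypothetical factors (number of servers and buffer size, with assumed levels) is enumerated, and each combination receives several replications.

# Experimental-design sketch: 2 x 3 full factorial with replications (hypothetical factors).
import itertools

num_servers = [1, 2]          # factor A, assumed levels
buffer_size = [5, 10, 20]     # factor B, assumed levels
replications = 3

design = []
for servers, buffer in itertools.product(num_servers, buffer_size):
    for rep in range(replications):
        design.append({"servers": servers, "buffer": buffer, "replication": rep})

print(f"{len(design)} runs planned")   # 2 * 3 * 3 = 18
for run in design[:4]:
    print(run)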
STEP 10: PRODUCTION RUN & ANALYSIS

▪ Estimate measures of performance for the system designs that are being simulated (see the sketch below).
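A hedged sketch of such an estimate, assuming the performance measure is average time in system and that the replication results below are placeholders: a 95% confidence interval is formed from independent replications using SciPy's t distribution.

# Production-run analysis sketch: confidence interval from independent replications.
import statistics
from scipy import stats

avg_time_in_system = [12.4, 13.1, 11.8, 12.9, 12.2, 13.4, 12.7, 12.0]  # one value per replication

n = len(avg_time_in_system)
mean = statistics.mean(avg_time_in_system)
half_width = stats.t.ppf(0.975, n - 1) * statistics.stdev(avg_time_in_system) / n ** 0.5

print(f"point estimate: {mean:.2f}")
print(f"95% CI: ({mean - half_width:.2f}, {mean + half_width:.2f})")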
STEP 11: DOCUMENTATION & REPORTING

The importance of program documentation:
• The program can serve as a reference in the future for the same or a different analyst.
• Helps users understand how the program operates.
• Makes it easier for the user to change parameters in order to learn the relationships between input parameters and output measures of performance.
• Needed if the program has to be modified in the future.
• Creates confidence in the program, so that users can make decisions based on the analysis.
STEP 11: DOCUMENTATION & REPORTING

The importance of progress documentation:
• Provides a written history of the simulation project.
• Provides a comprehensive record of accomplishments, change requests, key decisions, and other important issues.
• Gives a chronology of the work done and the decisions made.
• Keeps analysts and policy makers aware of any issues.
STEP 12: IMPLEMENTATION

▪ The success of the implementation phase depends on how well the previous steps have been performed.
CONCLUSION

▪ Efficiency is a critical point for a simulation program.
▪ Simulation is valuable and should be part of every manufacturing system design process.
▪ Simulation makes you think about a stochastic world, which is reality.
▪ Too often, simulation is done only when a problem is discovered with a system that is already installed and hard to change.
