Model-Based

Software Testing

Dr. RAJIB MALL


Professor
Department Of Computer Science &
Engineering
IIT Kharagpur.
19-Jan-2011 1
Plan
• A few basic concepts
• Our work
 State Model-based system testing
 Model-based regression test case
selection
 Model-based test coverage analysis
• Future directions
19-Jan-2011 2
Model-Driven Development
(MDD)
• Shifts focus from writing code to developing models:
 Develop a platform-independent model (PIM)
 Select the specific target platform and refine into a platform-specific model (PSM)
 A tool generates the code.
"Design once, run anywhere"
• Model-based testing is gaining popularity in the context of MDD.
[Figure: the MDD process — Requirements (mostly text) → Analysis (PIM) → Low-level design (PSM) → Coding (code) → Testing (mostly text) → Deployment.]
19-Jan-2011 3
What is Model-Based
Testing?
• An explicit program model is used in
various testing activities:
 Design test cases
 Test coverage analysis
 Regression test selection
 Easy to automate.
• Models with widely varying
abstraction levels have been used.
19-Jan-2011 4
Traditional Testing
[Figure: the tester designs test cases from the specification (are the test cases correct w.r.t. the specification?) and runs them on the SUT, recording pass/fail.]
19-Jan-2011 5
Model-Based Testing
[Figure: from the model, a test oracle is derived and test cases are generated; the test cases are run against the implementation, and the results are compared with the oracle to report pass or fail.]
19-Jan-2011 6
Are The Two Approaches
Entirely Different?
• Not really!
 Good testers unconsciously construct
a mental model of the software they
test.
 For example, path coverage with
respect to a model.
• MBT just requires the model to be
formally defined.
19-Jan-2011 8
Activities in Model-Based
Testing
• Model-Definition
• Model construction
• Generation of tests
• Execution
• Evaluation of results
19-Jan-2011 9
Why Model-Based Testing?
… let's check how to test a calculator …
19-Jan-2011 10
Let's Start with the Simplest…
[State model with two states, 1. NotRunning and 2. Running; Start moves from NotRunning to Running and Stop moves back.]
…And Get a Bit More Complex
[State model with four states: 1. NotRunning Standard, 2. Running Standard, 3. Running Scientific, 4. NotRunning Scientific; Start and Stop move between the NotRunning and Running states, while SelectStandard and SelectScientific switch between the Standard and Scientific modes.]
And Even More Complex
[State model with six states: 1. NotRunning Standard Empty, 2. Running Standard Empty, 3. Running Scientific Empty, 4. Running Standard NonEmpty, 5. NotRunning Scientific Empty, 6. Running Scientific NonEmpty; transitions on Start, Stop, SelectStandard, SelectScientific, EnterNumber, and ClearDisplay.]
Abstraction Levels of Models
Used in Testing
• Starting with the informal
requirements specification:
 Models with increasing levels of
detail are constructed and used.
• A lower-level model:
 Implements the higher-level model.
19-Jan-2011 14
Abstraction Levels of
Models Used in Testing
• Each type of model has its own
advantages and disadvantages.
 When used for testing applications.
• Testing based on intermediate
models:
 Called grey-box testing.
 In between black-box and white-box
testing.
19-Jan-2011 15
Modeling Maturity Levels
• Level 0: No specification
• Level 1: Textual (very informal)
• Level 2: Text with Diagram (textual
spec is augmented by diagrams)
• Level 3: Models with text
• Level 4: Precise models
• Level 5: Models only (model-to-code)
19-Jan-2011 16
Model Types Used in Testing
• Two broad types of models have been used:
 Structural models
 Behavioral models
19-Jan-2011 17
Structural Models
• Dependency-based:
 Data dependency
 Control dependency
• Flow-based:
 Control flow
 Data flow
• Component relation-based:
 Association
 Inheritance
 Polymorphism
 Aggregation/composition
19-Jan-2011 18
Behavioral Models
• State models:
 FSM, Statecharts
 Decision tables, decision trees
 Simulink/Stateflow (SL/SF)
 Grammars
 Petri nets, Markov chains, etc.
• UML models:
 Sequence diagrams
 Activity diagrams, etc.
19-Jan-2011 19
Model-Based Testing versus
Code-Based Testing
• Which one would help achieve
a more thorough testing?
• By how much?

• Why?

• Complementary approaches!
19-Jan-2011 20
Why Model-Based Testing?
• More efficient:
 Efficient algorithms for graph
processing exist as compared to text
processing.
 Design of test cases, test coverage
analysis, regression test suite selection,
etc.
• Model-based test cases remain valid
upon code changes.
19-Jan-2011 21
Why Model-Based Testing?
Cont…

• Some aspects of program


behavior:
 Very difficult to test based on
code alone.
• Early exposure of ambiguities in
specification and design.

19-Jan-2011 22
Model-Based Testing: Cons
• Complexity of model construction
• Code changes can make models
outdated.
 Implicit assumption: full traceability between
model and code.
• Detailed behavior cannot be tested
using models:
 Models are abstractions after all.
19-Jan-2011 23
Example Program Aspects Difficult to Test From Code
• States and transitions:
 Classes often have non-trivial state models.
 Difficult to extract a state model from code analysis.
[Figure: clock FSM with states NOT RUNNING ANALOG, NOT RUNNING DIGITAL, RUNNING ANALOG, and RUNNING DIGITAL, and transitions on Start, Stop, Analog, and Digital.]
• Message paths:
 Several message path sequences are possible.
 Only a few may practically be meaningful.
19-Jan-2011 24
Model-Based Testing Especially
Important for OO Programs
• Methods tend to be small and well
understood.
• Complexity tends to move from
intra-class to inter-class aspects.
 Interaction between methods (data-flow)
 Class relationships (association, inheritance, etc.)
 Object states and transitions
 Exception behavior
19-Jan-2011 25
Overall Approach of Model-Based Testing
Main steps:
1) Model the SUT.
2) Generate abstract tests from the model.
3) Concretize the abstract tests to make them executable.
4) Execute the tests.
5) Analyze the test results.
[Figure: requirements and the test plan feed the model (1. Model); a test case generator produces abstract test cases, a requirements traceability matrix, and model coverage information (2. Generate); a test script generator concretizes them into test scripts (3. Concretize); an adaptor and a test execution tool run the scripts on the system under test (4. Execute); the test results are then analyzed (5. Analyze).]
19-Jan-2011 26
The Model-Based Testing Process
[Figure: 1. Create a model from the requirements; 2. Generate abstract, platform-independent test cases (test scenarios) guided by a model coverage criterion; 3. Generate concrete, platform-specific test cases and test scripts using the test oracle; 4. Run the test scripts on the SUT; 5. Analyze the results and produce a test report.]
19-Jan-2011 27
Grammars
• Grammars are equivalent to state
machines.
• For certain systems:
 Grammars are more compact
• Generating tests and defining
coverage criteria:
 Not straightforward at present.
19-Jan-2011 28
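As an illustration of grammar-based test input generation, the sketch below (an assumption, not taken from the slides) randomly expands a toy expression grammar for a calculator; the grammar rules, the symbol names, and the depth bound are all illustrative choices.

#include <cstddef>
#include <iostream>
#include <map>
#include <random>
#include <string>
#include <vector>

using Grammar = std::map<std::string, std::vector<std::vector<std::string>>>;

std::mt19937 rng(42);  // fixed seed so the generated inputs are reproducible

// Expand a symbol by recursively choosing one of its alternatives; symbols
// without a rule are treated as terminals.
std::string expand(const std::string& symbol, const Grammar& rules, int depth = 0) {
    auto it = rules.find(symbol);
    if (it == rules.end()) return symbol;          // terminal symbol
    const auto& alts = it->second;
    std::size_t choice = 0;                        // default: first alternative
    if (depth < 6) {                               // depth bound guarantees termination
        std::uniform_int_distribution<std::size_t> pick(0, alts.size() - 1);
        choice = pick(rng);
    }
    std::string result;
    for (const auto& s : alts[choice]) result += expand(s, rules, depth + 1);
    return result;
}

int main() {
    // toy grammar: Expr -> Num | Expr Op Num ; Op -> + | * ; Num -> 1 | 23
    Grammar rules = {
        {"Expr", {{"Num"}, {"Expr", "Op", "Num"}}},
        {"Op",   {{"+"}, {"*"}}},
        {"Num",  {{"1"}, {"23"}}},
    };
    for (int i = 0; i < 5; ++i)
        std::cout << expand("Expr", rules) << '\n';   // e.g. "23*1+23"
}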
Markov Chains
• Traditionally, software systems have
been modelled using:
 Discrete-parameter, finite-state Markov chains
• Structurally similar to FSMs:
 Can be thought of as probabilistic
automata
• Have been used to measure reliability
and MTTF:
 Not commonly used for generating tests.
19-Jan-2011 29
State Model-
Based Testing

19-Jan-2011 30
FSM Model of a Simple
Clock

19-Jan-2011 31
Operational Modes
•Two modes of operations:
System mode
•NOT RUNNING
•RUNNING
Setting mode
•ANALOG
•DIGITAL
19-Jan-2011 32
FSM Model for a Clock
[Figure: four states — NOT RUNNING ANALOG, NOT RUNNING DIGITAL, RUNNING ANALOG, and RUNNING DIGITAL; Start and Stop move between the NOT RUNNING and RUNNING states, while Analog and Digital switch between the analog and digital settings.]
19-Jan-2011 33
Test Scenarios
Random path:
•Start
•Analog
•Analog
•Analog
•Analog
•Analog
•Stop
Too many random paths exist!
19-Jan-2011 34
Coverage-Based Test
Scenario Generation
• State coverage
• Transition coverage
• Transition path coverage
• Guard coverage

19-Jan-2011 35
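A minimal sketch (not from the slides) of how transition coverage can drive scenario generation for the clock FSM above: for every transition, a scenario is built by taking a shortest event path from the initial state to the transition's source state and then firing the transition's event. The state and event names come from the FSM; the algorithm and all identifiers are illustrative assumptions.

#include <iostream>
#include <map>
#include <queue>
#include <string>
#include <utility>
#include <vector>

using Transition = std::pair<std::string, std::string>;  // (source state, event)
using Fsm = std::map<Transition, std::string>;           // maps to the next state

// Shortest event sequence from 'from' to 'to' in the FSM (breadth-first search).
std::vector<std::string> pathTo(const Fsm& fsm, const std::string& from,
                                const std::string& to) {
    std::map<std::string, std::vector<std::string>> events{{from, {}}};
    std::queue<std::string> frontier;
    frontier.push(from);
    while (!frontier.empty()) {
        std::string state = frontier.front();
        frontier.pop();
        if (state == to) return events[state];
        for (const auto& t : fsm) {
            if (t.first.first == state && !events.count(t.second)) {
                events[t.second] = events[state];
                events[t.second].push_back(t.first.second);
                frontier.push(t.second);
            }
        }
    }
    return {};  // 'to' unreachable (does not happen for this model)
}

int main() {
    // the clock FSM from the earlier slides: (state, event) -> next state
    Fsm fsm = {
        {{"NotRunningAnalog", "Start"}, "RunningAnalog"},
        {{"RunningAnalog", "Stop"}, "NotRunningAnalog"},
        {{"NotRunningDigital", "Start"}, "RunningDigital"},
        {{"RunningDigital", "Stop"}, "NotRunningDigital"},
        {{"RunningAnalog", "Digital"}, "RunningDigital"},
        {{"RunningDigital", "Analog"}, "RunningAnalog"},
        {{"NotRunningAnalog", "Digital"}, "NotRunningDigital"},
        {{"NotRunningDigital", "Analog"}, "NotRunningAnalog"},
    };
    const std::string initial = "NotRunningAnalog";

    // one scenario per transition: reach its source state, then fire its event
    for (const auto& t : fsm) {
        std::vector<std::string> scenario = pathTo(fsm, initial, t.first.first);
        scenario.push_back(t.first.second);
        std::cout << "scenario:";
        for (const auto& e : scenario) std::cout << ' ' << e;
        std::cout << "   (covers " << t.first.first << " --" << t.first.second
                  << "--> " << t.second << ")\n";
    }
}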
Importance of
Statecharts in Testing
•Heavily used to test:
Safety-critical systems
Networking Protocols
GUIs
OOD
19-Jan-2011 36
Features of Statecharts
•More expressive than FSMs.
• Generalization:
–Simplifies by Factorization
• Orthogonality:
–Simplifies by Segmentation
• History state:
–Memorize last visited substate.
19-Jan-2011 37
Statechart Example
[Figure: stack statechart with states Empty, Loaded, and Full; transitions push/store, pop[not empty]/delete, and pop[empty]/delete, with the error actions Pop-error("stack empty") and Push-error("stack full").]
19-Jan-2011 38
Example for Generalization
[Figure: a state machine over states A, B, and C with events E1 and E2 is redrawn with a generalized superstate, and a machine over A and substates B1 and B2 of B with events E1 and E3 is factored in the same way.]
19-Jan-2011 39
Example of Orthogonality
[Figure: a statechart S with orthogonal regions T and U; T contains substates X and Y (events E1 and E3), and U contains substates A, B, and Z (events E1, E2, and E4, with the guard [in Z]).]
19-Jan-2011 40
Example History State
[Figure: a washing machine statechart with substates Washing, Rinsing, and Drying and a history state H; opening the door moves to Wait, and closing the door resumes at the last visited substate via H.]
19-Jan-2011 41
Test Scenario Generation from State Models
[Figure: a graphical specification (behavior defined by complex, notation-specific semantics) is converted by a notation-specific translation into an intermediate specification; a general-purpose test generation technique then produces the test specifications (an explicit specification of the behavior under test).]
19-Jan-2011 42
Statechart Spec. and Test Case Formats
Statechart Specification Format:
(Initial_State, Event, Next_State, Actual_Action)

Test Case Format:
(State, Seq_of_Events, Expect_State, Expect_Action)
19-Jan-2011 43
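As a small illustration (states and events taken from the clock FSM above; the actions are assumptions, since the slides do not list them), one specification row and one derived test case could be:
Specification row: Initial_State = NOT RUNNING ANALOG, Event = Start, Next_State = RUNNING ANALOG, Actual_Action = start the analog clock.
Test case: State = NOT RUNNING ANALOG, Seq_of_Events = <Start, Digital, Stop>, Expect_State = NOT RUNNING DIGITAL, Expect_Action = stop the digital clock.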
Test Automation
• Test scripts can be generated:
 By simulating application of inputs
by the respective users.
• Nitty-gritty of test automation
tool:
 Scripts for simulated inputs can be
annotated on the arcs of the FSM
19-Jan-2011 44
Test Automation
• Procedure start() {
 // Simulation code to start the analog clock
 }
 Procedure analog() {
 // Simulation code to set analog
 }
 … … …
[Figure: the clock FSM (NOT RUNNING ANALOG, NOT RUNNING DIGITAL, RUNNING ANALOG, RUNNING DIGITAL) with each arc annotated with the procedure that simulates its event.]
19-Jan-2011 45
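A minimal sketch (an assumption, not the slides' tool) of how per-arc simulation scripts can drive a generated scenario: each event in the scenario is executed by invoking the script annotated on the corresponding FSM arc. The function and event names are illustrative.

#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Simulation stubs: in a real tool, each would inject the corresponding
// user input into the application under test.
void simulateStart()   { std::cout << "simulate: start the clock\n"; }
void simulateStop()    { std::cout << "simulate: stop the clock\n"; }
void simulateAnalog()  { std::cout << "simulate: select analog\n"; }
void simulateDigital() { std::cout << "simulate: select digital\n"; }

int main() {
    // each FSM arc (event) is annotated with the script that simulates it
    std::map<std::string, std::function<void()>> arcScript = {
        {"Start", simulateStart},
        {"Stop", simulateStop},
        {"Analog", simulateAnalog},
        {"Digital", simulateDigital},
    };

    // a generated test scenario is simply a sequence of events from the model
    std::vector<std::string> scenario = {"Start", "Digital", "Analog", "Stop"};
    for (const auto& event : scenario) arcScript.at(event)();
}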
Evaluation of Test Results
• This is the most difficult activity of
the testing process.
• Easy to check for crashes:
 How to otherwise check the correctness of
results?
• Easier when testing:
 Different versions of the same product
 Competing products
19-Jan-2011 46
Evaluation of Test Results
cont…
• In the absence of a good test oracle:
 Have to settle for plausibility checks.
 Tests are taken to have passed,
• If the values are within certain ranges or
pass certain consistency checks.
 For state-based models:
• Output is easier to verify if the classes
can be instrumented to output states.
19-Jan-2011 47
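A minimal sketch (assumed, not from the slides) of such a plausibility-check oracle: a test is taken to have passed if the observed value lies within an expected range and the instrumented object reports the expected abstract state. All names and values are illustrative.

#include <iostream>
#include <string>

// A test is taken to have passed if the observed value lies in the expected
// range and the instrumented object reports the expected abstract state.
bool plausible(double observed, double low, double high,
               const std::string& observedState, const std::string& expectedState) {
    return observed >= low && observed <= high && observedState == expectedState;
}

int main() {
    // illustrative values only
    bool pass = plausible(42.3, 40.0, 45.0, "RUNNING ANALOG", "RUNNING ANALOG");
    std::cout << (pass ? "pass (plausible)" : "fail") << '\n';
}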
Choosing a Model
• What is a suitable model for use in
testing?
 Depends on application characteristics.
• Grammars:
 Mathematical evaluation as in a
calculator.
 HTML files
• Statecharts:
 Systems having concurrent states.
19-Jan-2011 48
Model Use Guidance
Application Characteristics → Suggested Modeling Method
• Processes a formal language (e.g., a web browser processing HTML, a compiler) → Grammar
• State-rich systems (e.g., telephony systems) → Finite State Machines
• Parallel systems whose individual components can be modeled by state machines → Statecharts, Petri Nets
• Need to represent the conditions under which inputs cause a particular response/effect → Decision tables
19-Jan-2011 49
Model-Based
Regression Testing

19-Jan-2011 50
Need for Regression Testing
• Any system during use undergoes
frequent code changes.
 Corrective, Adaptive, and Perfective
changes.
• Regression testing needed after
every change:
 Ensures unchanged features
continue to work fine.
19-Jan-2011 51
Partitions of an Existing Test Suite
[Figure: an existing test suite partitioned into To (obsolete tests), Tu (redundant tests), and Tr (regression tests), with Tor as the optimized regression test suite.]
19-Jan-2011 52
Major Regression Testing Tasks
• Test revalidation (RTV):
 Check which tests remain valid
• Test selection (RTS):
 Identify tests that execute modified
portions.
• Test minimization (RTM):
 Remove redundant tests.
• Test prioritization (RTP):
 Prioritize tests based on certain criteria.
19-Jan-2011 53
Why Model-Based RTS?
• RTS approaches have traditionally been
code analysis-based:
 An inefficient and time-consuming
approach.
 The regression test suite is
often unsafe.
19-Jan-2011 54
Model-Based Test
Coverage Analysis

19-Jan-2011 55
Uses of Test Coverage
Analysis
• Identify test adequacy :
 Find areas of programs not
exercised by test cases
 Create additional test cases to
increase coverage
• Identify redundant test cases:
 Do not help to increase coverage.
19-Jan-2011 56
Coverage Metrics
• State Coverage
 Ratio of the number of states covered to the total number of
states in the given state model
• Event Coverage
 Ratio of the number of events covered to the total number of
events in the given state model
• Transition Coverage
 Ratio of the number of transitions exercised to the total number
of transitions in the given state model
• State-Event Coverage
 Ratio of the number of state-event pairs exercised to the number of
states multiplied by the number of events in the given state model
19-Jan-2011 57
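As a worked illustration (numbers chosen here, not from the slides): for the clock FSM above (4 states, 4 events, 8 transitions, 16 state-event pairs), the single scenario <Start, Digital, Analog, Stop> executed from NOT RUNNING ANALOG covers 3 of the 4 states (75% state coverage), all 4 events (100% event coverage), 4 of the 8 transitions (50% transition coverage), and 4 of the 16 state-event pairs (25% state-event coverage).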
Our Work on State-
Based System Testing
Student: Monalisa Sarma

19-Jan-2011 58
System State Model
Generation
• System state model:
 Extremely complex to construct manually.
 Usually has thousands or millions of
states and transitions.
 Rarely constructed by designers.
• What really is constructed during a
design process:
 State model of classes
19-Jan-2011 59
Can State-Model of
Classes be Used?
• State of a system is defined:
 In terms of states of individual objects.
• Arbitrary object state combinations:
 May not be feasible.
• Arbitrary use case invocation sequences
can be considered:
 To determine the feasible system states
and transitions.
19-Jan-2011 60
State Model Construction Through
Arbitrary Use Case Invocations
• A sequence diagram:
 Implements a use case
 Shows the sequence in which methods of
different objects are invoked.
• Class state model:
 Shows state transitions upon method
invocations.
• Can be used to find system states and
transitions.
19-Jan-2011 61
An Example System State Graph
[Figure: an example system state graph.]
19-Jan-2011 62
Our Work on Model-
Based Regression
Test Selection
Students: Mayank Mittal, K.S.
Vipin Kumar, Swarnendu Biswas

19-Jan-2011 63
Dependency Models for Programs
• Dependency models for procedures:
 Proposed by Ferrante et al. (1987) in
the context of program slicing (PDG).
 Extended to handle complete programs
(SDG) by Horwitz (1990).
 Extended to represent object-oriented
programs (ClDG) by Larsen and Harrold.
19-Jan-2011 64
System Dependence Graph (SDG)
void main()
{
    int i = 1; int sum = 0;
    while (i < 11) {
        sum = add(sum, i);
        i = add(i, 1);
    }
    printf("sum = %d\n", sum);
    printf("i = %d\n", i);
}
static int add(int a, int b)
{
    return (a + b);
}
[Figure: the SDG constructed for this program.]
19-Jan-2011 65
Models for Embedded Programs
• Embedded program models need to
represent additional information:
 Timing
 Criticality
 Concurrency, etc.
• These issues are associated with
threads.
 Requires representing control flow
information.
19-Jan-2011 66
Incorporation of Control
Flow in ClDG
• Representation of threads:
 Control flow edges are incorporated.
 Explicitly specify the ordering of
statements.
19-Jan-2011 67
Incorporation of Control Flow in ClDG
Sample C++ program

19-Jan-2011 68
Incorporation of Control Flow in ClDG
[Figure: ClDG for the Calculator class — class entry node CE1, method entry node E2, statement node S4, parameter nodes (a = a_in, b = b_in, c_out = c), and data and control dependence edges.]
19-Jan-2011 69
ClDG for the entire program
[Figure: ClDG for the entire program — entry nodes E6 and E2, statement nodes S7–S14, actual-parameter nodes (a_in = sum, b_in = i, sum = c_out, a_in = i, b_in = 1, i = c_out), formal-parameter nodes (a = a_in, b = b_in, c_out = c), and data and control dependence edges.]
19-Jan-2011 70
ClDG for the entire program augmented with control flow
[Figure: the same ClDG as above, with control flow edges added to explicitly order the statement nodes.]
19-Jan-2011 71
ClDG augmented with Exception Handling

19-Jan-2011 72
Augmenting ClDG with
Information from UML Models
• Representing method sequences.
• Augmenting ClDG with priority and
timing information.

• Representing state model.

19-Jan-2011 73
[Figure: ClDG fragment augmented with method-sequence edges (<mm1>) — method entry nodes E2, E14, and E24, statement and call nodes (S3–S12, S15–S22, S25, C7, C26), parameter nodes (y_in = a, x_in = x, x = x_out, y = y_in, x = x_in, x_out = x), normal and exceptional exit/return nodes, and control dependency, data dependency, control flow, and method sequence edges.]
19-Jan-2011 74
Augmenting ClDG with Priority
and Timing Information
•A thread is the basic unit of
concurrency.
•A method is selected to run as a thread.
•A method may invoke other methods:
•The entire sequence of methods runs as a
single thread.
•Priority information:
•Associated with the start method node of the
thread in the EClDG.
19-Jan-2011 75
Augmenting ClDG with Object
State Information

• Represented in the form of a state


transition table.

• State information is stored in the


class entry node of each class

19-Jan-2011 76
Augmenting ClDG with Object State Information
Finite state model for a Stack
[Figure: states Empty, Partially full, and Full; push with e < n takes Empty to Partially full, pop with e = 1 returns to Empty, push with e = n-1 takes Partially full to Full, pop with e > 1 returns Full to Partially full, and push with e < n-1 or pop with e > 1 keep the stack Partially full.]
19-Jan-2011 77
Augmenting ClDG with Object State Information
State transition table for the stack:

old-state        condition   operation   new-state
empty            e < n       push        partially-full
empty                        pop         ND
partially-full   e > 1       pop         partially-full
partially-full   e < n-1     push        partially-full
partially-full   e = 1       pop         empty
partially-full   e = n-1     push        full
full             e > 1       pop         partially-full
full                         push        ND

19-Jan-2011 78
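A minimal sketch (an assumption, not the slides' code) of a bounded stack instrumented to report its abstract state; the state() observer matches the state model and transition table above and is the kind of hook a state-based test oracle can use. The class and member names are illustrative.

#include <cassert>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

class Stack {
public:
    explicit Stack(std::size_t n) : n_(n) {}
    bool push(int v) {
        if (data_.size() == n_) return false;   // would raise Push-error("stack full")
        data_.push_back(v);                     // push/store
        return true;
    }
    bool pop() {
        if (data_.empty()) return false;        // would raise Pop-error("stack empty")
        data_.pop_back();                       // pop/delete
        return true;
    }
    // abstract state reported to a state-based test oracle
    std::string state() const {
        if (data_.empty()) return "empty";
        if (data_.size() == n_) return "full";
        return "partially-full";
    }
private:
    std::size_t n_;                             // capacity n of the state model
    std::vector<int> data_;                     // e = data_.size()
};

int main() {
    Stack s(2);
    assert(s.state() == "empty");
    s.push(1);              // empty --push[e < n]--> partially-full
    assert(s.state() == "partially-full");
    s.push(2);              // partially-full --push[e = n-1]--> full
    assert(s.state() == "full");
    s.pop();                // full --pop[e > 1]--> partially-full
    assert(s.state() == "partially-full");
    std::cout << "observed states match the transition table\n";
}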
Regression Test Selection
• Augmented models can be effectively
used to select test cases.
• How to identify relevant test cases?
• Steps:
 Construct ClDG models for the original
(P) and the modified (P’) program.
 Augment the models for P and P’ with
information available from design
models.
19-Jan-2011 79
Regression Test Selection
 Identify the changes between P and P’.
 Mark the changes on the model M' of the modified program.
 Generate trace information for the test
cases.
 Augment the model with the test trace
information.
 Slice M’ with each marked point of change
as the slicing criterion.
• Relevant test cases due to data and control
dependence edges are selected (TCD).
19-Jan-2011 80
Regression Test Selection
 Analyze the control flow graphs of the
modified tasks.
 Select regression test cases based on
control flow analysis (TCF).
 Final set of relevant test cases:

T = TCD ∪ TCF

19-Jan-2011 81
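A simplified illustrative sketch (not the authors' implementation) of the dependence-based selection step TCD: a test case is selected if its recorded trace touches any model element marked as affected by slicing. The test names and node labels are hypothetical.

#include <iostream>
#include <map>
#include <set>
#include <string>
#include <vector>

int main() {
    // test trace information: each test case -> model elements it exercised
    // (hypothetical node labels, for illustration only)
    std::map<std::string, std::set<std::string>> trace = {
        {"T1", {"main", "IF-L11", "a-L12"}},
        {"T2", {"main", "IF-L15", "b-L16"}},
        {"T3", {"main", "IF-L15", "b-L20"}},
    };
    // model elements marked as affected by slicing the modified model M'
    std::set<std::string> affected = {"a-L12", "b-L20"};

    std::vector<std::string> selected;  // the dependence-based selection (TCD)
    for (const auto& [test, nodes] : trace) {
        for (const auto& node : nodes) {
            if (affected.count(node)) {
                selected.push_back(test);
                break;
            }
        }
    }
    for (const auto& t : selected) std::cout << "select " << t << '\n';  // T1 and T3
}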
Regression Test Selection Schematic
[Figure: the original program P and the modified program P' are fed to a program-based model constructor; the resulting models are augmented with design model information; the code changes are mapped and marked on the intermediate model; test trace information is added; a slicer and model marker identify the affected elements; the relevant test cases are then selected, combined with test cases selected by control flow analysis.]
19-Jan-2011 82
RTS Example
Original program:
int main( void )
{
    int a;
    int b;
    int c;
    a = 20;
    b = 20;

    // executable
    a = b + 1;
    if (a > 10 && a < 90)
    {
        a = 10;
    }
    else if (a < 0) {
        a = 2;
        b = 5;
    }
    else {
        b = 0;
    }

    return 1;
}
Modified program: identical except that a = b + 1 is marked as a MODIFIED LINE, a = 10 becomes a = 11 (MODIFIED LINE), b = 5 is deleted (DELETED LINE), and b = 0 becomes b = 2 (MODIFIED LINE).
19-Jan-2011 83
RTS Example
[Figure: dependence models for the original and the modified program — a program entry node PE, the main node, and expression, IF, and return nodes for lines 3–22, connected by control dependence, data dependence, and control flow edges. The modified and deleted nodes are marked on the model of the modified program, and the affected elements are determined by slicing.]
19-Jan-2011 84
SDGC Model

85
Regression Test Optimization
• The number of selected test cases may still be very large.
• Solution: optimize the selected test cases on the basis of:
• Cost: select test cases to minimize the cost of regression testing.
• Frequency of execution: select test cases to thoroughly test the frequently-executed functionalities.
19-Jan-2011 86
Combinationally Redundant Test Cases
[Figure: a model fragment with alternative segments A/X, a common segment B, and alternative segments C/Y.]
T1 = (A, B, C)
T2 = (X, B, Y)
T3 = (A, B, Y)
T3 is combinationally redundant: each of its segments is already exercised by T1 or T2.
87
Regression Test Optimization
• Optimization constraint:
 Test cases which execute the critical
functionalities should not be omitted.
• Test suite optimization is a multi-objective
optimization problem:
 Evolutionary algorithms such as Genetic
Algorithms can be used to find a
Pareto-optimal solution.
19-Jan-2011 88
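The sketch below is a deliberately simplified stand-in for the multi-objective GA mentioned above: a greedy selection that always keeps tests covering critical functionality and then adds the remaining tests with the best frequency-per-cost ratio within a cost budget. All test data, the budget, and the ranking rule are illustrative assumptions.

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct TestCase {
    std::string id;
    double cost;       // execution cost
    double frequency;  // execution frequency of the functionality it exercises
    bool critical;     // exercises a critical functionality
};

int main() {
    std::vector<TestCase> tests = {
        {"T1", 3.0, 0.9, true},
        {"T2", 1.0, 0.2, false},
        {"T3", 2.0, 0.7, false},
        {"T4", 5.0, 0.1, false},
    };
    const double budget = 6.0;  // illustrative cost budget

    // critical tests first, then by frequency per unit cost
    std::sort(tests.begin(), tests.end(), [](const TestCase& a, const TestCase& b) {
        if (a.critical != b.critical) return a.critical;
        return a.frequency / a.cost > b.frequency / b.cost;
    });

    double spent = 0.0;
    for (const auto& t : tests) {
        if (t.critical || spent + t.cost <= budget) {  // never omit critical tests
            spent += t.cost;
            std::cout << "keep " << t.id << '\n';
        }
    }
}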
Our Work on Model-
Based Test Coverage
Analysis
Student: ESF Najumudheen

19-Jan-2011 89
Test Adequacy Criteria for
Procedural Programs
• Several coverage criteria have
been proposed:
• Statement coverage
• Branch coverage
• Condition coverage
• Function coverage
• Def-Use coverage, etc.
19-Jan-2011 90
Coverage Criteria for OO Programs
• Coverage criteria developed for procedural
programs are inadequate for OO programs:
 They focus mostly on code structure or functions.
 Features specific to object-oriented
programs need to be considered:
information hiding, inheritance, polymorphism,
dynamic binding, object states
19-Jan-2011 91
Coverage Criteria for OO
Programs
• High traditional coverage:
 Does not indicate that an OO
program has been adequately
tested.
 Can be misleading
 Sometimes, it may even lead to a
false confidence.
19-Jan-2011 92
Issues in Test Coverage
for OO Programs
• Inheritance:
 Re-testing of inherited methods is generally
recommended
 All inherited methods need not be re-tested
• Software under test may not use all of them
• Polymorphism:
 Dynamic bindings that would actually
occur have to be identified
 Testing of all possible combinations of
bindings is impractical
19-Jan-2011 93
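A small illustrative example (not from the slides) of why polymorphism coverage matters: the single call site shape.area() below has two possible dynamic bindings, and an adequate test suite should exercise each of them.

#include <iostream>

struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;
};
struct Circle : Shape {
    double area() const override { return 3.14159 * 2.0 * 2.0; }
};
struct Square : Shape {
    double area() const override { return 2.0 * 2.0; }
};

// a single polymorphic call site with two possible dynamic bindings
double report(const Shape& shape) { return shape.area(); }

int main() {
    // one test per dynamic binding of the call site
    std::cout << report(Circle{}) << '\n';   // binding: Circle::area
    std::cout << report(Square{}) << '\n';   // binding: Square::area
}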
Existing Approaches
• Approaches reported in the
literature:
 Source code based
 Object code / Bytecode based
 Flow-graph based
• Control-Flow / Data-Flow graph
• Class Flow graph
19-Jan-2011 94
Shortcomings of Existing Approaches
• Difficult to analyze OO features:
 Using only source code or bytecode.
• Flow-based representations:
 Effective in determining structural
aspects of the code:
• Independent paths, logical decisions, etc.
 Cannot capture OO dependencies like
inheritance, association, etc.
• Use of dependence models:
 No research has been reported.
19-Jan-2011 95
Overview of Our
Approach
• Construct model.
• Instrument code
• Execute all test cases:
 For each statement execution,
corresponding model elements are
marked.
• Analyze model
19-Jan-2011 96
Coverage Metrics
Computed
• Exception coverage
• Message path coverage
• Polymorphism coverage
• Inheritance coverage
• Association coverage
• State coverage
• Transition path coverage
19-Jan-2011 97
Example Marked Model
[Figure: an example marked model — a dependence model for classes A, B, C, and D with numbered statement nodes and edges labeled ec.01–ec.19, em1–em5, and p1–p7; the elements exercised by the executed test case are marked.]
19-Jan-2011 98
A Critique of MBT
• Model-based testing is clearly not a
testing panacea.
• However, it can:
 Reduce testing cost
 Increase the effectiveness of testing
 Shorten the test cycle
• Model-based testing is especially effective:
 For software that changes frequently or
has many versions.
19-Jan-2011 99
Research Challenges:
Methods and Tools
• Along with fundamental research on
model-based testing:
 Tool development is also needed.
• An important requirement:
 Methods and tools must scale to large
systems.
 Many tools and methods developed so
far work only on toy systems.
19-Jan-2011 100