Introduction to Testing

Wei-Tek Tsai
Department of Computer Science and
Engineering
Arizona State University
Tempe, AZ 85287

Testing Definition
• The process of operating a system or
component under specified conditions,
observing and recording the results, and
making an evaluation of some aspect of the
system or component (IEEE).

Testing vs. Debugging
• Debugging: the act of attempting to determine the cause of
the symptoms of malfunctions detected by testing or by
[frenzied] user complaints [Beizer, Boris 1990]
• Testing: the process of establishing confidence that a
program or system does what it is supposed to [Hetzel,
William 1973]
• Testing: the process of executing a program or system with
the intent of finding errors [Myers, 1979]
• Testing: any [author’s opinion] activity aimed at evaluating
an attribute or capability of a program or system and
determining that it meets its required results [Hetzel, Bill
1988]

Testing vs. Debugging


• The last definition “includes” reviews by people
• The purpose of testing is to show that a program
has bugs [Hetzel, Bill 1988]
• The purpose of debugging is to find the error or
misconception that leads to the program’s failure
and to design and implement the program changes
that correct the error

Types of Testing
• Formal vs. Informal
– Formal: planned tests; test cases and harnesses retained;
results tracked; “the process of conducting testing activities
and reporting test results in accordance with an
approved test plan” [Hetzel, 1988]
• Levels of testing
– Unit testing
– Integration testing
– System testing
– Acceptance testing

Functional Testing
• Testing that ignores the internal
mechanisms of a system or component and
focuses solely on the outputs generated in
response to selected inputs and external
conditions
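
A sketch of how functional (black-box) testing looks in practice, using a hypothetical leap-year routine: every test case is derived from the specification alone (input plus expected output), never from the function's internals. The function name and cases are illustrative, not from any standard.

```python
# Hypothetical example: black-box testing a leap-year function.
# Tests use only the specification (inputs -> expected outputs),
# never the function's internal logic.

def is_leap_year(year: int) -> bool:
    """Unit under test: True for Gregorian leap years."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Each case is (input, expected output) chosen from the spec alone.
cases = [(2000, True), (1900, False), (2024, True), (2023, False)]

for year, expected in cases:
    actual = is_leap_year(year)
    assert actual == expected, f"{year}: expected {expected}, got {actual}"
```

Note that the same cases would be reusable unchanged against any other implementation of the same specification.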

Structural Testing
• Testing that takes into account the internal
mechanism of a system or component
• Types of structural testing:
– Branch testing
– Path testing
– Statement testing

Branch Testing
• Testing designed to check the outcome of
execution of each decision point in a
computer program

Path Testing
• Testing designed to execute all or selected
paths through a computer program

Statement Testing
• Testing designed to execute each statement
in a computer program
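
The three structural criteria can be contrasted on one small hypothetical function with two independent decisions: one test suffices for statement coverage, a second is needed for branch coverage, and all four decision-outcome combinations are needed for path coverage.

```python
# Hypothetical function with two decision points, used to contrast
# statement, branch, and path coverage.

def classify(x: int, y: int) -> str:
    label = ""
    if x > 0:          # decision 1
        label += "pos-x "
    if y > 0:          # decision 2
        label += "pos-y"
    return label.strip()

# Statement coverage: one test executes every statement.
assert classify(1, 1) == "pos-x pos-y"

# Branch coverage: both outcomes of each decision; one more test
# takes the False branch of each.
assert classify(-1, -1) == ""

# Path coverage: all 2 x 2 = 4 combinations of decision outcomes.
assert classify(1, -1) == "pos-x"
assert classify(-1, 1) == "pos-y"
```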

Alpha, Beta Testing
• Alpha testing: performed by actual
customers at the developer’s site
• Beta testing: performed by actual
customers at their own site


Regression Testing
• Selective testing of a system or component
to verify that modifications have not caused
unintended effects
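
A regression suite can be sketched as a fixed set of input/expected-output pairs that is re-run unchanged after every modification; any newly failing pair flags an unintended effect. The `discount` function and its cases below are hypothetical.

```python
# Hypothetical regression suite: the same checks are re-run after
# every modification to confirm earlier behavior is unchanged.

def discount(price: float, rate: float) -> float:
    """Unit under test; imagine it was just modified."""
    return round(price * (1 - rate), 2)

REGRESSION_SUITE = [
    ((100.0, 0.10), 90.0),    # passed before the change; must still pass
    ((19.99, 0.25), 14.99),
    ((50.0, 0.0), 50.0),
]

def run_regression() -> bool:
    return all(discount(*args) == expected
               for args, expected in REGRESSION_SUITE)

assert run_regression()
```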

Levels of Testing
• Unit testing
• Interface testing
• Integration testing
• System testing
• Acceptance testing


Unit Testing
• Testing of individual software components
or groups of related components
• Interface testing: testing conducted to
evaluate whether systems or components
pass data and control correctly to one
another
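
A minimal unit-test sketch using Python's standard `unittest` module; the `Stack` class is a hypothetical component standing in for the "unit".

```python
# Minimal unit-test sketch with Python's unittest module.

import unittest

class Stack:
    """Unit under test: a simple LIFO stack (illustrative)."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_pop_returns_last_pushed_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()

# Run the suite programmatically rather than via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(StackTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```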

Integration Testing, System
Testing
• Testing in which software components or
hardware components or both are combined
and tested to evaluate the interaction
between them
• System testing: testing conducted on a
complete, integrated system to evaluate
compliance with specified requirements


Acceptance Testing
• Formal testing conducted to enable a user,
customer, or other authorized entity to
determine whether to accept a system or
component

Testing Documentation (IEEE
Std 829)
• Test Plan
• Test Design Description
• Test Case Specification
• Test Procedures Specification
• Test Item Transmittal Report
• Test Log
• Test Incident Report
• Test Summary

Test Plan
• A document describing the scope, approach,
resources, and schedule of intended testing
activities.

Test Design Description
• A document specifying the details of the
testing approach for a software feature by
identifying the associated tests


Test Case
• A set of test inputs, execution conditions,
and expected results developed to verify
compliance with a specific requirement or
requirements
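
These four ingredients can be made concrete as structured data; the field names and requirement ID below are illustrative, not drawn from any standard schema.

```python
# Sketch of a test case as structured data: inputs, execution
# conditions, and expected results, traced to a requirement.
# All field names and IDs are illustrative.

from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    requirement: str            # requirement being verified
    inputs: dict                # test inputs
    preconditions: list = field(default_factory=list)  # execution conditions
    expected: object = None     # expected result

tc = TestCase(
    case_id="TC-042",
    requirement="REQ-7: withdrawal must not exceed balance",
    inputs={"balance": 100, "withdrawal": 150},
    preconditions=["account is open", "account is not frozen"],
    expected="TransactionRejected",
)
assert tc.requirement.startswith("REQ-7")
```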

Test Case Specification
• A document specifying inputs, predicted
results, and a set of execution conditions for
a test item.


Test Procedure Specification


• A document specifying a sequence of
actions for the execution of a test

Test Item Transmittal Report
• A document identifying test items by their
current status and location information


Test Log
• A chronological record of relevant details
about the execution of tests

Test Incident Report
• A document reporting on any event that
occurs during the testing process which
requires investigation


Test Summary
• A document summarizing testing activities
and results, containing an evaluation of the
corresponding test items

Configuration Management
• Applying technical and administrative direction
and surveillance to:
– Identify and document the functional and physical
characteristics of a configuration item,
– Control changes to those characteristics,
– Record and report change processing and
implementation status, and
– Verify compliance with specified requirements (IEEE)


Measure
• A way to ascertain or appraise value by
comparing to a standard (IEEE)

Metric
• A quantitative measure of the degree to
which a system, component or process
possesses a given attribute (IEEE)


Definition
• Mistake: A human action that produces an
incorrect result (IEEE)
• Fault: An incorrect step, process, or data
definition in a computer program (IEEE)
• Failure: The inability of a system or
component to perform its required functions
within specified performance requirements
(IEEE)

Definition (cont.)
• Error: The difference between a computed,
observed, or measured value or condition
and the true, specified, or theoretically
correct value or condition (IEC)


Verification and Validation


• The process of determining whether the
requirements for a system or component are
complete and correct, the products of each
development phase fulfill the requirements
and conditions imposed by the previous phase,
and the final system or component complies
with specified requirements.

Verification Definition
• The process of evaluating a system or
component to determine whether the
products of a given development phase
satisfy the conditions imposed at the start of
that phase (IEEE)


Validation Definition
• The process of evaluating a system or
component during or at the end of the
development process to determine whether
it satisfies specified requirements (IEEE)

Formal Method
• A mathematically sound approach to system
specification or design that uses logical
inference (proof system) for verification
purposes


Formal Methods vs. Testing


• Testing can only be applied after the program has been
finished.
• A formal method can be applied before the program has
even been written.
• Testing can only check the program for a finite number of
input values.
• A formal method checks it for ALL values.
• Testing an embedded real-time system may never exercise
certain execution paths that lead to errors.
• A formal method provides full coverage of such faults in
advance.
Engineering Perspective on
Formal Methods
• From the engineering perspective formal methods
have to fit into the software life cycle and be
compared to S/W development methodologies.
• A formal method involves:
– A notation, such as a specification language, to describe
a system’s behavior
– A calculus to analyze and predict a system’s behavior,
usually by proof
– (Hopefully) software tools to assist in automating the
proof process


Formal Methods’ Main Categories
• Formal methods can be roughly categorized
into two groups:
– Model-based (operational approaches)
– Property-based (descriptive approaches)

Model-based vs. Property-based
• Model-based techniques define the system
in terms of states and transitions.
• Property-based techniques define the
system by means of algebraic and/or logic
equations.


Model-based Examples
• Model-based (operational) methods:
– Petri nets
– Timed automata
– Synchronous languages (Esterel)
– Statecharts
– ASTRAL

Property-based Techniques:
Example
• Property-based (descriptive) methods based on logic:
– Temporal logic
– CTL (Computation Tree Logic)
– RTTL (Real-Time Temporal Logic)
• Property-based (descriptive) methods based on algebras:
– VDM
– Z
– LOTOS
– CCS and CSP
– Process algebras


Formal Methods in Practice


• How formal methods work most efficiently
in practice:
– Describe a system formally
– Define its desirable properties (usually in a
different language)
– Verify properties (hopefully, with an automatic
tool)

Formal Methods and Real-time
Systems
• A real-time system is usually described via
an operational approach, but its properties
are defined via a descriptive approach


Formal Approaches to
Verification
• Formal approaches to verification:
– Model checking
– Theorem proving

Model Checking
• A technique that relies on building a finite
model of a system, in a certain language,
and checking that a desired property holds
in that model
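
A toy illustration of the idea: enumerate every reachable state of a finite model by breadth-first search and check that a desired property (an invariant) holds in each. The two-process mutual-exclusion model below is deliberately broken so the check finds a counterexample; the model and names are illustrative, far simpler than any real model checker.

```python
# Toy explicit-state model checker: breadth-first search over a
# finite transition system, checking an invariant in every
# reachable state.

from collections import deque

# States are (process_a, process_b), each "idle" or "critical".
def transitions(state):
    a, b = state
    succs = []
    if a == "idle":
        succs.append(("critical", b))   # A enters its critical section
    else:
        succs.append(("idle", b))       # A leaves
    if b == "idle":
        succs.append((a, "critical"))   # B enters (unguarded!)
    else:
        succs.append((a, "idle"))       # B leaves
    return succs

def holds_everywhere(initial, invariant):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return False, state          # counterexample state
        for succ in transitions(state):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return True, None

mutual_exclusion = lambda s: s != ("critical", "critical")
ok, witness = holds_everywhere(("idle", "idle"), mutual_exclusion)
# The unguarded model violates mutual exclusion, so ok is False
# and witness is the bad state ("critical", "critical").
```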


Theorem Proving
• A verification technique, in which both the
system and its desired properties are
expressed as formulas in logic, and
verification relies on derivation of the property
from the axioms of the logic

Fault Tolerance
• Software must detect as many software
faults as possible
• Software must recover from as many of
those faults as possible
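
One common way to realize this detect-and-recover idea is a recovery-block-style pattern: run a primary version, detect faulty results with an acceptance test, and recover by switching to a simpler backup version. The square-root functions below are illustrative stand-ins, not from the source.

```python
# Sketch of a detect-and-recover (recovery-block style) pattern.

def primary_sqrt(x: float) -> float:
    # Imagine a fast but possibly faulty implementation.
    return x ** 0.5

def backup_sqrt(x: float, iterations: int = 30) -> float:
    # Slower, simpler Newton's-method fallback.
    guess = x if x > 1 else 1.0
    for _ in range(iterations):
        guess = (guess + x / guess) / 2
    return guess

def tolerant_sqrt(x: float) -> float:
    if x < 0:
        raise ValueError("negative input")
    try:
        result = primary_sqrt(x)
        # Acceptance test: detect a faulty result before using it.
        if abs(result * result - x) > 1e-6 * max(1.0, x):
            raise ArithmeticError("primary failed acceptance test")
        return result
    except ArithmeticError:
        return backup_sqrt(x)     # recover with the backup version
```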


Automatic tools
• PVS
• SMV
• Murphi
• Cabernet
• HyTech
• Uppaal
• TVS