SOFTWARE ENGINEERING
Lecture 12
Outline
• An Initial Definition of (Software) Quality
• Standard Definitions of (Software) Quality
• Why we need Testing?
• Testing Definition
• Error, Fault and Failure
• Verification and Validation in software testing
• Testing Levels
• Cost of not testing
Initial Definition of Quality

• Quality can mean different things to different people
• Quality =
  • meeting the customer’s requirements,
  • at the agreed cost,
  • within the agreed timescales.
• Quality = “Fitness for purpose”
• Quality = Customer satisfaction
Standard Definitions of (Software) Quality
• IEEE Glossary: Degree to which a system, component, or
process meets
• (1) specified requirements, and
• (2) customer or user needs or expectations

Software Quality Assurance (SQA)

“Software quality assurance is the set of systematic activities providing evidence of the ability of the software process to produce a software product that is fit for use.”
QA is about engineering processes that assure quality is achieved in an effective and efficient way
It is process oriented
Software Quality Control (SQC)
No formal definition in SE.
A set of activities to evaluate the quality of the development or of the product.
QC detects bugs by inspecting and testing the product, which involves checking the product against a predetermined set of requirements and validating that the product meets those requirements
Quality control is product oriented

SQA vs QC vs Testing
Testing is a subset of QC
It is the process of executing a system in order to detect bugs in the product so that they get fixed
Testing is an integral part of QC as it helps demonstrate that the product runs the way it is expected and designed to
Why we need testing?
• Computers do not make mistakes. The humans who develop software for computers, however, certainly do make mistakes.
• Complex computer applications that control various aspects of a system, such as
  • sensing, speed, temperature, location, and communication in underwater submarines, flight control systems, and medical treatment,
  also need to be trustworthy/dependable.
• Therefore, developing complex computer systems also requires a process to ensure that the software is fit for purpose.
What is Testing?
• Software testing is a process of executing a program or application with the intent of finding software bugs.
• It can also be stated as the process of validating and verifying that a software program, application or product:
  • Meets the business and technical requirements that guided its design and development
  • Works as expected
What is a Bug?
• When the result of the software application or product does not meet the end user’s expectations or the software requirements, the result is a Bug or Defect.
• These defects or bugs occur because of an error in logic or in coding, which results in failure or in unpredicted or unanticipated (unexpected) results.
Validation
• Process of checking whether the specification captures
the customer’s needs
• Validation is concerned with checking that the system will
meet the customer’s actual needs.
• Activities
• Requirement modeling (DFD, business use cases)
• Prototyping
• End user evaluation
• Testing
• Are we building the right product?
Verification
• Process of checking that the software meets the specification.
• It focuses on:
  • Is the system well engineered and error free?
  • Is the system of high quality?
• Rather than being subjective, it is focused on activities that can be used to assess the quality of the software, such as:
  • Inspection
  • Design analysis
  • Testing
• Verification helps to determine whether the software is of high quality, but it will not ensure that the system is useful.
• Are we building the product right?
QA Activities in V&V Context
• V & V → Validation and Verification
• Validation activities
• Related QA activities check whether a function needed and expected
by the customers is present in a software product
• Verification activities
• Related QA activities to confirm the correct or reliable performance of
these specified functions

Verification and Validation
• Verification and Validation in the context of the V&V model
Error, Fault and Failure
• Error
  • A mistake made by a human. This kind of mistake causes a fault in the system. It can happen because of, for example, some confusion in understanding the requirements of the software.
  • A mistake in the source code is also referred to as an Error
• Fault
  • The appearance of an error in the software; a Fault is also known as a Defect or Bug.
  • A Defect/Fault is a deviation between the actual and expected outcomes
• Failure
  • When a system or piece of software produces an incorrect result or does not perform the correct action, this is known as a failure. Failures are caused by faults in the software.
Example: Error, Fault, Failure
• Example: software for finding the factorial (sketched in code below)
  • Error: thinking that the factorial is the sum of 1 to N (instead of the product!)
  • Fault: using the + operator instead of *
  • Failure: a wrong value is produced
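A minimal Java sketch of this example (the class name, method name and the main driver below are illustrative, not taken from the slides):

public class FactorialExample {
    // Fault: '+' is used instead of '*', caused by the error of thinking
    // that a factorial is the sum of 1..N rather than the product.
    public static int factorial(int n) {
        int result = 1;
        for (int i = 1; i <= n; i++) {
            result = result + i;   // should be: result = result * i;
        }
        return result;
    }

    public static void main(String[] args) {
        // Failure: a wrong value is observed when the faulty code runs.
        System.out.println(factorial(4)); // prints 11, expected 24
    }
}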
A Concrete Example
Fault: should start searching at 0, not 1

public static int numZero (int [ ] arr)
{   // Effects: If arr is null throw NullPointerException
    // else return the number of occurrences of 0 in arr
    int count = 0;
    for (int i = 1; i < arr.length; i++)
    {
        if (arr [ i ] == 0)
        {
            count++;
        }
    }
    return count;
}

Test 1: [ 2, 7, 0 ]
  Expected: 1, Actual: 1
  Error: i is 1, not 0, on the first iteration
  Failure: none

Test 2: [ 0, 2, 7 ]
  Expected: 1, Actual: 0
  Error: i is 1, not 0; the error propagates to the variable count
  Failure: count is 0 at the return statement
Relation among Error, Fault and Failure
Difference between Testing and Debugging
• Testing: Software testing is a process of executing a program or application with the intent of finding software bugs.

• Debugging: The process of finding a fault, given a failure

Testing Lifecycle
• Test Plan
  • It is a systematic approach to testing a system, i.e. the software
  • The plan typically contains a detailed understanding of what the eventual testing workflow will be
• Test case
  • It is a specific procedure for testing a particular requirement
  • It will include (see the sketch after this list):
    • Identification of the specific requirement tested
    • Test case success/failure criteria
    • Specific steps to execute the test
    • Test data
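A rough illustration of these elements as one test case written with JUnit 5; the requirement ID REQ-07 and the MathUtils helper are hypothetical and included only to make the structure concrete:

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class FactorialTestCase {

    // Unit under test (included here only so the sketch is self-contained)
    static class MathUtils {
        static int factorial(int n) {
            int result = 1;
            for (int i = 2; i <= n; i++) result *= i;
            return result;
        }
    }

    // Requirement tested: hypothetical REQ-07 "compute N! for non-negative N"
    @Test
    void req07_factorialOfFourIsTwentyFour() {
        int input = 4;                            // test data
        int actual = MathUtils.factorial(input);  // step: execute the unit
        assertEquals(24, actual);                 // success/failure criterion
    }
}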
Types of Testing
• Black box Testing
• White Box Testing
Black box Testing
• Testing software against a specification of its external
behavior without knowledge of internal implementation
details
• The tester is unaware of the system architecture and does not have access to the source code.
• Typically, when performing a black box test, a tester will interact with the system’s user interface by providing inputs and examining outputs, without knowing how and where the inputs are processed.

Types of Black Box Testing
• Equivalence Partitioning (sketched below)
• Boundary Value Analysis (sketched below)
• Decision Table Testing
• State Transition Testing
• Error Guessing
• Graph-Based Testing Methods
• Comparison Testing
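A small sketch of the first two techniques, assuming a hypothetical rule "accept ages from 18 to 60 inclusive" (the Eligibility class below is invented for illustration; a black-box tester would only see its specification, not its code):

public class AgeCheckBlackBoxSketch {
    // Hypothetical implementation, hidden from the black-box tester
    static class Eligibility {
        static boolean isEligible(int age) { return age >= 18 && age <= 60; }
    }

    public static void main(String[] args) {
        // Equivalence partitioning: one representative value per partition
        check(10, false);   // invalid partition: below the range
        check(35, true);    // valid partition: inside the range
        check(80, false);   // invalid partition: above the range

        // Boundary value analysis: values at and just beyond the boundaries
        check(17, false);
        check(18, true);
        check(60, true);
        check(61, false);
    }

    static void check(int age, boolean expected) {
        boolean actual = Eligibility.isEligible(age);
        System.out.println("age=" + age + " expected=" + expected
                + " actual=" + actual + (actual == expected ? " PASS" : " FAIL"));
    }
}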
White box testing
• White-box testing (also known as clear box testing, glass box testing, transparent box testing, and structural testing) is a method of testing software that tests the internal structures or workings of an application
• Knowledge of the internal program design and code is required
• Tests are based on coverage of code statements, branches, paths, and conditions

BBT vs WBT
• Black box testing
• No knowledge of internal program design or code required
• Tests are based on requirements and functionality
• White box testing
• Knowledge of the internal program design and code required
• Tests are based on coverage of code statements, branches, paths,
conditions.

Testing Methodologies

Types of White Box Testing
• Control flow testing
• Data flow testing
• Branch testing
• Statement coverage
• Decision coverage
• Modified condition/decision coverage
• Prime path testing
• Path testing
White-box testing techniques:
• Control-flow testing - The purpose of control-flow testing is to set up test cases that cover all statements and branch conditions. The branch conditions are tested for both true and false outcomes, so that all statements can be covered (see the sketch below).
• Data-flow testing - This technique emphasizes covering all the data variables included in the program. It tests where the variables were declared and defined and where they were used or changed.
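A minimal control-flow (branch coverage) sketch in Java, using an invented absolute() method: two tests are enough to drive the single branch condition to both true and false, so every statement is executed:

public class ControlFlowSketch {
    // Unit under test (illustrative): one decision with two branches
    static int absolute(int x) {
        if (x < 0) {       // branch condition
            return -x;     // executed only when the condition is true
        }
        return x;          // executed only when the condition is false
    }

    public static void main(String[] args) {
        // Branch coverage: exercise the condition as both true and false
        System.out.println(absolute(-5)); // condition true  -> prints 5
        System.out.println(absolute(3));  // condition false -> prints 3
    }
}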

Who Tests the Software?

• Developer
  • Understands the system
  • but will test "gently"
  • and is driven by "delivery"
• Independent tester
  • Must learn about the system
  • but will attempt to break it
  • and is driven by quality

Testing Levels
• Unit testing
  • Tests each module individually
  • Follows a white box testing approach (tests the logic of the program)
  • Done by developers
• Integration testing
  • Once all the modules have been unit tested, integration testing is performed
  • It is systematic testing: tests are produced to identify errors associated with interfacing (see the sketch after this list)
  • Types:
    • Big Bang Integration testing
    • Top Down Integration testing
    • Bottom Up Integration testing
    • Mixed Integration testing
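A rough sketch of top-down integration testing, where a higher-level module is exercised against a stub for a lower-level module that is not yet integrated (all class names below are invented for illustration):

public class TopDownIntegrationSketch {
    interface PaymentService {
        boolean charge(double amount);
    }

    // Stub standing in for the real payment module during integration testing
    static class PaymentStub implements PaymentService {
        public boolean charge(double amount) { return true; }
    }

    // Higher-level module under test; the test targets its interface
    // to the lower-level payment module
    static class OrderProcessor {
        private final PaymentService payments;
        OrderProcessor(PaymentService payments) { this.payments = payments; }
        boolean placeOrder(double total) { return payments.charge(total); }
    }

    public static void main(String[] args) {
        OrderProcessor processor = new OrderProcessor(new PaymentStub());
        System.out.println(processor.placeOrder(99.50)); // expected: true
    }
}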

System Testing

• The system as a whole is tested to uncover requirement errors
• Verifies that all system elements work properly and that overall system function and performance have been achieved
• Types of System Testing
• Alpha Testing
• Beta Testing
• Acceptance Testing
• Performance Testing

System Testing
• Alpha Testing
• It is carried out by the test team within the developing organization
• Beta Testing
• It is performed by a selected group of friendly customers
• Acceptance Testing
• It is performed by the customer to determine whether to accept or
reject the delivery of the system
• Performance Testing
• It is carried out to check whether the system meets the
nonfunctional requirements identified in the SRS document

System Testing Cont..


• Types of Performance Testing
• Stress Testing
• Load Testing
• Configuration Testing
• Compatibility Testing
• Recovery Testing
• Maintenance Testing
• Documentation Testing
• Usability Testing

Regression Testing
• Regression testing is a type of software testing that
ensures that previously developed and tested software
still performs the same way after it is changed or
interfaced with other software
• Whenever software is corrected, some aspect of the
software configuration (the program, its documentation, or
the data that support it) is changed
• Regression testing helps to ensure that changes (due to testing or for other reasons) do not introduce unintended behavior or additional errors (a brief sketch follows)
