Introduction To Quality Assurance and Control

Software Quality Assurance

Software Testing Principles and Concepts
INSTRUCTIONS
• Program will be conducted at the SLIIT Metro Campus.
• Saturday – 4.00 pm to 6.00 pm.
• Days 1–14: lectures and practicals.
• Day 15: online and written exam.
• Assignment – group assignment.
Quality Assurance versus
Quality Control

Definitions of Quality
1. Conformance to requirements (Philip Crosby) –
Producer view: characterized by:
• Doing the right thing
• Doing it the right way
• Doing it right the first time
• Doing it on time without exceeding cost

2. Fit for use (Joseph Juran & W. Edwards Deming) –
Customer view: characterized by:
• Receiving the right product for their use
• Being satisfied that their needs have been met
• Meeting their expectations
• Being treated with integrity, courtesy and respect
Two Quality Gaps

Quality Attributes

Quality Management KA

Why Do We Test Software?

“Developers are unable to build defect-free software.”


• Developers are not good testers
• What is a defect?
• Why does a development process produce defects?
• Reducing the frequency of defects in software
development
• An effective development process that minimizes defects
• How is quality defined?

Developers are not Good
Testers
• Misunderstandings will not be detected
• Improper use of the development process may
not be detected
• The individual may be “blinded” into accepting
erroneous system specifications
• Information services people are overly optimistic about their ability to do defect-free work
• An individual may be tempted to improve the system structure and documentation

What is a Defect?

• A defect is an undesirable state.


• In order to know what a defect is we must first
define a desirable state.
• The term quality is used to define a desirable
state.
• A defect is defined as the lack of that desirable
state.
• In order to fully understand what a defect is we
must understand quality.

What is Quality Software?

• IT evaluates software against requirements.
• Users of software view quality software as software that is fit for use.
• Testers need to ensure that software not only meets the requirements, but is in fact usable by the customer.

Why Does a Development
Process Produce Defects?
• Ideally, the software development process should
produce the same results each time the process is
executed.
• If it does not, there is “variability” in the software development process.
• Variability is the “enemy” of quality
• The concept of measuring and reducing variability is
commonly called statistical process control
• Testers need to understand process variability,
• because the more variance in the process the greater the need
for software testing.
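The idea behind statistical process control can be sketched as a control-limit check. The defect counts, the baseline, and the function names below are invented for illustration; only the 3-sigma rule itself is standard SPC practice:

```python
import statistics

def control_limits(baseline, sigmas=3):
    """3-sigma control limits computed from an in-control baseline."""
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    return mean - sigmas * sd, mean + sigmas * sd

def special_causes(samples, baseline):
    """Points outside the control limits suggest special-cause variation."""
    lcl, ucl = control_limits(baseline)
    return [x for x in samples if x < lcl or x > ucl]

# Hypothetical defect counts per build from a stable (in-control) process...
baseline = [4, 5, 3, 6, 4, 5, 4, 5, 6, 4]
# ...and new builds: the spike at 21 falls outside the limits.
print(special_causes([5, 4, 21, 5], baseline))   # [21]
```

The points inside the limits are common-cause variation and should be left alone; only the flagged point warrants a search for a special cause.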

Examining Variation
A Stable Process has the same
normal distribution at all times.
A stable process is In Control

A stable process still has variation

Examining Variation

Common Causes

The cause of variation in a stable process is called a common cause.
A common cause is a natural cause of variation in the system.

Examining Variation

Special Causes

Anything that causes variations that are not part of the stable
process is called a special cause, assignable cause, or
unnatural cause.

Reducing Variation
Reducing Variation in an
Unstable Process
Do not ignore special causes.
Do quickly detect special cause variations.
Do stop production until the process is fixed. (Reactive)
Do identify and permanently eliminate special causes.
(Preventive)

Reducing Variation
Improving an Unstable Process
Four-Step Process
1. Detect the special cause variation.
2. Identify the special cause.
3. Fix the process:
• Remove the special cause, or
• Compensate for the special cause.
4. Prevent the special cause from occurring again.

Detecting Variation
Control Chart for Detecting Variation

[Flow diagram: observe variation on a control chart; common-cause variation → don’t tamper, work to reduce overall variation; special-cause variation → detect, identify, fix, and prevent]
Reducing the Frequency of
Defects in Software Development

• Experience showed that some software development processes were much more effective than others.
• An algorithm was needed to identify the more effective and efficient software development processes.
• This algorithm is now known as the Capability Maturity Model (CMM).
• Introduced by the SEI (Software Engineering Institute)
• The Five Levels of Maturity
• As the model moves from Level 1 to Level 5, the
variability in the process is significantly reduced.

Level 1- Ad-hoc

• At this level, management manages people and jobs.


• Management will establish goals or objectives for
individuals and teams
• This level is normally heavily schedule driven, and those who meet the schedules are rewarded.
• People’s performance is often dependent upon their ability to convince management that the job they have done is excellent.
• This causes the environment to be very political. Both
management and staff become more concerned with
their personal agenda than with meeting their
organization’s mission.

Level 2- Control

• The first objective is to instill discipline in the culture of the information organization so that, through infrastructure, training, and the leadership of management, individuals will want to follow defined processes.
• The second objective is to reduce variability in the
processes
• At Level 2, the work processes are defined;
• management manages those processes, and uses
validation and verification techniques
• Having the results predefined through a set of standards enables management to measure people’s performance against those standards.
• Education and training are an important component
of Level 2
Level 3-Core Competency

• The organization defines its core competencies, then builds an organization that is capable of performing those core competencies effectively and efficiently.
• The more common core competencies include development, maintenance, testing, training, outsourcing, and operation.

• Once the core competencies are determined, then the processes defined at Level 2
must be reengineered to drive the core competencies.
• In addition, the tasks are analyzed to determine what skills are needed to perform
those processes.
• Next, staff must be retrained, recruited, motivated, and supported to perform those core competencies in an effective and efficient manner.
• It is the integration of people and processes, coupled with managers who have people-management skills, that is needed to maintain and improve those core competencies.
• Lots of mentoring occurs at this level, with the more experienced people building
skills in the less experienced.
• It is also a level that is truly customer focused
Level 4 – Predictable

• This level has two objectives.


• The first is to develop quantitative standards
for the work processes based on performance of
the Level 3 stabilized processes.
• The second objective is to provide management
the dashboards and skill sets needed to manage
quantitatively.
• The result is predictable work processes.

Level 5 – Innovative
• At Level 5, the information organization wants to be a
true leader in the industry.
• At this level, the organization is looking to measure
itself against the industry through benchmarking,
• and then define innovative ways to achieve higher
levels of performance.
• Innovative approaches can be achieved through
benchmarking other industries, applying new
technology in an innovative way, reengineering the
way work is done, and by constantly studying the
literature and using experts to identify potential
innovations.
• This level is one in which continuous learning occurs,
both in individuals and the organization.

The Multiple Roles of the
Software Tester
• The roles established for software testers vary
from organization to organization. Many
factors affect the tester’s role. The major factors
are:

• People relationships
• Scope of testing
• When should testing occur
• How should testing be performed
• Testing constraints

People Relationships

• Some attitudes that have shaped a negative


view of testing and testers are:
• Testers hold up implementation.
• Giving testers less time to test will reduce the chance
that they will find defects.
• Letting the testers find problems is an appropriate
way to debug.
• Defects found in production are the fault of the
testers.
• Testers do not need training; only programmers
need training.

People Relationships

• Among the top ten people challenges that have been identified are:


• Training in testing
• Relationship building with developers
• Using tools
• Getting managers to understand testing
• Communicating with users about testing
• Making the necessary time for testing

Scope of Testing

• The scope of testing is the extensiveness of the test process.


• A narrow scope may be limited to determining whether or not the
software specifications were correctly implemented.

• Among the broader scope of software testing are these


responsibilities:
1. Determining if the user’s needs have been met regardless of specs
2. Finding defects early when they cost less
3. Removing defects prior to going into production
4. Identifying weaknesses in development process to improve process and
make the development process more mature.

• In defining the scope of software testing each IT organization must


answer the question, “Why are we testing?”

When Should Testing Occur?
• When testing is constrained to a single phase and confined to
the later stages of development, severe consequences can
develop.
• An error discovered in the latter parts of the life cycle must be
paid for four different times.
• The first cost is developing the program erroneously
• Second, the system must be tested to detect the error.
• Third, the wrong specifications and coding must be removed and the
proper specifications, coding, and documentation added.
• Fourth, the system must be retested to determine that it is now correct.
• Verification must not be isolated to a single phase in the
development process, but rather, incorporated into each phase
of development.

When Should Testing Occur?
• The recommended test process involves testing in every phase of
the life cycle.
• During the requirements phase, the emphasis is upon validation
to determine that the defined requirements meet the needs of the
organization.
• During the design and program phases, the emphasis is on
verification to ensure that the design and programs accomplish
the defined requirements.
• During the test and installation phases, the emphasis is on
inspection to determine that the implemented system meets the
system specification.
• During the maintenance phases, the system will be retested to
determine that the changes work and that the unchanged
portion continues to work.
How the Test Plan Should be
Developed
Four steps must be followed to develop a customized test plan:
• Select and rank test objectives.
• Identify the system development phases.
• Identify the business risks.
• Place risks in the matrix.
Testing Constraints

• Anything that inhibits the tester’s ability to fulfill their responsibilities is a constraint.
• Constraints include
• limited schedule and budget,
• an incomplete statement of work,
• changes in technology, and
• limited tester skills.

The Waterfall Model

[Diagram, adapted from Royce 1970: Concept Exploration Process → System Allocation Process → Requirements Process → Design Process → Implementation Process → Verification & Validation Process → Installation Process → Operation & Support Process]
From the Waterfall Model to the V Model

[Diagram: the waterfall activities fold into a V — Requirements Engineering ↔ Acceptance Testing; Requirements Analysis ↔ System Testing; System Design ↔ Integration Testing; Object Design ↔ Unit Testing; Implementation at the base]
Activity Diagram of a V Model

[Diagram: each left-side activity precedes the next and is validated by the right-side activity opposite it — System Requirements Analysis ↔ Operation; Requirements Elicitation ↔ Client Acceptance; Requirements Analysis ↔ System Integration & Test; Preliminary Design ↔ Component Integration & Test; Detailed Design ↔ Unit Test; Implementation at the base]
Ways to Improve Quality

• Prevention of Defects
◼ Process Improvement
◼ Complexity Reduction
◼ Risk Management

• Detection and Correction of Defects
◼ Verification
◼ Validation
◼ Rework
Software Testing:
Verification and Validation

• Verification
“Are we building the product right?”

• Validation
“Are we building the right product?”

Barry Boehm, 1979


Verification

• Verification – Are we building the product right?
• Verification is any checking process conducted on software artifacts in an attempt to determine if they work as specified by the designers of the system.
• Includes reviews, inspections, walkthroughs, unit testing and integration testing.
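A unit test is verification in miniature: it checks code against what the designers specified. The sketch below tests an invented function against a hypothetical specification — the discount rule, thresholds, and names are all illustrative assumptions:

```python
import unittest

def discount(total):
    """Hypothetical spec: orders of 100 or more get 10% off; others pay full price."""
    return total * 0.9 if total >= 100 else total

class DiscountSpecTest(unittest.TestCase):
    # Verification question: does the code do what the designers specified?
    def test_no_discount_below_threshold(self):
        self.assertEqual(discount(99), 99)

    def test_discount_at_threshold(self):
        self.assertEqual(discount(100), 90.0)

if __name__ == "__main__":
    unittest.main(argv=["discount_test"], exit=False)
```

Note that passing these tests says nothing about whether a 10% discount is what the customer actually wanted — that question is validation, not verification.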
Validation

• Validation – Are we building the right product?
• Validation is the process of evaluating software artifacts during the software development process in an attempt to determine if the system works as required by the customers. Any evaluation activity that involves the customer can be used for validation purposes.
• Includes program reviews, system testing, customer acceptance testing.
Verification vs. Validation

• Verification: main purpose is to detect defects in the artifacts of the system under development.
• Validation: main purpose is to show that the system under development meets user needs, requirements, and expectations.
Verification vs. Validation Techniques

◼ Static methods: techniques applied to artifacts without execution.
◼ Dynamic methods: techniques applied to artifacts through execution.
Static: Reviews

• Walkthroughs
◼ Code – verification
◼ Documents – ConOps, SRS: validation; STEP, SAD, SDD: verification
• Inspections
◼ Code – verification
◼ Document audits – verification
• Program Reviews
◼ Customer involved – validation
◼ No customer – verification
Effectiveness of Static Verification

• More than 60% of program defects can be detected by program inspections.
• More than 90% of program defects may be detectable using more rigorous mathematical program verification.
• The defect detection process is not confused by the existence of previous defects.
Static Testing

• Code analysis
◼ Unreachable code
◼ Objects declared and never used
◼ Parameter type/number mismatches
◼ Variable used before initialization
◼ Variable assigned twice without using the first value
◼ Function results not used
◼ Possible array bound violations

• Code inspection
◼ Self-review – the default choice.
◼ Subtle errors and micro-flaws may be overlooked.
◼ Wrong conceptions propagate into the review…
◼ Code review by others – very efficient!
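Several of the code-analysis findings listed above can be seen in one small, deliberately flawed (but still runnable) fragment. The function and values are invented for illustration; a linter such as pylint would flag the commented lines without ever executing the code:

```python
def shipping_cost(weight):
    if weight <= 0:
        return 0
        print("invalid weight")  # unreachable code: it follows a return

    rate = 2.5                   # variable assigned twice; this value is never used
    rate = 3.0
    tax = 0.07                   # object declared and never used

    return weight * rate

print(shipping_cost(4))          # 12.0 -- none of the flaws needed a test run to be found
```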
Dynamic: Testing (Verification)

• Unit Test (Detailed Design):


◼ Testing the individual software modules, components, or units.
• Integration Testing (Architectural Design):
◼ After unit test, the system is put together in increments. Integration testing
focuses on the interfaces between software components (OO thread-based,
cluster-based testing)
• System Testing (Requirements Spec):
◼ One goal of system testing is to ensure that the system functions as specified
in the specification.
Dynamic: Testing (Validation)
• System Testing (Requirements Spec):
◼ Another goal of system testing is to ensure that the system functions as the
client expected in a controlled environment.
• User Acceptance Test (ConOps):
◼ A set of formal tests run for the client, and specified by the client. When the
system passes these tests, the software has been accepted by the client as
meeting the requirements.
Program testing

◼ Can reveal the presence of errors, NOT their absence
◼ A successful test is a test which discovers one or more errors
◼ The only validation technique for non-functional requirements
◼ Should be used in conjunction with static verification to provide full V&V coverage
Execution-Based Testing

• “Program testing can be a very effective way to show the presence of bugs but is hopelessly inadequate for showing their absence.” [Dijkstra]

• Fault: a “bug” – an incorrect piece of code
• Failure: the observable result of a fault
• Error: a mistake made by the programmer/developer
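The fault/failure/error distinction can be made concrete with a deliberately buggy sketch (the function and inputs are invented for illustration). The programmer's mistake (the error) leaves an incorrect line in the code (the fault), but a wrong result (a failure) only appears for some inputs:

```python
def average(values):
    # Fault: dividing by a hard-coded 10 instead of len(values) --
    # the lasting result of the programmer's error when writing this line.
    return sum(values) / 10

# No failure observed: the fault is masked, because the list happens
# to contain exactly 10 items, so the result (4.5) is correct.
print(average(list(range(10))))

# Failure: with a different input size the same fault surfaces --
# 1.2 instead of the expected 4.0.
print(average([2, 4, 6]))
```

This is exactly why testing can only show the presence of bugs: the first call executes the faulty line and still passes.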
Static and Dynamic V&V
Verification and Validation in the Development Lifecycle

[Diagram: Requirements Analysis → validate the system → execute system tests (black-box testing); Design → verify design → execute integration tests; Code → verify implementation → execute unit tests (white-box & black-box testing)]
Verification or… Validation?

• Reviews – either
• Unit testing – verification
• Integration testing – verification
• System testing – either
• Acceptance testing – validation
The Test Plan

[Diagram: Requirements specification → System specification → System design → Detailed design. Test plans are produced in parallel: acceptance test plan, system integration test plan, sub-system integration test plan, and module and unit code and test. Testing then proceeds in reverse: sub-system integration test → system integration test → acceptance test → service]
Testing vs. Debugging

• Verification and validation: looking for and categorizing the existence of system defects – asking “What?”
• Debugging: locating and correcting those defects – asking “Why?”

• Testing is concerned with establishing the existence of defects in a program.
• Debugging is concerned with locating and repairing errors.
Testing and debugging

◼ Defect testing and debugging are distinct


processes
◼ Verification and validation is concerned with
establishing the existence of defects in a
program
◼ Debugging is concerned with locating and
repairing these errors
◼ Debugging involves formulating a hypothesis
about program behaviour then testing these
hypotheses to find the system error
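The hypothesis-and-test loop of debugging can be sketched on a small invented example (the function and data are illustrative, not from the slides):

```python
# Buggy version under investigation.
def median(values):
    s = sorted(values)
    return s[len(s) // 2]          # suspect line

# 1. Observe a failure: median([1, 2, 3, 4]) returns 3; we expected 2.5.
# 2. Hypothesis: the even-length case is mishandled -- the code always
#    takes a single middle element instead of averaging the two.
# 3. Test the hypothesis with probe inputs:
assert median([1, 2, 3]) == 2      # odd length works...
assert median([1, 2, 3, 4]) == 3   # ...even length picks the upper middle: confirmed

# 4. Repair, guided by the confirmed hypothesis.
def median_fixed(values):
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

assert median_fixed([1, 2, 3, 4]) == 2.5
```

Had the probe in step 3 contradicted the hypothesis, the next step would be a new hypothesis, not a fix — which is what distinguishes debugging from blind patching.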
The Debugging Process

Software Inspections (code reviews)
• Why are reviews more effective for finding defects
in systems/subsystems (i.e., before acceptance
testing)?
• Bugs are often masked by other bugs
• Leverage domain/programming knowledge
• inspectors are skilled programmers

• Common practice: code reviews and then acceptance testing
• Reviews can also help with development of tests
Software Inspections

• It is a static V&V process.


• These involve people reviewing the SW system with
the aim of discovering errors, omissions and
anomalies.
• Inspections do not require execution of a system so
may be used before implementation.
• Generally, inspections focus on source code, but any readable representation of the system, such as its requirements or a design model, can be inspected.
• They have been shown to be more effective for
discovering program errors than program testing.
Software Inspections (code reviews)
• Sample procedure:
◼ Announce review meeting in advance (a week?)
◼ Provide design document, implementation overview,
and pointer to code
◼ Reviewers read code (and make notes) in advance of
meeting
◼ During meeting, directives recorded by Scribe
◼ Testers/documenters attend too
Software inspections
• These involve people examining the source representation with
the aim of discovering anomalies and defects.
• Inspections do not require execution of a system so may be used before implementation.
• They may be applied to any representation of the system
(requirements, design, configuration data, test data, etc.).
• They have been shown to be an effective technique for
discovering program errors.
Inspections and testing
• Inspections and testing are complementary and not
opposing verification techniques.
• Both should be used during the V & V process.
• Inspections can check conformance with a specification but
not conformance with the customer’s real requirements.
• Inspections cannot check non-functional characteristics such
as performance, usability, etc.
• Inspection success
• Many different defects may be discovered in a single inspection. In testing, one defect may mask another, so several executions are required.
• They reuse domain and programming knowledge, so reviewers are likely to have seen the types of error that commonly arise.
Inspection pre-conditions

• A precise specification must be available.


• Team members must be familiar with the
organisation standards.
• Syntactically correct code or other system
representations must be available.
• An error checklist should be prepared.
• Management must accept that inspection will increase costs early in the software process.
• Management should not use inspections for staff appraisal, i.e. finding out who makes mistakes.
The inspection process
Inspection procedure
• System overview presented to inspection team.
• Code and associated documents are
distributed to inspection team in advance.
• Inspection takes place and discovered errors
are noted.
• Modifications are made to repair discovered
errors.
• Re-inspection may or may not be required.
THANK YOU
