Software Engineering
Chapter 06: Software Quality Assurance

CS 446102, NTHUCS

References

Most of the slides in this lecture are adapted from:
- Roger S. Pressman, Software Engineering: A Practitioner's Approach, 8th Edition, Chapters 19-21, McGraw-Hill, 2014.
- Daniel Galin, Software Quality Assurance: From Theory to Implementation, Addison-Wesley, 2004.
- Pankaj Jalote, Software Project Management in Practice, Addison-Wesley, 2002.
- Stephen H. Kan, Metrics and Models in Software Quality Engineering, 2nd Edition, Addison-Wesley, September 2002.

What is Software Quality?

Conventional definitions:
- Conformance to requirements: the requirements are clearly stated and the product must conform to them. Any deviation from the requirements is regarded as a defect. A good quality product contains fewer bugs.
- Fitness for use: fit to user expectations; the product meets users' needs. A good quality product provides better user satisfaction.
- Both together make for a dependable computing system.

Fitness

- Design quality is all about fitness to purpose:
  - does it do what is needed?
  - does it do it in the way that its users need it to?
  - does it do it reliably enough? fast enough? safely enough? securely enough?
  - will it be affordable? will it be ready when its users need it?
  - can it be changed as the needs change?
- But this means quality is not a measure of software in isolation.

Fitness (contd.)

- Quality is a measure of the relationship between software and its application domain:
  - we might not be able to measure this until we place the software into its environment
  - and the quality will be different in different environments!
- During design, we need to be able to predict how well the software will fit its purpose:
  - we need to understand that purpose (requirements analysis)
  - we need to look for quality predictors

Software Quality Definition

- ISO 9126 definition of quality: the totality of features and characteristics of a software product that bear on its ability to satisfy stated or implied needs.
- IEEE definition of quality:
  - The degree to which a system, component, or process meets specified requirements.
  - The degree to which a system, component, or process meets customer or user needs or expectations.

Quality Assurance Functions

- Consists of a set of auditing and reporting functions that assess the effectiveness and completeness of quality control activities.
- Provides management personnel with data that gives insight into the quality of the products.
- Alerts management personnel to quality problems so that they can apply the necessary resources to resolve quality issues.

The SQA Group

- Serves as the customer's in-house representative.
- Assists the software team in achieving a high-quality product.
- Views the software from the customer's point of view:
  - Does the software adequately meet quality factors?
  - Has software development been conducted according to pre-established standards?
  - Have technical disciplines properly performed their roles as part of the SQA activity?

Total Cost Distribution

- Maintenance is responsible for more than 60% of total cost for a typical software project.
- Developing a better quality system will contribute to lowering maintenance costs.

Example: ATM System Quality

Identify the quality attributes for the ATM System.
- The bank's motivation for developing the system is to attract new customers by offering low banking fees and a variety of services. The bank will also be able to reduce its wage costs by processing an increased number of banking transactions automatically through the system instead of manually through cashiers. It is essential for the bank to lower the development cost and to minimize the product support cost.

Example: ATM System Quality (contd.)

- Security is critical; the system must be fully integrated into the existing enterprise security infrastructure. More specifically, the ATM system will reuse an existing secured database.
- The time for 90% of the users to learn (through supplied step-by-step instructions) how to use the system for the first time must not be more than 5 minutes.

Example: ATM System Quality (contd.)

- When a user issues a request, the system should respond with a verification of the request within 1.0 second in 90% of the cases. The time for the verification must never exceed 10.0 seconds, unless the network connection is broken (in which case the user should be notified).
- The ATM System must have no more than 1 hour per month of down time.

Software Metrics

What is a metric?
- A metric is a quantifiable measurement of a software product, process, or project that is directly observed, calculated, or predicted.
- Applying a metric gives a numeric value for some property as a result, but that alone is not useful in general.
- Every metric model should define:

Software Metrics (contd.)

- how it should be measured
- limits: what the measured values mean (what is good/bad)
- information about how the metric is used to improve software quality:
  - its relation to quality factors
  - what can be done with the software to improve the result

Software Metrics (contd.)

- An organization must have established a baseline understanding of its current software product and process characteristics.
- Software metrics are emerging as a powerful tool for the management of the software development process.

Process Quality Metrics Categories

- Software process quality metrics:
  - Error density metrics
  - Error severity metrics
  - Error removal effectiveness metrics
- Software process timetable metrics
- Software process error removal effectiveness metrics
- Software process productivity metrics

Example: Error Density Metrics
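Error density, the first category above, is typically computed as the number of errors found relative to product size, e.g. per thousand lines of code (KLOC). A minimal sketch; the function name and sample counts are illustrative assumptions, not values from the slides:

```python
def error_density(num_errors: int, size_kloc: float) -> float:
    """Errors found per thousand lines of code (KLOC)."""
    if size_kloc <= 0:
        raise ValueError("size must be positive")
    return num_errors / size_kloc

# Hypothetical example: 48 errors found in a 12-KLOC product.
print(error_density(48, 12.0))  # 4.0 errors per KLOC
```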

Example: Error Severity Metrics

Example: Process Productivity Metrics

Product Quality Metrics

- Refer to the software operational phase (years of regular use by customers); they include:
  - Help desk (HD) services
  - Corrective maintenance services

Product Quality Metrics (contd.)

Categories of metrics affecting the software operational phase:
- Defect density (defect rate): number of defects relative to software size
- Mean time to failure (MTTF): measures time between failures; collected together with user profiles, scenarios, etc.
- Customer problems metrics
- Customer satisfaction metrics
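The first two of these categories can be computed directly once the raw data is collected. A minimal sketch with hypothetical sample values (the function names and numbers are assumptions, not from the slides):

```python
def defect_density(defects: int, size_kloc: float) -> float:
    """Defect rate: defects found relative to software size (per KLOC)."""
    return defects / size_kloc

def mean_time_to_failure(uptimes_hours: list) -> float:
    """MTTF: average operating time between observed failures."""
    return sum(uptimes_hours) / len(uptimes_hours)

print(defect_density(30, 20.0))                     # 1.5 defects per KLOC
print(mean_time_to_failure([100.0, 150.0, 200.0]))  # 150.0 hours
```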

Example: HD Calls Density Metrics

Example: Severity of HD Calls Metrics and HD Success Metrics

Better Quality via Standards?

- Most products have safety standards, and many have usability standards, but computer software rarely has such standards.
- Can quality be improved by enforcing standards?
  - It is very difficult to enforce standards on actual program behavior.
  - Standardizing the process can help make sure that no steps are skipped, but standardizing to an inappropriate process can reduce productivity, and thus leave less time for quality.

Two Ways of Monitoring Software Quality

- Fixed model approach
- Define-your-own-model approach

Fixed model approach:
- we assume that all important quality factors needed to monitor a project are a (sub)set of those in a published model
- provides a worthwhile basis for indirect measures and an excellent checklist for assessing the quality of a system

Pre-release Quality

- Process: software inspection and testing
- Methods:
  - Software Reliability Engineering
  - ISO 9001, ISO 9126, CMMI

Software Reliability Engineering Process (John D. Musa)

A waterfall-style process for engineering "just right" reliability:
Establish Reliability Goals -> Develop Operational Profile -> Test -> Use Test Results

ISO 9000 & ISO 9001

- The international organization ISO has developed a series of standards for QA and quality management, collectively known as the ISO 9000 series.
  - The standards released in the year 2000 are known as ISO 9000:2000. There are three components of the ISO 9000:2000 standard:
    - ISO 9000: Fundamentals and vocabulary
    - ISO 9001: Requirements
    - ISO 9004: Guidelines for performance improvements
- ISO 9001 is the most general of these standards and applies to organizations concerned with the quality process in product design, development, and maintenance.

ISO 9000 & ISO 9001 (contd.)

- Note: ISO 9002 and ISO 9003 were parts of ISO 9000:1994, but they are no longer parts of ISO 9000:2000.
- ISO 9000-3 is a supporting document, which interprets ISO 9000 for software development.
- ISO 9001 is for quality management; it helps organizations to implement quality management.

Areas Covered by ISO 9001

- The ISO 9001 standard applies to 20 areas of software engineering.
- To obtain ISO registration, a formal audit of the 20 elements is involved, and the outcome has to be positive.

Areas Covered by ISO 9001 (contd.)

- Guidelines for the application of the twenty elements to the development, supply, and maintenance of software are specified in ISO 9000-3.

Capability Maturity Model (CMM)

Five levels of process maturity:
1. Initial: process unpredictable, poorly controlled, and reactive.
2. Repeatable: disciplined process; process characterized for projects and is often reactive.
3. Defined: standard, consistent process; process characterized for the organization and is proactive.
4. Managed: predictable process; process measured and controlled.
5. Optimizing: continuously improving process; focus on continuous process improvement.

Key Process Areas of CMMI

- Optimizing (5): Process Change Management; Technology Change Management; Defect Prevention
- Managed (4): Software Quality Management; Quantitative Process Management
- Defined (3): Peer Reviews; Intergroup Coordination; Software Product Engineering; Integrated Software Management*; Training Program; Organization Process Definition*; Organization Process Focus*
- Repeatable (2): Software Configuration Management; Software Quality Assurance; Software Subcontract Management; Software Project Tracking and Oversight; Software Project Planning; Requirements Management
- Initial (1)

ISO vs CMMI (contd.)

Boehm's Quality Model

Reflects an understanding of quality where the software:
- does what the user wants it to do
- uses computer resources correctly and efficiently
- is easy to learn and use
- is well-designed, well-coded, and easily tested and maintained

Proposed by B. W. Boehm in 1978 (ICSE).

Quality Models: CUPRIMDA

Quality parameters (parameters for fitness):
- Capability
- Usability
- Performance
- Reliability
- Installability
- Maintainability
- Documentation
- Availability
Reference: S. H. Kan (1995)

McCall's Triangle of Quality

ISO 9126 Quality Model

The model is layered into Factors, Criteria, and Metrics (see ISO 9126-2, ISO 9126-3).

Example: Functionality - Accuracy

Accuracy definition: attributes of software that bear on the provision of right or agreed results or effects.
- The degree to which a component meets specified requirements or user needs and expectations.
- The ability of a component to produce specified outputs when given specified inputs, and the extent to which they match or satisfy the requirements.

Example: Functionality - Accuracy (contd.)

Measurement:
- Number of faults per phase / priority / category / cause
- Number of faults per time period (fault rate)
- Number of open problem reports per time period
- Number of closed problem reports per time period
- Number of unevaluated (pending) problem reports
- Age of open problem reports
- Age of unevaluated problem reports
- Age of closed problem reports

Example: Functionality - Security

Security definition: attributes of software that bear on its ability to prevent unauthorized access, whether accidental or deliberate, to programs and data.

Software system security is defined at the following levels:
- Level 0: no security at all
- Level 1: firewalls
- Level 2: encryption
- Level 3: authentication (digital ID verification)
- Level 4: intrusion protection
- Level 5: combination of levels 1-4

Example: Functionality - Security (contd.)

Measurement: security level (Lsc)
- Lsc = nt / nint
  - nt: number of successful intrusions
  - nint: total number of intrusion attempts
- Lsc takes values of 0.1 ~ 0.001 for level 1 and 10^-7 ~ 10^-9 for level 4.
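The Lsc measurement above is a simple ratio of successful intrusions to intrusion attempts. A minimal sketch (the function name and sample counts are hypothetical):

```python
def security_level(nt: int, nint: int) -> float:
    """Lsc = nt / nint: the fraction of intrusion attempts that
    succeed; lower values indicate a more secure system."""
    if nint == 0:
        raise ValueError("no intrusion attempts recorded")
    return nt / nint

# Hypothetical example: 5 successful intrusions out of 1000 attempts.
print(security_level(5, 1000))  # 0.005 -- within the 0.1 ~ 0.001 range quoted for level 1
```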

Example: Reliability - Maturity

Fault detection:
- The fault detection metric can be calculated by X = A / B, where A is the absolute number of bugs detected (from the review report) and B is the estimated number expected.
- A high value for this metric implies a good level of product quality.

Fault removal:
- The fault removal metric can be calculated by X = A / B, where A is the number of bugs fixed during design and coding and B is the number that were found during review. The closer the value is to 1, the better.
- A fault removal value of one would mean that every detected defect had been removed.
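Both maturity metrics above share the X = A / B form, as do the recoverability, audit-trail, and change-recordability metrics on later slides. A minimal sketch with hypothetical counts:

```python
def ratio_metric(a: int, b: int) -> float:
    """Generic X = A / B quality metric; values closer to 1 are better."""
    if b == 0:
        raise ValueError("denominator must be non-zero")
    return a / b

detected, expected = 45, 50   # bugs detected in review vs. number expected
fixed, found = 40, 45         # bugs fixed in design/coding vs. found in review

print(ratio_metric(detected, expected))      # fault detection: 0.9
print(round(ratio_metric(fixed, found), 2))  # fault removal: 0.89
```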

Example: Reliability - Recoverability

- The recoverability metrics are used for assessing the software system's capability to reestablish an adequate level of performance and to recover the data directly affected in case of a failure.
- Restoration effectiveness: a measure of how effective the restoration techniques will be. Remember that all of these are internal metrics that are expected to come from static testing.

Example: Reliability - Recoverability (contd.)

- The metric can be calculated by X = A / B, where A is the number of implemented restoration requirements meeting the target restore time and B is the total number of requirements that have a specified target time.
- If this capability is actually implemented, and we calculate/simulate that it would actually work correctly, the metric would equal 1/1. If not implemented, the metric would be 0/1.

Example: Portability - Porting User Friendliness

- The porting user friendliness metric is a measure of how effortless the porting operations on the project are estimated to be.
- It uses the same formula, X = A / B, where A is the number of functions being ported that are judged to be easy, based on review, and B is the total number of functions that are required to be easy to adapt.

Maintainability - Analyzability: Audit Trail Capability

- The audit trail capability is a measure of how easy it is for a user (or maintainer) to identify the specific operation that caused a failure.
- This metric can be calculated by X = A / B, where A is the number of data items that are actually logged during the operation and B is the number of data items that should be recorded to sufficiently monitor the status of the software during operation.
- A value for X closer to 1 means that more of the required data is being recorded.

Maintainability - Changeability: Change Recordability

- The change recordability metric is a measure of how completely changes to specifications and program modules are documented.
- This metric can be calculated by X = A / B, where A is the number of changes that have comments (as confirmed in reviews) and B is the total number of changes made. The value should be 0 <= X <= 1; the closer to 1, the better. A value near 0 indicates poor change control.

What can We do to Remove Errors in Early Development Phases

Example: defect amplification and removal, assuming amplification 1:0.5 and 0% detection efficiency before testing:

Phase | Errors from previous step | Amplified errors (1:0.5) | Newly generated errors | Detection efficiency | Errors passed to next step
Preliminary design | 0 | 0 | 10 | 0% | 10
Detailed design | 10 | 5 | 25 | 0% | 40
Coding | 40 | 20 | 25 | 0% | 85
Unit testing | 85 | 0 | 0 | 50% | 42
Integration testing | 42 | 0 | 0 | 50% | 21

What can We do to Remove Errors in Early Development Phases -- Revisited

Same assumptions, but with 50% detection efficiency in every phase (numbers in parentheses are from the previous example, without reviews):

Phase | Errors from previous step | Amplified errors (1:0.5) | Newly generated errors | Detection efficiency | Errors passed to next step
Preliminary design | 0 | 0 | 10 | 50% | 5 (10)
Detailed design | 5 | 3 | 25 | 50% | 16 (40)
Coding | 16 | 8 | 25 | 50% | 25 (85)
Unit testing | 25 | 0 | 0 | 50% | 12 (42)
Integration testing | 12 | 0 | 0 | 50% | 6 (21)

The gain comes from non-execution-based testing:
- Reviews: inspections & walkthroughs
- Presentations
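This amplification model can be reproduced with a small simulation. Truncating to an integer after applying the detection efficiency is an assumption on my part; the slides round the half-values in the with-reviews case slightly differently, so the coding phase below comes out 24 rather than 25, but the final counts agree:

```python
def run_phases(phases, amplification=0.5):
    """Each phase is (new_errors, amplifies, detection_efficiency).
    Errors passed on = int((passed_in + amplified + new) * (1 - eff)),
    where amplified = int(passed_in * amplification) in design/coding."""
    passed, history = 0, []
    for new, amplifies, eff in phases:
        amplified = int(passed * amplification) if amplifies else 0
        passed = int((passed + amplified + new) * (1 - eff))
        history.append(passed)
    return history

design_and_coding = [(10, False), (25, True), (25, True)]
testing = [(0, False), (0, False)]

# Without reviews: 0% detection before testing, 50% in each test phase.
no_reviews = run_phases([(n, a, 0.0) for n, a in design_and_coding] +
                        [(n, a, 0.5) for n, a in testing])
print(no_reviews)  # [10, 40, 85, 42, 21]

# With reviews: 50% detection in every phase.
with_reviews = run_phases([(n, a, 0.5) for n, a in design_and_coding + testing])
print(with_reviews)  # [5, 16, 24, 12, 6]
```

The simulation confirms the slides' point: adding 50% effective reviews to every phase cuts the defects surviving integration testing from 21 to 6.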

Defect Introduction and Removal Model (Boehm and Jones): defects injected vs. defects removed.

V Model: Static Tests

Review

IEEE definition: a process or meeting during which a work product, or set of work products, is presented to project personnel, managers, users, customers, or other interested parties for comment or approval.

Pair Programming

- A variant of program inspection associated with agile development methods.
- Two programmers work side-by-side at a computer: one types, the other reviews and inspects the newly typed code.
- Features:
  - No checklist
  - Fine grain
  - Tied to practices facilitating teamwork and concentration

Reviews Result in Less Effort

The Relationship between Reviews and Inspections

- In a review, you personally review your own program. An inspection is a team review of a program.
- After personal reviews, inspections are the most valuable quality technique a software team can use.
- If your objective is to achieve 100% process yield, incorporate inspections into your personal process.

The Relationship between Reviews and Inspections (contd.)

- When using inspections, you must decide where in the process to put them. The principal questions are whether to review the code before the inspection and whether to compile and test the code before the inspection.
- The people who inspect your code are taking their precious time to help you improve the quality of your product. To show your appreciation, treat their time as important and carefully review your code before the inspection.
- When there are many simple problems in the code, the reviewers are less likely to see the important issues. Because you can easily find most of the simple problems yourself, thoroughly review your designs and code and make them as clean as you can before having them inspected.
- If the quality of the code entering an inspection is so important, why not also unit-test it first? This is a question of inspection psychology.

The Relationship between Reviews and Inspections (contd.)

- When the inspectors know that the code has been tested, they are not as likely to do a thorough analysis.
- Even though most programmers understand that an initial unit test will not find all of the defects in a complex program, the fact that the program is known to run can be demotivating.
- It is recommended that we not test our programs before the first code inspection. The inspection objective should be to have no defects found in testing.
- Your personal review goal, of course, is to have no defects found in compiling, inspections, or testing.

Types of Reviews

- Management Review
- Formal Technical Reviews

Management Review

Objectives:
- Inform management of project status
- Resolve higher-level issues (management decisions)
- Agree on risk mitigation strategies
- Identify corrective actions
Decisions:
- Change resource allocation
- Change project scope
- Change schedule

Types of Formal Technical Review

There are a number of types of formal technical review, ranging in formality and effect.

Buddy checking:
- Having a person other than the author informally review a piece of work.
- Generally does not require collection of data.
- Difficult to put under managerial control.
- Generally does not involve the use of checklists to guide inspection, and is therefore not repeatable.

Types of Formal Technical Review (contd.)

Walkthroughs:
- Walkthroughs are the most common form of review, and they are the least formal.
- There is generally no preparation ahead of the walkthrough, and usually little or no documentation is produced.
- Generally involve the author of an artifact presenting that document or program to an audience of their peers.

Types of Formal Technical Review (contd.)

- The audience asks questions and makes comments on the artifact being presented in an attempt to identify defects.
- Walkthroughs often break down into arguments about an issue.
- They usually involve no prior preparation on behalf of the audience.
- They usually involve minimal documentation of the process and of the issues found.

Walkthrough Review Form

Walkthrough Action List

Types of Formal Technical Review (contd.)

Audits:
- Audits are usually performed by some external group, rather than the development team.
- Audits may be conducted by an SQA group, a project group, an outside agency, or possibly a government standards agency.
- Audits are not primarily concerned with finding defects; the main concern is conformance to some expectations, either internal or external.

Types of Formal Technical Review (contd.)

- Audits may be required by contract, and an unsatisfactory audit usually results in expensive corrective actions.
- Audits are conducted for the purpose of uncovering nonconformances (NCs), if any, in a project. The output of an audit is a nonconformance report (NCR). An NCR lists all NCs uncovered during an audit.

The Audit Process

Nonconformance Report

Nonconformance Report (contd.)

Types of Formal Technical Review (contd.)

Technical inspection:
- An inspection is a visual examination of a software product to detect and identify software anomalies, including errors and deviations from standards and specifications.
- Pioneered by Michael Fagan while he was at IBM in the 1970s, technical inspections are the most effective form of software reviews.
- Formally structured and managed peer review processes.

Types of Formal Technical Review (contd.)

- Involve a review team with clearly defined roles.
- Specific data is collected during inspections.
- Inspections have quantitative goals set.
- Reviewers check an artifact against an unambiguous set of inspection criteria for that type of artifact.
- The required data collection promotes process improvement, and subsequent improvements in quality.

Comparison of Review Types

The main characteristics of the three review types are summarized in the following table.

What Are Inspections

- In an inspection, 2 or more engineers review another engineer's product to find its defects and problems.
  - Their objective is not to fix these problems but rather to find them so that the developer can fix them.
- The time to do an inspection is when an engineer has finished developing a product, has personally reviewed it, and has corrected all obvious problems.
  - At that point, the engineer usually needs help in finding the remaining defects.

What Are Inspections (contd.)

Inspections can be used on almost any kind of software-related product. Some examples are program source code, detailed or high-level designs, requirements documents, test cases, etc.

How Are Inspections Done?

The inspection process has four phases: the briefing phase, the review phase, the meeting phase, and the repair phase.
- In the briefing, the inspection team learns about the product and agrees on how to do the inspection.
- In the review phase, each reviewing engineer personally reviews the product to find defects.
- In the meeting phase, the entire inspection team goes through the product one section at a time. Their objective is to identify the defects and resolve any questions.

How Are Inspections Done? (contd.)

- In the repair phase, the developing engineer fixes the identified problems and verifies the corrections.

When to Do Inspections

- Start doing inspections as soon as your team starts producing products.
- The earlier that defects are injected and the longer they remain in the product, the harder they are to find and fix.
- Thus, the longer you delay inspecting a product, the larger the volume of incorrect work, and the longer it will take to find and fix the problems.
- So start doing inspections at the beginning of the project and keep doing them until all the product elements have been developed and inspected.

Using a Defined Inspection Process

- To be most effective, inspections should follow a defined process. Although inspections are not complicated, they involve a number of steps. A defined process tells you what steps to take and when to take them.
- It also defines the key inspection measures and provides forms for tracking, measuring, evaluating, and improving your inspections.

What Makes Inspections Effective?

There are four reasons that inspections are effective:
- They look at the entire program at one time.
- They use more combined knowledge.
- They take advantage of different viewpoints.
- They improve the odds of finding problems.

Review Objectives

- If your objective is to get into testing as quickly as possible, there is probably no way to convince you to do a review before compiling. This attitude, however, confuses speed with progress.
  - Although getting to testing would seem like progress, the testing time for defective code is highly unpredictable, and for large programs it can take many weeks or months.

Review Objectives (contd.)

- If you measure your overall performance both with reviews and without, you will find that by reviewing first, you may take a little longer to get into testing but the testing time will be much less.
- If your goal is to remove the maximum number of defects, you will want to do the code reviews when they are most effective. The compiler is equally effective before or after the review, so if you find any of the defects it would miss, you are ahead.

Review Objectives (contd.)

- If the compiler finds more than a very few defects, we probably have a quality problem and need to reexamine our review practices. This is also a good guide for when to update your code review checklist.

A Review Process

Planning

Objectives:
- Gather review package: work product, checklists, references, and data sheets
- Form inspection team
- Determine dates and locations for meetings
Procedure:
- Moderator assembles team and review package
- Moderator enhances checklist if needed

Planning (contd.)

- Moderator plans dates for meetings / books rooms
- Moderator checks work product for readiness
- Moderator helps Author prepare overview

An Example of Planning Data

Orientation

Objectives:
- Author provides overview
- Reviewers obtain review package
- Preparation goals established
- Reviewers commit to participate
Procedure:
- Moderator distributes review package
- Author presents overview (if necessary)
- Scribe duty for Inspection Meeting assigned
- Moderator reviews preparation procedure

An Example of Orientation Data

Contents of an Inspection Packet

- Work Product Requirements
- Frozen Work Product
- Standards and Checklists
- Review Issues Spreadsheet
- Review Reporting Forms
- Fault Severity Levels
- Review Report Outline
  - Below is a sample outline of a review report

1. Introduction
   a. Work product identification
   b. Review team members and roles
2. Preliminary issue list
   a. Potential fault
   b. Severity
3. Prioritized action item list
   a. Identified fault
   b. Severity
4. Summary of individual reports
5. Review statistics
   a. Total hours spent
   b. Faults sorted by severity
   c. Faults sorted by location
6. Review recommendation
7. Appendix with the full review packet

Preparation

Objectives:
- Find the maximum number of non-minor issues
Procedure:
- Allocate recommended time to preparation
- Perform individual review of work product
- Use checklists and references to focus attention
- Note critical, severe, and moderate issues on the Reviewer's Data Sheet (RDS)
- Note minor issues and author questions on the work product
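The review statistics in section 5 of the outline can be derived mechanically from a list of recorded issues. A sketch of one possible data structure; the class, names, and sample issues are hypothetical illustrations, not part of the outline:

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ReviewReport:
    work_product: str             # 1a: work product identification
    team: list                    # 1b: review team members
    hours_spent: float            # 5a: total hours spent
    issues: list = field(default_factory=list)  # (description, severity) pairs

    def faults_by_severity(self):
        """5b: faults sorted by severity."""
        return Counter(severity for _, severity in self.issues)

report = ReviewReport("parser module", ["moderator", "author", "reviewer"], 6.5)
report.issues += [("null dereference", "critical"), ("off-by-one", "severe"),
                  ("misleading comment", "minor"), ("bad loop bound", "severe")]
print(report.faults_by_severity())  # Counter({'severe': 2, 'critical': 1, 'minor': 1})
```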

An Example of Issue Classification

Critical:
- Defects that may cause the system to hang, crash, produce incorrect results or behavior, or corrupt user data; no known work-arounds.
Severe:
- Defects that cause incorrect results or behavior with known work-arounds; large and/or important areas of the system are affected.

An Example of Issue Classification (contd.)

Moderate:
- Defects that affect limited areas of functionality that can be either worked around or ignored.
Minor:
- Defects that can be overlooked with no loss of functionality.

Checklists

- Checklists are key.
- An inspection is as good as its checklist, whether explicit (written down) or implicit (in the inspector's mind).
- The work product is examined with reference to specific checklists:
  - Compliance to standards
  - Domain correctness
  - Consistency with requirements

Checklists (contd.)

- Language usage
- Use-declaration type consistency
- Invocation interface consistency:
  - Formal-actual correspondence
  - Return value
- Use of correct dimensions/units, e.g. °C / °F
- Comment-code consistency
- Avoidance of errors in historical list (frequency ranked)

An Example of Checklist

The Review Team

- Review is not full-time!
  - Productivity drops quickly during a session.
  - Reviewers are usually borrowed from other roles.
  - Tradeoff between cost, background, and perspective:
    - More expert = more valuable, but more costly.
    - More perspectives = bigger teams.
- Different levels of reviews:
  - Junior engineers can do simpler checks while self-training on standards and practices.

The Review Team (contd.)

Z Senior engineers can be paired with junior engineers when checking more complex properties
Z Larger groups when special expertise is required

Rules for Reviewers

I. Be prepared.
II. Evaluate product, not people.
III. Watch your language.
IV. One positive, one negative.
V. Raise issues, don't resolve them.
VI. Record all issues in public.
VII. Stick to technical issues.

Rules for Reviewers (contd.)

If you find something, assume it's a mistake. These are your peers, not your enemy. Remember people are involved.
Avoid "why did you ..." / "why didn't you ..."; say instead "I don't understand ..."
Z "Why did you set the upper bound to 10 here?"
Z "I don't understand why the upper bound is 10 here."
Z Seems trivial, but it's not.

Rules for Reviewers (contd.)

Do not get bogged down in style issues. For example, if efficiency isn't an issue, then don't make it one.
Z If it makes things less maintainable, that's an issue.
If there are standards, then either stick to the standard, or dispose of the standard.

Number of Reviewers

Ensure material is covered
Have enough
Don't have too many
3-7 is good
Z Count participants only
Z 3: ensures material is understood
Outsiders can be good. Must be unbiased

Efficiency

One way: reduce the number of inspectors
In the original method 4 inspectors were the norm, and the IEEE standard suggests 3-6
Z If we reduce to 3 from 4, we save 25% on traditional Inspections
Z Reduce to 2 and we save 50%
Z When it is possible to perform an Inspection with only one inspector, we could save 75% of the cost of Inspections
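The savings percentages above follow from assuming that inspection cost scales linearly with team size against the traditional 4-inspector baseline. A quick sketch (the function name is illustrative, not from any standard):

```python
def inspection_savings(inspectors, baseline=4):
    """Fractional cost saving versus the traditional 4-inspector team,
    assuming inspection cost is proportional to team size."""
    return (baseline - inspectors) / baseline

for n in (3, 2, 1):
    print(f"{n} inspector(s): {inspection_savings(n):.0%} saved")
# 3 -> 25%, 2 -> 50%, 1 -> 75%
```

The linear-cost assumption is the simplification the slide makes; in practice, effectiveness must also be sustained (see the >90% target below), so the saving is an upper bound.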

Efficiency (contd.)

Of course, all efficiency improvements should build upon and try to sustain the proven effectiveness
Z Maintain >90% effectiveness while reducing the number of inspectors

Time

At most 2 hours at a time
Come back again if necessary
Depends on
Z Complexity
Z Size of project
Z Closeness to completion

Everyone Must Prepare!

Review packet
Z Everything relevant to making a correct judgment: code, specs, test data, results, standards, designs, ...
80% of failures stem from unprepared teams

Roles and Responsibilities

Reader    Author    Reviewer
Moderator    Scribe    Management

Reviewer Roles and the Product Under Review

Roles and Responsibilities (contd.)

Main roles: Moderator, reader, scribe, author, reviewer
Moderator: overall responsibility
Reviewer: to identify defects
Reader: not there in some processes; reads line by line to keep focus
Scribe: records the issues raised

Overview and Self-Review

A brief meeting: deliver package, explain purpose of the review, intro, ...
All team members then individually review the work product
Z Lists the issues/problems they find in the self-preparation log
Z Checklists, guidelines are used
Ideally, should be done in one sitting and issues recorded in a log

Individual Reviewer Issues Spreadsheet

Individual reviewers identify issues and submit them to the review leader. The spreadsheet greatly facilitates the process that the review leader uses to merge the inputs from the full inspection team.

Review Report Spreadsheet

Once reviewers complete their examination of the work product, they submit an individual review report form to the review leader.

Infosys Review Capability Baseline

Tactics for Participation

Devil's Advocate
Z One person makes the strongest case against a product
Z Job is to find fault
Z Needs to be an actor
Debugging
Z Producers leave known bugs in
Z Quality of review depends on how many are found

Tactics for Participation (contd.)

Alarms
Z Time each person's contribution, cut off
Stand-up Reviews
Z Talk as long as you stand on one leg

Group Review Meeting

Purpose: define the final defect list
Entry criteria: each member has done a proper self-review (logs are reviewed)
Group review meeting
Z A reviewer goes over the product line by line
Z At any line, all issues are raised
Z Discussion follows to identify if it is a defect
Z Decision recorded (by the scribe)

Group Review Meeting (contd.)

At the end of the meeting
Z Scribe presents the list of defects/issues
Z If few defects, the work product is accepted; else it might be asked for another review
Z Group does not propose solutions, though some suggestions may be recorded
Z A summary of the inspection is prepared, useful for evaluating effectiveness

Group Review Meeting (contd.)

Moderator is in charge of the meeting and plays a central role
Z Ensures that focus is on defect detection and solutions are not discussed/proposed
Z Work product is reviewed, not the author of the work product
Z Amicable/orderly execution of the meeting
Z Uses summary report to analyze the overall effectiveness of the review

Group Review Meeting (contd.)

Review rate should be about 140 lines of text/hr (no more than 280 LOT/hr) for design documents.
Review rate should be about 125 SLOC/hr (no more than 250 SLOC/hr) for code.
No more than two meetings per day.
Size of team is four.
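Combined with the 2-hour session cap and the two-meetings-per-day limit, these rates let a moderator size a review up front. The sketch below is illustrative; the helper name and the 2,000-SLOC figure are made up for the example, while the 125 SLOC/hr rate and session limits come from the guidance above.

```python
import math

def review_sessions(sloc, rate_per_hour=125, session_hours=2, sessions_per_day=2):
    """Estimate meeting hours, sessions and calendar days needed to
    review `sloc` lines of code at the recommended review rate."""
    hours = sloc / rate_per_hour
    sessions = math.ceil(hours / session_hours)     # at most 2 hours at a time
    days = math.ceil(sessions / sessions_per_day)   # no more than two meetings/day
    return hours, sessions, days

hours, sessions, days = review_sessions(2000)
print(f"{hours:.0f} hours -> {sessions} sessions over {days} day(s)")
# 16 hours -> 8 sessions over 4 day(s)
```

Note the estimate covers meeting time only; individual preparation time is additional.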

Report

Goal: Does the product do the job it's supposed to do?
Z Complete.
Z Correct.
Z Dependable as basis for other work.
Z Measurable for purposes of tracking.
What was reviewed? Requirements; specification; design; implementation.
Signatures of reviewers.
Leader and recorder.

Rework

Procedure
Z Author obtains Scribe Data Sheet containing consolidated issues list as well as copies of work products
Z Author assesses each issue and notes action taken using the Author Data Sheet
Z Author determines the type of each defect
Z When finished, Author provides Author Data Sheet and reworked products to Moderator for Followup

Followup

Objectives
Z Assess the (reworked) work product quality
Z Assess the inspection process
Z Pass or fail the work product
Procedure
Z Obtain reworked product and Author Data Sheet
Z Review work product/data sheet for problems
Z Provide recommendation for work product
Z Perform sign-off with reviewers

An Example of Rework Data

Followup (contd.)

Z Compute summary statistics for inspection
Z Generate any process improvement proposals
Z Enter review data into quality database

An Example of Followup Data

Summary of A Review

Report of Cost of Defects

What is the annual cost of software defects in the US?
Z $59 billion
Z Estimated that $22 billion could be avoided by introducing a best-practice defect detection infrastructure

Source: NIST, "The Economic Impact of Inadequate Infrastructure for Software Testing", May 2002

Cost of Software Reviews

Up-front cost for
Z Training
Z Staff preparation
Z Review conduct
Code review may be 8-15% of total cost of project
Offset by savings: reduced rework later in project (25-35% of total project cost saved)

Cost of Software Reviews (contd.)

Z Raytheon, pre-1988: 43% of software cost in defect correction
Z 1994 (after software reviews instituted): 5% of project cost in defect correction



Relative Inspection Costs Across CMM Levels

Inspection Effectiveness Across CMM Levels
Z Assumes Level 1 effectiveness is 50%

Code Inspection Effectiveness at Fujitsu (Cusumano)

Fenton's Example: Software Review and Inspection (1997)

Year | Test % | Design review % | Code inspection % | Defects per 1000 lines of maintained code
1977 |   85   |       ---       |        15         | 0.19
1978 |   80   |        5        |        15         | 0.13
1979 |   70   |       10        |        20         | 0.06
1980 |   60   |       15        |        25         | 0.05
1981 |   40   |       30        |        30         | 0.04
1982 |   30   |       40        |        30         | 0.02

How Many Faults Can Be Found By Inspection?

A Two-engineer Estimating Example

Using the Log Form

The first section describes the artifact being inspected, and summarizes the individual reviewers' data (from the review form in the previous stage if used).

Using the Log Form (contd.)

This section of the form contains information about major defects found and indicates which reviewers found them. Some variations also log minor defects here.

Using the Log Form (contd.)

The next section of the form simply totals the data in the columns for the reviewers.
The final section of the form is used for metrics calculation.

Inspection Metrics

Total Defects Found = A + B - C, where A and B are the number found by reviewer A and B respectively, and C is the number found by both A and B.
Thus, Estimated Total Defects = (A x B) / C.
In this example, we have A = 8, B = 9, C = 6, so Estimated Total Defects = (8 x 9) / 6 = 12.
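The two-reviewer estimate above is the classic capture-recapture (Lincoln-Petersen) estimator. A minimal sketch using the slide's numbers (the function names are illustrative):

```python
def total_found(a, b, both):
    """Distinct defects found by either reviewer: A + B - C."""
    return a + b - both

def estimated_total(a, b, both):
    """Capture-recapture estimate of all defects, found or not: (A*B)/C."""
    if both == 0:
        raise ValueError("no overlap between reviewers: estimate is unbounded")
    return (a * b) / both

# Slide's example: reviewer A found 8, reviewer B found 9, 6 in common.
print(total_found(8, 9, 6))      # 11 distinct defects actually found
print(estimated_total(8, 9, 6))  # 12.0 estimated total defects in the artifact
```

The gap between the two numbers (12 - 11 = 1 defect) is the estimate of what the inspection missed, which feeds directly into the yield calculation on the next slide.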

Inspection Metrics (contd.)

The inspection yield is the term used to describe the Defect Removal Efficiency (DRE) of an inspection process. So the defect removal efficiency (yield) of an inspection is given by:
Yield = (Total Defects Found / Estimated Total Defects) x 100%
In the example above, Total Defects Found = 11 and Estimated Total Defects = 12. So,
Yield = (11/12) x 100% = 91.67%

Inspection Metrics (contd.)

Defect density is the ratio of the number of defects found (maybe estimated total) to the size of the artifact:
Defect Density = Total Defects Found / Size
In the example, we get: Defect Density = 11 defects / 15 pages = 0.73 defects/page
Before we can calculate any overall rates, we must first total the amount of time spent all up in the inspection.
Z This is the sum of each reviewer's review time plus the total person-time spent in each meeting
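Continuing the slide's example in code (a sketch; the helper names are made up for this example):

```python
def yield_pct(found, estimated_total):
    """Defect removal efficiency (yield) of the inspection, in percent."""
    return 100.0 * found / estimated_total

def defect_density(found, size):
    """Defects found per unit of artifact size (e.g., per page)."""
    return found / size

# Slide's example: 11 defects found, 12 estimated, 15-page artifact.
print(round(yield_pct(11, 12), 2))       # 91.67 (%)
print(round(defect_density(11, 15), 2))  # 0.73 defects/page
```

Keeping yield and density separate matters: yield says how thorough the inspection was, while density says how defect-prone the artifact is.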

Inspection Metrics (contd.)

For example, if reviewer A spends 60 minutes in review, reviewer B spends 75 minutes in review, and in total 4 people spend 30 minutes in meetings (120 minutes of meeting time is pretty low), then the total time cost of the inspection is 255 minutes, or 4.25 hours.
Z Inspection rate = size / total inspection hours
In our example, Inspection rate = 15 pages / 4.25 hours = 3.5 pages/hour.
Defect finding efficiency = total defects found / total inspection hours. In our example, Defect finding efficiency = 11 defects / 4.25 hours = 2.6 defects/hour.

Inspections Today

Still viewed as labor intensive and low tech
Effectiveness still not clearly appreciated by all practitioners
Resistance to efficiency improvements when effectiveness is high; e.g., plans to perform fewer Inspections
Still viewed as not applicable in most organizations
Still not fun, but still the best price performer for removing defects
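The rate calculations above can be sketched end to end: person-minutes from individual review and from meetings (meeting minutes multiplied by the number of attendees) are summed before computing either rate. The helper names are illustrative.

```python
def total_hours(review_minutes, meeting_minutes, attendees):
    """Total person-time of the inspection, in hours: individual review
    time plus person-time spent in meetings."""
    return (sum(review_minutes) + meeting_minutes * attendees) / 60.0

def inspection_rate(size, hours):
    """Artifact size covered per inspection hour (e.g., pages/hour)."""
    return size / hours

def finding_efficiency(defects, hours):
    """Defects found per inspection hour."""
    return defects / hours

# Slide's example: A reviews 60 min, B reviews 75 min, 4 people meet for 30 min.
hours = total_hours([60, 75], meeting_minutes=30, attendees=4)
print(hours)                                    # 4.25 hours total
print(round(inspection_rate(15, hours), 1))     # 3.5 pages/hour
print(round(finding_efficiency(11, hours), 1))  # 2.6 defects/hour
```

Tracking both rates over many inspections is what makes baselines like the Infosys review capability baseline possible.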
