University of Paderborn

Software Engineering Group

Software Quality Assurance: III Quality Control
Dr. Holger Giese
Software Engineering Group
Room E 3.165
Tel. 60-3321
Email: hg@upb.de

Outline
I Introduction
II Software Life Cycle
III Quality Control
IV Infrastructure
V Management
VI Standards
VII Conclusion & Summary


III Quality Control


III.1 Foundations
III.2 Static Analysis Techniques
III.3 Dynamic Analysis Techniques
III.4 Testing Process & Activities
III.5 Management & Automation
III.6 Paradigm & Domain Specific Aspects
III.7 Discussion & Summary
III.8 Bibliography

III.1 Foundations
"Quality Control is the series of inspections, reviews and tests used throughout the development cycle to ensure that each work product meets the requirements placed upon it." [Pressman2004]


Verification & Validation


- Verification: The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.
- Validation: The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
[IEEE Std 610.12-1990]

Verification & Validation


- Verification refers to the set of activities that ensure that software correctly implements a specific function.
- Validation refers to a different set of activities that ensure that the software that has been built is traceable to customer requirements.

Boehm [Boehm81]:
- Verification: "Are we building the product right?"
- Validation: "Are we building the right product?"

The definition of V&V encompasses many SQA activities, including:
- formal technical reviews, quality and configuration audits
- performance monitoring, different types of software testing
- feasibility study and simulation
(Boehm's two questions are contrasted in the code sketch below.)
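To make Boehm's distinction concrete, here is a minimal sketch in Python (the sort_records function and its spec are invented for illustration, not taken from the lecture): the product passes verification against its phase condition yet fails validation against what the customer actually wanted.

```python
# Hypothetical product: the design phase specified "sort order records by
# timestamp, ascending". The customer actually wanted most recent first.

def sort_records(records):
    return sorted(records, key=lambda r: r["timestamp"])

def check_verification():
    # Verification: do we satisfy the condition imposed by the design phase?
    out = sort_records([{"timestamp": 3}, {"timestamp": 1}])
    assert [r["timestamp"] for r in out] == [1, 3]   # built right

def check_validation():
    # Validation: do we satisfy the customer's requirement
    # ("show the most recent order first")? This assert fails.
    out = sort_records([{"timestamp": 3}, {"timestamp": 1}])
    assert out[0]["timestamp"] == 3                  # right product?

if __name__ == "__main__":
    check_verification()
    print("verification passed: we built the product right")
    try:
        check_validation()
    except AssertionError:
        print("validation failed: we built the wrong product")
```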


Terminology: Dynamic/Static Analysis


Dynamic analysis:
- The process of evaluating a system or component based on its behavior during execution.
  Contrast with: static analysis. See also: demonstration; testing.
  [IEEE Std 610.12-1990]

Static analysis:
- The process of evaluating a system or component based on its form, structure, content, or documentation.
  Contrast with: dynamic analysis. See also: inspection; walk-through.
  [IEEE Std 610.12-1990]


Terminology: Testing
Several definitions:

"Testing is the process of executing a program or system with the intent of finding errors." [Myers1979]
(The emphasis on the intent of finding errors is illustrated in the sketch below.)

"Testing is the process used to verify or validate a system or its components." [Storey1996]

Testing according to IEEE:
(1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component.
(2) (IEEE Std 829-1983) The process of analyzing a software item to detect the differences between existing and required conditions (that is, bugs) and to evaluate the features of the software items. See also: acceptance testing; benchmark; checkout; component testing; development testing; dynamic analysis; formal testing; functional testing; informal testing; integration testing; interface testing; loopback testing; mutation testing; operational testing; performance testing; qualification testing; regression testing; stress testing; structural testing; system testing; unit testing.
[IEEE Std 610.12-1990]
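A minimal sketch of Myers' point (days_in_month and its planted defect are invented for illustration): a test written merely to confirm expected behaviour finds nothing, while a test written with the intent of finding errors exposes the defect.

```python
# Myers' point in miniature: tests should be written to *find* errors.

def days_in_month(month: int, year: int) -> int:
    days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return days[month - 1]          # planted defect: leap years are ignored

# A confirming test exercises the happy path and finds nothing:
assert days_in_month(1, 2024) == 31

# A destructive test aims at a suspected weak spot and exposes the defect:
try:
    assert days_in_month(2, 2024) == 29
except AssertionError:
    print("error found: leap year handled incorrectly")
```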


Dynamic vs. Static Analysis


Dynamic analysis (testing):
- execution of system components;
- running the software

Static analysis:
- investigation without operation;
- pencil-and-paper reviews, etc.;
- modelling (mathematical representation)

(The sketch below applies both kinds of analysis to the same defect.)
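A minimal sketch (the mean function and both checks are invented for illustration): the static pass inspects the syntax tree without running the code, while the dynamic pass executes the code and observes the failure.

```python
# The same defect found two ways: statically and dynamically.
import ast

SOURCE = """
def mean(values):
    return sum(values) / len(values)   # fails when values is empty
"""

# Static analysis: evaluate the code's form and structure without running it.
# Here: flag any division whose denominator is a bare len(...) call.
tree = ast.parse(SOURCE)
for node in ast.walk(tree):
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div):
        right = node.right
        if isinstance(right, ast.Call) and getattr(right.func, "id", "") == "len":
            print(f"static finding: possible ZeroDivisionError at line {node.lineno}")

# Dynamic analysis: execute the code and observe its behaviour.
namespace = {}
exec(SOURCE, namespace)
try:
    namespace["mean"]([])
except ZeroDivisionError:
    print("dynamic finding: ZeroDivisionError on empty input")
```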


Quality Control Activities & Process


Life cycle phase     | Dynamic analysis | Static analysis | Modelling
Requirements         |                  |        X        |     X
Top-level design     |                  |        X        |     X
Detailed design      |                  |        X        |     X
Implementation       |        X         |        X        |
Integration testing  |        X         |        X        |     X
System validation    |        X         |                 |     X

[Storey1996]


III.2 Static Analysis Techniques


Overview
- Reviews and Inspections
  - Walkthroughs, inspections, personal reviews
  - Formal technical reviews
  - Summary
- Other Techniques
  - Control-flow analysis, data-flow analysis, metrics, ... (a small metric sketch follows below)
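As a taste of the metrics mentioned above, here is a minimal sketch of one classic static measurement: a simplified cyclomatic-complexity count over Python's abstract syntax tree. The node selection is a rough approximation of McCabe's metric, chosen here for illustration.

```python
# Simplified static metric: approximate cyclomatic complexity as
# 1 + the number of decision points found in the syntax tree.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(x):
        pass
    return "positive"
"""
print(cyclomatic_complexity(code))  # 1 + two ifs + one for = 4
```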


III.2.1 Reviews and Inspections

- A family of techniques:
  (1) Personal reviews
  (2) Inspections
  (3) Walkthroughs
  (4) Formal technical reviews
- Review / inspect:
  - to examine closely
  - with an eye toward correction or appraisal
- People (peers) are the examiners [SWENE]


Purpose/Objectives [SWENE]

- Verify that
  - software meets its requirements
  - software follows predefined standards
  - software is developed in a uniform manner
- Catch errors
  - sooner
  - more of them, and different ones
  - breaking the frame of reference
- Make projects more manageable
  - identify new risks likely to affect the project
- Improve communication
  - crossing organizational boundaries
- Provide education
- Make software visible


(1) Personal Review [SWENE]

- Features
  - Informal
  - Done by the producer
- Implications
  - Not objective
  - Available to any developer
  - Different mindset: "need for review" vs. "product completion"
- Limited screening efficiency!


(2) Inspections [SWENE]

- Features
  - Team reviews materials separately
  - Team and producers meet to discuss
  - May review selected product aspects only
- Implications
  - Focus on important issues (if you know what they are)
  - More material per meeting
  - Less preparation time


Process and Participants [Galin2004]
[Figure not reproduced: inspection process and participants]

(3) Walkthroughs [SWENE]

- Features
  - Less formal
  - Producer presents or provides information
- Implications
  - Larger groups can attend (education)
  - More material per meeting
  - Less preparation time
- Disadvantage: harder to separate
  - product and presenter
  - explanation and justification


Process and Participants [Galin2004]
[Figure not reproduced: walkthrough process and participants]

What to omit and what to include? [Galin2004]

Include:
- Sections of complicated logic
- Critical sections, where defects severely damage essential system capability
- Sections dealing with new environments
- Sections designed by new or inexperienced team members

Omit:
- "Straightforward" sections (no complications)
- Sections of a type already reviewed by the team in similar past projects
- Sections that, if faulty, are not expected to severely affect functionality
- Reused design and code
- Repeated parts of the design and code

(4) Formal Technical Review [SWENE]

- Features
  - Formal
    - scheduled event
    - defined procedure
    - reported result
  - Independent review team
    - producers not present
- Implications
  - More preparation time
  - Less material per meeting
  - Product must stand or fall on its own

See also: IEEE Standard for Software Reviews and Audits [IEEE 1028, 1988]


The Players [SWENE]

[Figure: the players in a formal technical review: Managers, Review Team, Producer]



Team Selection [SWENE]

- Manager assigns
  - vested interest in a good outcome
  - review as delegation of the manager's responsibility
- Technical competence
  - current technology
- Objectivity
  - best buddies and "outsiders"
- User involvement


Team Size [SWENE]

- Smaller for
  - focus
  - scheduling
  - reasonable output volume per person-hour
- Larger for
  - expertise
  - making the review public
    - non-participating observers
- Typical team size: 3 to 6

Team Characteristics
- Experienced senior technical staff
- Representatives of
  - the team that created the document
  - the client
  - the team for the next development phase
  - the software quality assurance group


Managers’ Participation [SWENE]

- "Review is a manager's job"
- Problems:
  - technical competence
  - review of product vs. review of person
- Active participation
  - as an "outsider" only!
  - as a team leader (and outsider)
    - providing general competence
- Post facto participation
  - review materials
  - review report


What and When to Review [SWENE]

- Any software artifact
  - requirements, designs, code, documentation, procedures, interfaces, ...
- Design for review
  - controlling product complexity
  - controlling review length (at most 2 hours)
- Scheduling reviews (e.g., at 10 AM)

Review Process [SWENE]

- Producers provide materials
- Leader schedules meeting
- Individuals prepare
- Team holds review meeting
- Manager gets report


Team Task Overview [SWENE]

- Provide a good review
  - The team is responsible for the review, not the product (don't shoot the messenger)
- Find issues: raise them, don't solve them
- Render an assessment decision
  - accept; accept with minor revision; revision needed; reject
  - unanimous approval required
    - product rejection by individual veto


The Review Team [SWENE]

[Figure: review team roles: Leader, Reviewers, Recorder]


Team Leader - Traits [SWENE]

- Technical competence
  - general strength
  - credibility
  - able to understand the issues
- Personal skills
  - willing to confront people
  - willing to report failure
  - able to step back from the heat of discussion
- Administrative skills

Team Leader - Characteristics [Galin2004]

- The review leader should be the SQA representative
  - has the most to lose
  - creator: eager to get approval (to start the next job)
  - client: can wait for acceptance testing
- Knowledge and experience in development of projects of the type reviewed; preliminary acquaintance with the current project is not necessary.
- Seniority at a level similar to, if not higher than, that of the project leader.
- A good relationship with the project leader and his team.
- A position external to the project team.


Team Leader - Tasks [SWENE]

- Avoid premature reviews
- Coordinate arrangements
  - materials distribution
  - meeting schedule
  - meeting location and facilities
- Ensure a good review
  - or report the reason for failure
    - materials missing
    - reviewers missing or not prepared

Team Leader - Run the Meeting [SWENE]

- Act as chairperson
  - opening and introductions
  - procedure guide
  - closing
- Act as facilitator
  - controlling the level of participation
    - enough but not too much
  - conflict resolution
- Terminate the meeting if unproductive

Reviewers - Tasks [SWENE]

- Prepare beforehand
  - thorough review of the materials
- Participate
  - Be there
    - no coming late or leaving early
  - Act professionally
    - no personal agendas, big egos, or shyness
  - Positive and negative comments
    - balance; courtesy; preserving what's good

Reviewer - Preparation
- Be prepared: evaluate the product before the review
- Be sure that you understand the context
- First, skim all the product material to understand the location and format of the information
- Next, read the product material and annotate a hardcopy
- Avoid issues of style
- Inform the review leader if you can't prepare


Reviewers – Participation
- Review the product, not the producer
- Keep your tone mild; ask questions instead of making accusations
- Stick to the review agenda
- Pose your written comments as questions
- Raise issues, don't resolve them!
- Limit discussions (take them offline!)
- Avoid discussions of style; stick to technical correctness

Recorder [SWENE]

- Selection
  - any competent reviewer
  - single or multiple recorders
  - rotating responsibility within a meeting
  - leaders as recorders?
    - having too much to do
    - separation of power
- Task: get it in writing
  - basis for the report


Recording Medium [SWENE]

- Issues
  - public vs. private notes
  - speed and accuracy
  - usefulness after the meeting
- Media
  - flip charts; posting prior pages
  - blackboards, overheads, PC and projector
  - video and audio recording


Managers - Tasks [SWENE]

- Stay out of reviews in your own area
- Support reviews
  - talk about them
  - provide resources
    - time, the right people, place, materials
  - change the reward system
- Abide by the review results

Process [Galin2004]

Review session agenda:
- A short presentation of the design document.
- Comments made by members of the review team.
- Verification and validation of the comments, discussed to determine the required action items (corrections, changes and additions).
- Decisions about the design product (document), which determine the project's progress.

Post-review activities:
- Preparation of the report.
- Follow-up of the performance of the corrections and examination of the corrected sections.


Review Report - Purpose [SWENE]

- Purpose
  - tell managers the outcome
  - early warning system for major problems
  - provide a historical record
    - for process improvement
    - for tracking people involved with projects


Review Report - Contents


- A summary of the review discussions.
- Decision about the product:
  - accept without further modification
  - reject the work due to severe errors (review must be repeated)
  - accept with minor modifications (that can be incorporated into the document by the producer)
- A full list of the required action items (corrections, changes and additions).
  - For each action item, the completion date and the responsible project team member are listed.
- The name(s) of the review team member(s) assigned to follow up.
- All participants have to sign off
  - shows participation, and hence responsibility
  - shows their concurrence with the findings


Review Guidelines (1/2) [Galin2004], after Pressman

Design review infrastructure:
- Develop checklists for common types of design documents.
- Train senior professionals to serve as a reservoir for review teams.
- Periodically analyze the effectiveness of past reviews.
- Schedule the reviews as part of the project plan.

The design review team:
- Review team size should be limited, with 3 to 5 members being the optimum.


Review Guidelines (2/2) [Galin2004], after Pressman

The design review session:
- Discuss professional issues in a constructive way, refraining from personalizing the issues.
- Keep to the review agenda.
- Focus on detecting defects by verifying and validating the participants' comments; refrain from discussing possible solutions.
- In cases of disagreement about an error, end the debate by noting the issue and shifting its discussion to another forum.
- Properly document the discussed comments and the results of their verification and validation.
- The duration of a review session should not exceed two hours.

Post-review activities:
- Prepare the review report, including the action items.
- Establish follow-up to ensure the satisfactory completion of all action items.


Comparison [Galin2004]

Property                         | Design review         | Inspection             | Walkthrough
Overview meeting                 | No                    | Yes                    | No
Participants' preparations       | Yes (thorough)        | Yes (thorough)         | Yes (brief)
Review session                   | Yes                   | Yes                    | Yes
Follow-up of corrections         | Yes                   | Yes                    | No
Formal training of participants  | No                    | Yes                    | No
Participants' use of checklists  | No                    | Yes                    | No
Error-related data collection    | Not formally required | Formally required      | Not formally required
Review documentation             | Formal design review  | (1) Inspection session |
                                 | report                | findings report;       |
                                 |                       | (2) inspection session |
                                 |                       | summary report         |

TYPES OF REVIEWS (IEEE Std 1028-1988)

Objective
- Management review: ensure progress; recommend corrective action; ensure proper allocation of resources.
- Technical review: evaluate conformance to specifications and plans; ensure change integrity.
- Inspection: detect and identify defects; verify resolution.
- Walkthrough: detect defects; examine alternatives; forum for learning.

Decision making
- Management review: the management team charts the course of action; decisions are made at the meeting or as a result of recommendations.
- Technical review: the review team petitions management or technical leadership to act on recommendations.
- Inspection: the team chooses from predefined product dispositions; defects must be removed.
- Walkthrough: all decisions are made by the producer; change is the prerogative of the producer.

Change verification
- Management review: left to other project controls.
- Technical review: the leader verifies as part of the review report.
- Inspection: the moderator verifies rework.
- Walkthrough: left to other project controls.

Recommended size
- Management review: 2 or more people. Technical review: 3 or more people. Inspection: 3 to 6 people. Walkthrough: 2 to 7 people.

Attendance
- Management review: management, technical leadership, and peer mix.
- Technical review: technical leadership and peer mix.
- Inspection: a college of peers meets, with documented attendance.
- Walkthrough: technical leadership and peer mix.

Leadership
- Management review: usually the responsible manager. Technical review: usually the lead engineer. Inspection: a trained moderator. Walkthrough: usually the producer.

Material volume
- Management and technical reviews: moderate to high, depending on the specific "statement of objectives" for the meeting.
- Inspection and walkthrough: relatively low.

Presenter
- Management review: a project representative. Technical review: a software element representative. Inspection: a "reader" other than the producer. Walkthrough: usually the producer.

Data collection
- Management review: as required by applicable policies, standards, or plans.
- Technical review: not a formal project requirement; may be done locally.
- Inspection: formally required.
- Walkthrough: not a formal project requirement; may be done locally.

Reports
- Management review: management review report. Technical review: technical review report. Inspection: defect list, defect summary, inspection report. Walkthrough: walkthrough report.

Database entries
- Management review: any schedule changes must be entered into the project tracking database.
- Technical review: no formal database required.
- Inspection: defect counts, characteristics, severity, and meeting attributes are kept.
- Walkthrough: no formal database required.


Formality of Technical Reviews


[Figure: review formality spectrum, from individual initiative to team-oriented: Personal Review ("improved design") → Walkthroughs ("detection of flaws") → Inspections → Formal Review ("proof of correctness")]


When to Involve External Experts? [Galin2004]

- Insufficient in-house professional capabilities in a specialized area.
- Temporary lack of in-house professionals for the review team.
- Indecisiveness caused by major disagreements among the organization's senior professionals.
- In small organizations, where the number of suitable candidates for a review team is insufficient.

Testing Efficiency [SWENE]

- Catching most errors before test
  - Review plus test is much cheaper than test alone
  - Sample results:
    - 10x reduction in errors reaching test
    - 50-80% total cost reduction
  - Fewer defects after release
    - substantial cost savings in maintenance

Composite data from H-P (R. Grady):
- Testing efficiency (defects found per hour)
  - System use:         0.21
  - Black box:          0.282
  - White box:          0.322
  - Reading/inspection: 1.057
(A back-of-the-envelope comparison using these rates follows below.)
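A back-of-the-envelope comparison using the defects-found-per-hour rates above; the project size (100 defects) and the assumed 70% inspection yield are invented for illustration, not taken from the H-P data.

```python
# Only the defects-per-hour rates come from the slide's H-P data;
# the remaining inputs are hypothetical.
defects_to_find = 100
rate_inspection = 1.057   # defects found per hour (reading/inspection)
rate_black_box = 0.282    # defects found per hour (black-box testing)

# Strategy 1: find everything in test.
hours_test_only = defects_to_find / rate_black_box

# Strategy 2: inspect first (assume inspections catch 70%), test the rest.
found_by_inspection = 0.7 * defects_to_find
hours_combined = (found_by_inspection / rate_inspection
                  + (defects_to_find - found_by_inspection) / rate_black_box)

print(f"test only:     {hours_test_only:6.1f} h")   # ~354.6 h
print(f"review + test: {hours_combined:6.1f} h")    # ~172.6 h
```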


Effectiveness of Inspections
- [Fagan 1976]: inspections of design & code
  - 67-82% of all faults were found by inspections
  - 25% of programmer resources saved (despite the effort spent on inspections)
- [Fagan 1986]
  - 93% of all faults were found by inspections
- Cost reduction for fault detection (compared with testing)
  - [Ackerman+1989]: 85%
  - [Fowler1986]: 90%
  - [Bush1990]: US$25,000 saved per inspection


Code Inspection Effectiveness


Share of defects found by each detection method, and the resulting defect density:

Year | Test (%) | Design review (%) | Code inspection (%) | Defects per 1000 lines of maintained code
1977 |    85    |       ---         |         15          |   0.19
1978 |    80    |        5          |         15          |   0.13
1979 |    70    |       10          |         20          |   0.06
1980 |    60    |       15          |         25          |   0.05
1981 |    40    |       30          |         30          |   0.04
1982 |    30    |       40          |         30          |   0.02

[Galin2004] from [Cusumano1991]


Design Reviews vs. Code Reviews


Case study: a 700 KLOC real-time project, over 400 developers, modern language, three-year duration. [Collofello&Woodfield1989]

Average time for correction (hours):
- design:   7.5
- code:     6.3
- testing: 11.6

Cost effectiveness = cost saved / cost consumed:
- Design reviews: 8.44
- Code reviews:   1.38
- Testing:        0.17

(A sketch of the formula with invented inputs follows below.)
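A sketch of the slide's ratio; since the study's raw defect counts and review effort are not given on the slide, the inputs below are invented for illustration.

```python
# Cost effectiveness = cost saved / cost consumed (per the slide).
def cost_effectiveness(defects_found, hours_saved_per_defect, hours_consumed):
    # cost saved: later-phase correction effort avoided by finding the
    # defects now; cost consumed: effort spent on the activity itself.
    return (defects_found * hours_saved_per_defect) / hours_consumed

# Hypothetical design review: 20 defects found, each of which would have
# cost 11.6 h to fix in testing instead of 7.5 h now, for 10 h of effort.
saved_per_defect = 11.6 - 7.5
print(f"{cost_effectiveness(20, saved_per_defect, 10):.2f}")  # 8.20
```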


Summary [SWENE]

- Highly effective technique
- Low technology
- Not used nearly enough


Further Readings
[Ackerman+1989] A. F. Ackerman, L. S. Buchwald, F. H. Lewski, "Software inspections: an effective verification process", IEEE Software 6 (May 1989), pp. 31-36.
[Bush1990] M. Bush, "Improving software quality: the use of formal inspections at the Jet Propulsion Laboratory", Proceedings of the 12th International Conference on Software Engineering, Nice, France, March 1990, pp. 196-199.
[Collofello&Woodfield1989] J. S. Collofello and S. N. Woodfield, "Evaluating the Effectiveness of Reliability-Assurance Techniques", Journal of Systems and Software, March 1989.
[Cusumano1991] M. A. Cusumano, Japan's Software Factories: A Challenge to U.S. Management, Oxford University Press, New York, 1991.
[Dobbins1998] J. H. Dobbins, "Inspections as an up-front quality technique", in G. G. Schulmeyer and J. I. McManus (eds), Handbook of Software Quality Assurance, Prentice Hall, Harlow, Essex, UK, 1998.
[Doolan1992] E. P. Doolan, "Experience with Fagan's Inspection Method", Software - Practice and Experience, 22(2), February 1992, pp. 173-182.
[Fagan 1976] M. E. Fagan, "Design and code inspections to reduce errors in program development", IBM Systems Journal 15 (No. 3, 1976), pp. 182-211.
[Fagan 1986] M. E. Fagan, "Advances in software inspections", IEEE Transactions on Software Engineering, SE-12 (July 1986), pp. 744-751.
[Fowler1986] P. J. Fowler, "In-process inspections of workproducts at AT&T", AT&T Technical Journal 65 (March/April 1986), pp. 102-112.
[Kelly+1992] J. C. Kelly, J. S. Sherif, and J. Hops, "An Analysis of Defect Densities Found During Software Inspections", Journal of Systems and Software, 1992.
[Parnas&Weiss1985] D. L. Parnas and D. M. Weiss, "Active Design Reviews: Principles and Practices", Proceedings of the 8th International Conference on Software Engineering, August 1985, pp. 215-222.
[Porter+1994] A. A. Porter, L. G. Votta, Jr., and V. R. Basili, Comparing Detection Methods for Software Requirements Inspections: A Replicated Experiment, University of Maryland Technical Report CS-TR-3327, 1994.
[Weller1993] E. F. Weller, "Lessons From Three Years of Inspection Data", IEEE Software, September 1993.
