Test Reporting Process: Skill Category 6
Contents
Prerequisites to Test Reporting pg 321
Test Tools used to Build Test Reports pg 331
Test Tools used to Enhance Test Reporting pg 351
Reporting Test Results pg 356
Prerequisites to Test Reporting
Test reporting is where the value of testing shines through
pg 321-322
Prerequisites…
Define and Collect Test Status Data
Types of data needed for collection:
Test Results Data
Test factors
The factors incorporated into the plan
Business objectives
The validation that specific business objectives have been met.
Interface objectives
Validation that data/objects can be correctly passed among
software components
Functions and sub-functions
Pieces normally associated with requirements
Units
The smallest identifiable software components
Platform
The system environment for the application
And others not mentioned
pg 322
Prerequisites…
Define and Collect Test Status Data
Other Types of data needed for collection:
Test Case Results and Test Verification Results
Test techniques used to perform testing
Test Cases – types of tests to be used
Inspections – verification of process deliverables against
deliverable specifications
Reviews – verification process that deliverables/phases are
meeting the user’s true needs
Defects
Date the defect was uncovered, name of the defect, location
of the defect, severity of the defect, type of defect, and
how the defect was uncovered.
Efficiency
For both the software system and the test process
pg 322-323
Prerequisites…
Define Test Metrics used in Reporting
1. Establish a test metrics team
Members:
Should have a working knowledge of quality and
productivity measures
Are knowledgeable in the implementation of statistical
process control tools
Have a working understanding of benchmarking
techniques
Know the organization's goals and objectives
Are respected by their peers and management
Team size should be relative to the size of the organization (at least two members)
Should have representatives from management,
development, and maintenance projects
pg 323-324
Prerequisites…
Define Test Metrics used in Reporting
2. Inventory existing IT measures
All identified measures must be checked to determine if
they are valid and reliable.
Should kick off with a meeting
Introduce all members
Review scope and objectives of the inventory process
Summarize the inventory processes to be used
Establish communication channels to use
Confirm the inventory schedule with major target dates
Should contain the following activities:
Review all measures currently being captured and
recorded
Document all findings
Conduct interviews
pg 324
Prerequisites…
Define Test Metrics used in Reporting
3. Develop a consistent set of metrics
Should be consistent across all product lines
4. Define desired test metrics
Use the previous two tasks to define the metrics
for the test reporting process
Description of desired output reports
Description of common measures
Source of common measures and associated
software tools for capture
Definition of data repositories
pg 325
Prerequisites…
Define Test Metrics used in Reporting
Develop and implement the process for collecting
measurement data
Document the workflow of the data capture and reporting
process
Procure software tools to capture, analyze, and report
the data
Develop and test system and user documentation
Beta-test the process using a small to medium-size
project
Resolve all management and project problems
Conduct training sessions for the management and
project personnel on how to use the process and
interrelate the reports
Roll out the test status process
Monitor the process
pg 325
Prerequisites…
Define Effective Test Metrics
Metric
A quantitative measure of the degree to which a system,
component, or process possesses a given attribute
Process Metric
A metric used to measure characteristics of the methods,
techniques, and tools employed in developing, implementing, and
maintaining the software system
Product Metric
A metric used to measure the characteristics of the documentation
and code
pg 326
Prerequisites…
Define Effective Test Metrics
Test status reports usually include:
Total number of tests
Number of tests executed to date
Number of tests passing
pg 326
Prerequisites…
Define Effective Test Metrics
Objective vs. Subjective Measures
An objective measure can be obtained by counting
A subjective measure has to be calculated based
upon a person’s perception
pg 327
Prerequisites…
Define Effective Test Metrics
How do you know a Metric is good?
Reliability
If two people take the measure, will they get the same
results?
Validity
Does the metric really measure what we want to measure?
Ease of Use and Simplicity
Is it easy to capture?
Timeliness
Is the data available while the data is still relevant?
Calibration
Can the metric be changed easily in order to better meet
the other requirements?
pg 327
Prerequisites…
Define Effective Test Metrics
Standard Units of Measure
A measure is a single attribute of an entity
Must be defined before taking a measurement
Weighing factors must also be defined
Measurement program should have between 5 and 50
standard units of measure
Productivity vs. Quality
Quality is an attribute of a product or service
Productivity is an attribute of a process
Quality can drive productivity
Lowering or not meeting quality standards will likely
increase some productivity measures
Improving processes so that defects do not occur will
also likely increase some productivity measures
QAI likes this method better
pg 328
Prerequisites…
Define Effective Test Metrics
Test Metric Categories
It is useful to categorize metrics for use in status
reporting
pg 329
Prerequisites…
Define Effective Test Metrics
Commonly used Categories
Size Measurements
KLOC
Function Points
Pages or words of documentation
Defect Metrics
Defects related to size of software (# defects/KLOC)
Severity of defects
Priority of defects
Age of defects
Defects uncovered in testing
Cost to locate a defect
Product Measures
Defect density
pg 330
Prerequisites…
Define Effective Test Metrics
Commonly used Categories (cont)
Satisfaction Metrics
Ease of use
Customer complaints
Customer subjective assessment
Acceptance criteria met
User participation in software development
Productivity Metrics
Cost of testing in relation to overall project costs
Under budget/Ahead of schedule
Software defects uncovered after the software is placed
into operational status
Amount of testing using automated tools.
pg 330 - 331
Test Tools used to Build Test Reports
Testers use many tools to analyze the results
of tests and create the information for test
reports
pg 331
Test Tools …
Pareto Charts
A type of bar chart to view causes of a problem in
order of severity, largest to smallest
pg 331-334
Test Tools …
Pareto Charts
Deployment
1. Define the problem clearly
2. Collect data
3. Sort or tally data in descending order
4. Construct chart
5. Draw bars to correspond to sorted data in
descending order
6. Determine vital few causes
7. Compare and select major causes
pg 331-334
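The tally-and-sort steps above can be sketched in Python; the defect-cause data here is hypothetical:

```python
from collections import Counter

# Hypothetical log: each entry names the cause of one recorded defect.
defects = ["requirements", "coding", "requirements", "design",
           "coding", "requirements", "environment", "requirements",
           "coding", "requirements"]

# Steps 2-3: collect the data and tally it in descending order.
tally = Counter(defects).most_common()
total = sum(count for _, count in tally)

# Step 6: cumulative percentages expose the "vital few" causes.
cumulative = 0
for cause, count in tally:
    cumulative += count
    print(f"{cause:<12} {count:>3} {100 * cumulative / total:5.1f}%")
```

Here "requirements" alone accounts for half the defects, so it would be the first cause to work on.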
Test Tools …
Pareto Charts
Results
A necessary first step in continuous process
improvement
Graphically demonstrates the 20-80 Rule
Provides the ability to identify which problem
or cause to work on first by its severity or
impact
Recommendations
It is easy to understand, but it requires
discipline by management teams, facilitators,
and teams involved
pg 331-334
Test Tools …
Pareto Voting
For problems that can’t be quantified, use
Pareto Voting.
Based on the Pareto Principle
20% of the potential causes brainstormed will
usually be chosen by 80% of the group
pg 334
Test Tools …
Pareto Voting
Deployment
1. Complete brainstorming for potential causes of a
problem
2. Determine the total number of brainstormed ideas and
multiply by 20%
3. Determine the number of votes each team member
gets from the results of step 2
4. Each team member then votes to select the cause(s)
having the largest impact on the stated problem
5. Tally votes (please don’t accept bribes)
6. Determine the plan of action to resolve these causes
pg 334
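Steps 2 and 3 above reduce to a one-line calculation; a sketch (rounding up, which is one reasonable convention):

```python
import math

def votes_per_member(num_ideas: int) -> int:
    # Step 2: multiply the brainstormed idea count by 20%;
    # step 3: each member gets that many votes (rounded up here).
    return math.ceil(num_ideas * 0.20)

# e.g. 23 brainstormed causes -> 5 votes per team member
print(votes_per_member(23))
```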
Test Tools …
Cause and Effect Diagrams
Useful for visualizing, linking, identifying,
classifying, and clarifying possible causes of a
problem
Tends to keep teams focused on the problem
Uses the “five-year-old” approach
Always ask “why?”
Don’t accept “because” as an answer
pg 335-337
Test Tools …
Cause and Effect Diagrams
Deployment
1. Identify a problem (effect) with a list of potential
causes
2. Construct a fishbone diagram with basic material –
flipchart, paper, whiteboard, etc.
3. Write the effect at the right side
4. Identify major causes of the problems
5. Use Results of brainstorm or affinity diagram to fill in
small branches
6. Complete process until lowest-level sub-cause is
identified
pg 335-337
Test Tools …
Cause and Effect Diagrams
Results
Provides a visual relationship between cause and effect
Breaks down problem into manageable group of root
causes that contribute most to a problem
Separates symptoms of a problem from the real causes
Provides interaction within a team to analyze problems
Recommendation
Use to analyze problem related to workplace or
processes owned by a team
pg 335-337
Test Tools …
Check Sheets
A check sheet is a technique or tool to record
the number of occurrences of an event over a
specified interval of time
A data sample to determine the frequency of
an event.
pg 337-340
Test Tools …
Check Sheets
Deployment
1. Clarify what must be collected
2. Establish the format for the data collection
3. Understand the objectives to ensure accuracy of the
collection process
4. Establish the sample size and time frame of data
collection
5. Instruct or train data collectors for consistency
6. Observe, record, and collect data
7. Tally results, using Pareto chart or histograms
8. Evaluate results
pg 337-340
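Steps 6-7 amount to recording and tallying occurrences; a minimal sketch with made-up observations:

```python
from collections import defaultdict

# Hypothetical check sheet: one tick per observed event,
# keyed by defect category and day of the week.
observations = [("UI defect", "Mon"), ("Data defect", "Mon"),
                ("UI defect", "Tue"), ("UI defect", "Wed"),
                ("Interface defect", "Tue"), ("UI defect", "Mon")]

sheet = defaultdict(lambda: defaultdict(int))
for category, day in observations:
    sheet[category][day] += 1          # step 6: record each occurrence

# Step 7: tally per category, ready for a Pareto chart or histogram.
totals = {cat: sum(days.values()) for cat, days in sheet.items()}
```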
Test Tools …
Check Sheets
Results
Provide factual data to evaluate problems
Detects patterns in the process where problems are
suspected
Recommendations
Use as a standard for problem solving whenever data is
available or can be collected to validate early what is
happening to a process or underlying a problem
pg 337-340
Test Tools …
Check Sheets
Advantages
Defines areas to discuss
Limits scope
Consistency
Organized approach
Disadvantages
Over reliance
Applicability
Limiting
pg 337-340
Test Tools …
Check Sheets
Suggestions for preparing checklists
Avoid bias
Mix questions by topic
Test questions prior to use
Allow for “I don’t know”
pg 337-340
Test Tools …
Histograms
Can display:
Central point – average
Variation – standard deviation, range
Shape of distribution – normal, skewed, clustered
pg 340-342
Test Tools …
Histograms
Deployment
1. Gather data and organize from lowest to highest values
2. Calculate the range (r): largest minus smallest
3. Determine number of cells (k)
4. Calculate the interval or width (m) of the cells: m = r/k
5. Sort the data or observations into their respective cells
6. Count the data points of each cell to determine the
height of the interval
7. Create a frequency table
8. Plot the results
9. Distribution patterns from histograms: normal, double
peak, isolated island, cliff, cogwheel, and skewed
pg 340-342
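Steps 1-6 above can be sketched directly; the test-duration data and the choice of k = 4 cells are illustrative:

```python
# Hypothetical data, e.g. test execution times in minutes.
data = [12, 15, 11, 19, 22, 14, 17, 21, 13, 18, 16, 20]

data.sort()                  # step 1: organize lowest to highest
r = data[-1] - data[0]       # step 2: range = largest minus smallest
k = 4                        # step 3: chosen number of cells
m = r / k                    # step 4: cell interval (width), m = r/k

# Steps 5-6: sort observations into cells and count each cell.
frequency = [0] * k
for x in data:
    cell = min(int((x - data[0]) / m), k - 1)  # clamp the max value into the last cell
    frequency[cell] += 1
```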
Test Tools …
Histograms
Results
Helps explain graphically if a process is out of
control
Provides a basis for what to work on first
Provides insight on the process capability to
meet end user specifications
Establishes a technique to measure a process
Analyze for improvement opportunities
Recommendations
Everyone should use the histogram technique
pg 340-342
Test Tools …
Run Charts
A graph of data in chronological order
displaying shifts or trends in the central
tendency
They track changes or trends in a process
as well as help to understand the
dynamics of a process.
pg 342-343
Test Tools …
Run Charts
Deployment
1. Decide which outputs of a process you need
to measure
2. Label your chart both vertically (quantity) and
horizontally (time)
3. Plot the individual measurements over time
4. Connect data points for easy use and
interpretation
5. Track data chronologically in time
6. Look for trends
pg 342-343
Test Tools …
Run Charts
Look for the following:
Unusual events
Points out of the range should be looked at
closely to determine if a change should be made
permanent or deleted
Trend – 6 or more increasing/decreasing points show a
trend. Investigate!
Two processes – alternating data points above and below the
average show that they are caused by two groups, shifts, or
people
All special causes need to be investigated
pg 342-343
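The trend rule above (six or more steadily rising or falling points) is easy to automate; a sketch over hypothetical weekly defect counts:

```python
def has_trend(points, run_length=6):
    # Count consecutive increases/decreases; a run of `run_length`
    # or more points signals a trend worth investigating.
    rising = falling = 1
    for prev, curr in zip(points, points[1:]):
        rising = rising + 1 if curr > prev else 1
        falling = falling + 1 if curr < prev else 1
        if rising >= run_length or falling >= run_length:
            return True
    return False

# Hypothetical weekly defect counts: the last six points climb steadily.
weekly_defects = [4, 3, 5, 4, 6, 7, 9, 10, 12, 15]
print(has_trend(weekly_defects))  # -> True
```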
Test Tools …
Run Charts
Results
Monitor outputs of a process to better
understand what is happening
Provides a means to detect the shifts or trends
in a process
Provides input for establishing control charts
after a process has matured or stabilized in
time
Recommendations
Use to quantify and determine what is
happening in or with a process
pg 342-343
Test Tools …
Scatter Plot Diagrams
Shows the relationship that may exist between
two variables or factors
Can test for possible cause-and-effect relationships
AKA
Correlation diagrams (positive, negative, or zero)
pg 343-349
Test Tools …
Scatter Plot Diagrams
Deployment
1. Select the variable and response relationship to be
examined by the team
2. Gather data on variable and response; determine
sample size or paired data
3. Plot the results; determine appropriate scale to plot
the relationship
4. Circle repeated data points as many times as they
occur
Possible correlations
Positive, negative, random, linear, curvilinear, or
cluster
pg 343-344
Test Tools …
Scatter Plot Diagrams
Exploratory Analysis
Exploratory graphics are used to identify
conditions for additional investigations
Does not need a large sample size, but also
cannot be used for predictions about the
population
Cluster Analysis
Regression Analysis
Multivariate Analysis
pg 344-345
Test Tools …
Cluster Analysis
Used to identify groups or clusters of items, which
are in close proximity of one another on the
scatter plot
Once a cluster is identified, the analyst can use it
to determine if there are any reasons why the items
would be clustered
Useful in:
Identifying common characteristics of subcategories of
events
Multivariable relationships
Identification of items that should be of concern to
management, but they may not have realized the
quantity of items in the cluster
Items to sample because they fall outside expected
ranges (those outside the cluster)
pg 345-347
Test Tools …
Cluster Analysis
Results
Gives analysis between two measurement
variables in a process
Provides a test of two variables being changed
to improve a process or solve a problem
Helps to uncover real causes, not symptoms, of
a problem or a process
Recommendations
None
pg 345-347
Test Tools …
Regression Analysis
Will provide two pieces of information
A graphic showing the relationship between
two variables
It will show the correlation, or how closely
related the two variables are
Is used in analysis to show cause-and-effect
relationships between two variables.
Is used in conjunction with the analysis to
prove the point to management
pg 347-348
Test Tools …
Regression Analysis
Deployment
Usually requires a lot of math
The analyst may want to pick 2 or 3 variables
in order to show regression.
The analyst needs to calculate the standard
error
Recommendations
None
pg 347-348
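The “lot of math” is mostly least-squares sums; a self-contained sketch relating two hypothetical variables (module size versus defects found):

```python
import math

def regression(xs, ys):
    # Least-squares fit of y = a + b*x, plus the correlation coefficient r.
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    syy = sum((y - mean_y) ** 2 for y in ys)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx                    # slope
    a = mean_y - b * mean_x          # intercept
    r = sxy / math.sqrt(sxx * syy)   # correlation (-1 to +1)
    return a, b, r

size = [1, 2, 3, 4, 5]       # hypothetical module sizes in KLOC
found = [2, 4, 6, 8, 10]     # hypothetical defects found per module
a, b, r = regression(size, found)
```

A correlation near +1 or -1 says the two variables move together closely; near 0 says there is little linear relationship.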
Test Tools …
Multivariate Analysis
Begins with single variables being plotted
Correlation may be present
pg 348-349
Test Tools …
Multivariate Analysis
Deployment
Normally requires advanced statistical
packages to perform the underlying analysis
Regression statistics most often are used to produce
the information needed for creating the multivariable
graph
pg 348-349
Test Tools …
Multivariate Analysis
Can be used
Only after individual variables have been
carefully examined
When scatter plots of the individuals have been
made and the information smoothed in order
to examine or plot the individual variables
pg 348-349
Test Tools …
Control Charts
A statistical technique to assess, monitor,
and maintain the stability of a process.
The intent is to monitor the variation of a
statistically stable process where activities
are repetitive.
Types of variation being observed:
Common (random)
Special (unique events)
Meant to be used on a continuous basis to
monitor processes
pg 349-351
Test Tools …
Control Charts
Typically used when the process seems to be out
of control
Deployment
1. Identify characteristics of process to monitor
2. Select the appropriate type of control chart based on
characteristics to monitor
3. Determine methods for sampling – use check sheets
4. Collect the sample data
5. Analyze and calculate sample statistics
6. Construct control chart based on statistics
7. Monitor process for common and special causes
8. Evaluate and analyze any observations outside the limits
for causes related to the situation
9. Investigate unusual patterns when observations have
multiple runs above or below the average
pg 349-351
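Steps 4-8 can be sketched with 3-sigma limits computed from a baseline sample; all counts here are hypothetical:

```python
import statistics

# Step 4: sample data from a period when the process looked stable.
baseline = [7, 9, 8, 10, 8, 7, 9, 8, 9, 10]

# Steps 5-6: center line and 3-sigma control limits.
mean = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
ucl = mean + 3 * sigma   # upper control limit
lcl = mean - 3 * sigma   # lower control limit

# Steps 7-8: monitor new observations; anything outside the limits
# is a potential special cause to investigate.
new_obs = [8, 9, 15, 7]
special = [x for x in new_obs if x > ucl or x < lcl]
```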
Test Tools …
Control Charts
Results
Objectively defines a process and variation
Establishes measures on a process
Improves process analysis and opportunities
Process improvements are based on facts and
managed by facts
pg 349-351
Test Tools used to Enhance Test Reporting
The more useful the information in test
reports, the better the view of testing
pg 351
…Enhance Reporting
Benchmarking
Is the continuous process of measuring our
products, services, and practices against our
toughest competitors, or those companies
recognized as world leaders.
The purpose is to achieve process improvement,
measurement, motivation, and a management
process for improvement.
pg 351-355
…Enhance Reporting
Benchmarking
Gain a better awareness of yourself:
What you are doing and how
How well you are doing it
Gain a better awareness of “the best”:
What they are doing and how
How well they are doing it
Identify the performance gap
Understand how to change business processes to
improve quality, cost, and delivery
Types:
Performance
Process
Product
pg 351-355
…Enhance Reporting
Benchmarking
A 10 Step process to Collect Benchmark Data
1. Identify benchmarking subject and teams
2. Identify and select benchmarking partners
3. Collect Data
4. Determine current competitive gap
5. Project the future performance levels
6. Communicate findings and gain acceptance
7. Establish Functional Improvement goals
8. Develop an Action Plan
9. Implement plans and monitor progress
10. Recalibrate and reset benchmark performance levels
pg 351-355
…Enhance Reporting
Benchmarking
Deployment
Competitive Benchmarks
Compare your business performance within the IT
industry
Functional Benchmarks
Compare your business performance with that of the
best in the industry
Internal Benchmarks
Compare business performance with that of other
company units
pg 351-355
…Enhance Reporting
Benchmarking
Results
An assessment of how good you are against industry leaders
An indication of an improvement goal that is realistic based on industry
practices
Insight into how to achieve improvement from analysis of
benchmarking partners’ processes
Four lessons:
It’s important to focus on a specific objective and process
Breadth for context; depth for understanding
Facilitation of the benchmarking session to keep on track
It’s key to prepare in advance
Objectives, agenda, date, attendees
Meeting and benchmarking protocol
Process documentation
It’s not easy to identify the IT “best of the best”
Good performance data is not readily available
Research is required to evaluate opinion versus fact
It always takes longer than you think
pg 351-355
…Enhance Reporting
Quality Function Deployment
Aimed specifically at satisfying the end
user
Focuses on delivering value by
understanding what the user wants and
needs and then deploying these
expectations downstream in a visible way
pg 355-356
…Enhance Reporting
Quality Function Deployment
End-user deployment
Involves the determination of the types of end-
users the product will focus on
This is the first step when starting a project –
even before requirements deployment
Quality deployment
Uses tools and techniques for the exploration
and the specification of high-value end-user
requirements
Once captured, the requirements then get
translated and deployed into technical
deployments
pg 355-356
Reporting Test Results
Should be a continuous process
In preparing test reports, answer:
What information do the stakeholders need?
How can testers present that information in an easy-to-
understand format?
How can I present the information so that it is
believable?
What can I tell the stakeholder that would help in
determining what action to take?
Test reports need to be issued at pre-defined points
using current status reports and at the end
of testing using the final test report
pg 356-357
Reporting Test Results
Current Status Test Reports
Need to show the status of testing
These will be used by the testers, test
manager, and the software development
team
13 examples follow
pg 357-374
Current Status Reports
Function Test Matrix
Shows which tests must be performed to
validate the functions
A low level report that shows the results of each
test
Intersections can be color coded or with a
number to show status
1. Test is needed, but not performed
2. Test is currently being performed
3. Test was performed and a minor defect noted
4. Test was performed and a major defect noted
5. Test complete and function is defect-free for the
criteria included in this test
pg 358
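A sketch of the matrix as nested dictionaries (the function and test-case names are invented), using the 1-5 status codes from above:

```python
# Intersections hold a status code 1-5 as described above.
STATUS = {1: "needed, not performed", 2: "in progress",
          3: "minor defect noted", 4: "major defect noted",
          5: "complete, defect-free"}

matrix = {
    "login":    {"TC-01": 5, "TC-02": 5},
    "transfer": {"TC-03": 4, "TC-04": 1},
    "report":   {"TC-05": 2},
}

# Roll-up for the status report: how many intersections per status code.
counts = {}
for tests in matrix.values():
    for status in tests.values():
        counts[status] = counts.get(status, 0) + 1
```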
Current Status Reports
Defect Status Report
One is needed for each defect found by
testers
Data can be simple or complex, but these
are the minimum recommended:
Defect naming
Defect Severity
Defect Type
It is to describe the defect and give the
current status of that defect
pg 359-360
Current Status Reports
Functional Testing Status
Report
Should contain the percent of:
Functions that have been fully tested
Functions that have been tested, but contain
errors
Functions that have not been tested
It is to show the progress of testing
How many concerns are generated by this
report depends upon what stage the application
is in
pg 360-361
Current Status Reports
Functions Working Timeline
Used to determine if test and development
are on schedule
pg 361-362
Current Status Reports
Expected versus Actual Defects Uncovered
Timeline
Used to show if the number of defects uncovered
is above or below the expected number
Assumptions:
The development process is sufficiently stable so that
the defect rates from the process are relatively
consistent
The organization has sufficient historical data to project
defect rates
Any variances should be investigated
Too many – developers are inexperienced, or you have
a very good/efficient test team that knows how to spot
defects early in the process
Too few – testers are writing the wrong types of tests,
or you have a very good/efficient set of developers that
know how to code (very rare)
pg 362-363
Current Status Reports
Defects Uncovered versus Corrected Gap Timeline
This is used to list the backlog of
uncorrected defects that have been
reported
pg 363 - 364
Current Status Reports
Average Age of Uncorrected Defects by Type
Used to show the breakdown of the gap from
the Defects Uncovered versus Corrected Gap Timeline report
Organizations should have guidelines for
how long defects at each level should be
allowed in the system, before they are
corrected
pg 364-365
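The average-age breakdown can be sketched from an open-defect log (the severities and dates are made up):

```python
from datetime import date

today = date(2024, 6, 30)
# Hypothetical open defects: (severity, date reported).
open_defects = [("critical", date(2024, 6, 25)),
                ("critical", date(2024, 6, 15)),
                ("minor",    date(2024, 5, 1))]

# Average uncorrected age in days, broken down by severity, so it
# can be compared against the organization's age guidelines.
ages = {}
for severity, reported in open_defects:
    ages.setdefault(severity, []).append((today - reported).days)
avg_age = {sev: sum(a) / len(a) for sev, a in ages.items()}
```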
Current Status Reports
Defect Distribution Report
Used to explain how defects are
distributed among the modules/units
being tested
A variation could be what test found the
defects
Functions with high defect count normally
have an architecture issue
pg 365-366
Current Status Reports
Relative Defect Distribution
Report
Used to normalize the defect distribution
presented
Can be by function points or lines of code
Permits comparison of defect density among
the modules/units
pg 366-367
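Normalizing by size can be sketched as below; the module names and sizes are hypothetical:

```python
# Defects per module, with size in KLOC (thousand lines of code).
modules = {
    "billing":   {"defects": 12, "kloc": 4.0},
    "reporting": {"defects": 6,  "kloc": 1.5},
    "auth":      {"defects": 5,  "kloc": 5.0},
}

# Relative distribution: defects per KLOC makes modules comparable.
density = {name: m["defects"] / m["kloc"] for name, m in modules.items()}
worst = max(density, key=density.get)
```

Note that "billing" has the most raw defects, but "reporting" has the highest density (4 defects/KLOC), which is the point of normalizing.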
Current Status Reports
Testing Action Report
Used to summarize the action report
prepared by the test team
Helps managers see the state of testing
and defect fixes
pg 367-368
Current Status Reports
Individual Project Component Test
Results
Generated when a tester has completed testing
for a component
pg 368-369
Current Status Reports
Summary Project Status Report
Used to provide general information about
the application and uses graphics to show
the status for each component
Should contain:
Report Date Information
Project Information
Timeline Information
Legend Information
pg 368-370
Current Status Reports
Individual Project Status Report
Provides information for a specific project
component
Should contain:
Project Information
General Project Information
Project Activities Information
Timeline for measuring each phase
Future dates for expected completion
Essential Elements Information
Legend Information
Project Highlights Information
pg 371-374
Reporting Test Results
Final Test Reports
Should be prepared after each level of testing
May include Unit, Integration, System, and Acceptance test reporting
Should report the results of testing as defined by the test plan
Should contain:
Definition of the scope of testing
Test results
Conclusions and recommendations
Objectives:
Inform the developers what works and what does not work
Provide information to the users of the software system so that they
can determine whether or not the system is ready for production
After implementation, help the project trace problems in the event that
the application breaks in production
Use the test results to analyze the test process to make it run
more smoothly
pg 374-375
Final Test Reports
Unit Test Report
A unit is usually tested by the developer
pg 375-376
Final Test Reports
Integration Test Report
Follows the same format as unit test
report, except it focuses on the interfaces
pg 376-377
Final Test Reports
System Test Report
pg 377-378
Reporting Test Results
Guidelines for Report Writing
Develop a baseline
If the report is consistent across the enterprise, it will be
easier to draw comparisons in the company
These comparisons can better point out what worked and what
didn’t
Use good report writing practices
Allow team members to review a draft before it is finalized
Don’t include names or assign blame
Stress quality
Limit the report to two or three pages stressing important
items – other information can be included in appendices
Eliminate small problems from the report and give these
directly to the project people
Hand-carry the report to the project leader
Offer to have the testers work with the project team to explain
their findings and recommendations
pg 378-379