
Skill Category 1

• Vocabulary
• Why do we test software
• The multiple roles of the software tester
• Life cycle testing
• Test matrices
• Independent Testing
• Tester’s workbench
• Levels of testing
• Testing techniques
Vocabulary
• Quality Assurance vs Quality Control
• The Cost of Quality
• Software Quality Factors
• How Quality is Defined
What is Quality Assurance?
• A focus on processes:
– Defining
– Deploying
– Continuously improving

The goal is defect prevention

• A staff position:
– Facilitator
– Coordinator
– Educator
– Quality planner
– Measurement analyst
What is Quality Control?
• A focus on products
• Goal is defect detection
• A line position
• Product quality is the responsibility of those who
produce the product
• Some common QC processes are:
– Software testing
– Reviews
– Inspections
– Checklists
Cost of Quality
• Prevention costs
• Appraisal costs
• Failure costs

The cost of quality is all the costs that occur beyond the cost of producing the product “right the first time.”
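
A minimal sketch of the three categories in Python; the dollar figures are invented purely for illustration.

# Hypothetical cost-of-quality figures for one release; the categories
# follow the slide, the numbers are illustrative assumptions.
prevention = 10_000   # training, process definition, defect-prevention work
appraisal = 25_000    # testing, reviews, inspections
failure = 40_000      # rework, production incidents, support calls

cost_of_quality = prevention + appraisal + failure
print(f"Cost of quality: ${cost_of_quality:,}")   # Cost of quality: $75,000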
Software Quality Factors
The software quality factors are attributes of the
software that, if they are wanted and not
present, pose a risk to the success of the
software, and thus constitute a business risk.

Correctness, Reliability, Efficiency, Integrity, Usability, Maintainability, Testability, Flexibility, Portability, Reusability, Interoperability
How Quality is Defined
• 5 Perspectives of Quality
– Transcendent – I know it when I see it
– Product-based – Possesses desired features
– User-based – Fitness for use
– Development- and manufacturing-based –
Conforms to requirements
– Value-based – At an acceptable cost
Why Do We Test Software
• Developers are not good testers
• What is a defect – an undesirable state
• What is quality software
• Why does a development process produce
defects
• Reducing the frequency of defects in
software development
Developers are not Good Testers
• Misunderstandings will not be detected
• Improper use of the development process
may not be detected
• The individual may be “blinded” into
accepting erroneous system specifications
and coding
• Information services people are optimistic
in their ability to do defect-free work
What is Quality Software
• IT’s view of quality software means meeting requirements
• Users’ view of quality software means fit for use
Software Quality Gap (p 55)
Why Does the Development
Process Produce Defects
• Variability is the “enemy” of quality – the concept behind
maturing a software development process is to reduce
variability
• Since the key to quality is process consistency, variation
(the lack of consistency) must be understood before any
process can be improved
• Statistical methods are the only way to objectively
measure variability
• Brian Joiner’s special causes of variation (p 58-60)
• The concept of statistical control allows us to determine
which problems are in the process (due to common
causes of variation) and which are external to the
process (due to special causes of variation)
• Statistical process control (SPC) is a measurement tool
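
A minimal sketch of the statistical-control idea, assuming invented defect counts: observations outside the mean ± 3 standard deviations of a stable baseline suggest a special cause; everything inside is attributed to common causes of variation.

import statistics

# Illustrative baseline: defects per build while the process was stable.
baseline = [4, 6, 5, 7, 5, 6, 5, 4, 6, 5]
mean = statistics.mean(baseline)
sigma = statistics.pstdev(baseline)
ucl = mean + 3 * sigma            # upper control limit (8.0 here)
lcl = max(mean - 3 * sigma, 0)    # lower control limit (2.6 here)

# New builds are judged against the baseline limits.
for build, count in [("build-11", 6), ("build-12", 14)]:
    verdict = "special cause - investigate" if not (lcl <= count <= ucl) else "common cause"
    print(f"{build}: {count} defects -> {verdict}")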
The Five Levels of Process
Maturity
• Level 1 – Ad hoc
– Manage people, schedule driven, political
• Level 2 – Control
– Manage processes, results predefined, skills taught
• Level 3 – Core Competency
– Manage capabilities, customer focused, mentor
• Level 4 – Predictable
– Manage by fact, results predictable, coach
• Level 5 - Innovative
– Manage innovation, benchmark, continuous learning
The Multiple Roles of the Software
Tester
• People relationships
• Scope of testing – the extensiveness of the test
process (Why are we testing?)
• When should testing occur – in every phase of
the life cycle
• How the test plan should be developed
• Testing constraints – anything that inhibits the
tester’s ability to fulfill their responsibilities is a constraint
(p 72-80)
Life Cycle Verification Activities
Life Cycle Phase – Verification Activity
• Requirements – Determine verification approach; determine adequacy of requirements; generate functional test data
• Design – Determine consistency of design with requirements; determine adequacy of design; generate structural and functional test data
• Program (build/construction) – Determine consistency with design; determine adequacy of implementation; generate structural and functional test data
• Test – Test application system
• Installation – Place tested system into production
• Maintenance – Modify and retest
Develop the Test Plan
• Select and rank test objectives
• Identify the system development phases
• Identify the business risks associated with
the system under development
• Place risks in the matrix
Life Cycle Testing
• Involves continuous testing of the solution
even after the software plans are complete
and the tested system is implemented
• Cannot occur until you formally develop
your process
• Best accomplished by forming a test team
• Effectiveness of the test team depends on
developing the system under one
methodology and testing it under another
Test Matrices
• Shows the interrelationship between functional
events and tests (example p 81-82; a minimal sketch follows this list)
• Cascading test matrices – one functional
event leads to the creation of another,
which leads to another and so on (example
p84)
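
A minimal sketch of a test matrix; the functional events and tests are hypothetical names, not from the guide.

# Rows are functional events, columns are tests; True means the test
# exercises that event. All names are invented for illustration.
test_matrix = {
    "customer places order":  {"input validation": True,  "audit trail": True,  "load test": False},
    "payment is authorized":  {"input validation": True,  "audit trail": True,  "load test": True},
    "order is shipped":       {"input validation": False, "audit trail": True,  "load test": False},
}

# Reading across a row shows which tests cover each functional event.
for event, tests in test_matrix.items():
    covered = [name for name, used in tests.items() if used]
    print(f"{event}: {', '.join(covered) or 'NOT COVERED'}")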
Independent Testing
• Independent test team is responsible for:
– System testing
– Oversight of acceptance testing
– Providing an unbiased assessment of the quality of an
application
• Independent test team comprised of:
– Test manager/team leader
– Team of testers
• Test team should be represented in all key
requirements and design meetings
Tester’s Workbench
• What is a process?
– set of activities that represent the way work is
performed
– Outcome of a process is a product or a
service
– Process example and outcomes on p 87
– PDCA p 87-88
– Workbench view of a process p 89
Levels of Testing
• Verification testing
• Unit testing
• Integration testing
• System testing
• User acceptance testing
• The “V” concept of testing p 91-92
• 11 step software testing process example
p 92-96
Testing Techniques
• Structural vs functional technique
categories
• Verification vs validation
• Static vs dynamic testing
• Examples of specific testing techniques
• Combining specific testing techniques
• Details on p 96-132
Skill Category 2
Building the Test Environment
• Management Support
• Test Work Processes
• Test Tools
• Testers’ Competency
Management Support
• Management tone
– Set by providing testers the resources and management time needed to do their
job effectively
– Creates a test function that is aligned with the business needs of the organization
• Integrity and ethical values
– Essential elements of the control environment
– Integrity is a prerequisite for ethical behavior
– Ethical behavior is good business
• Commitment to competence
– Should reflect the knowledge & skills needed to accomplish tasks that define the
individual’s job
• Management’s philosophy and operating style
– Affect the way testing is managed, including the kinds of business risks accepted
• Organizational structure
– Provides the framework within which its activities for achieving entity-wide
objectives are planned, executed, controlled and monitored
Test Work Processes
• Work processes
– Policy – provides direction
– Standards – are the rules or measures by which the implemented policies are
measured
– Procedure – means used to meet or comply with the standards
• The importance of work processes
– Improves communication
– Enables knowledge transfer
– Improves productivity
– Assists with mastering new technology
– Reduces defects and costs
• Responsibility for building work processes (policies, procedures, and
standards, p 142-149)
– Policies are developed by senior management
– Policies set direction but do not define specific products or procedures
– Policies are needed in areas that cause problems
– Policies define the areas in which processes will be developed
Test Work Processes (cont)
• Developing work processes
– A standard and procedure must be developed for developing
standards and procedures
– Establish a testing policy – good testing does not just happen, it
must be planned and a testing policy should be the cornerstone
of that plan
• Tester’s workbench
– Shows that input products drive the workbench, which uses
procedures and standards to produce output products
– The objective of the workbench is to produce the defined output
products in a defect-free manner
• Analysis and improvement of the test process (refer to p
153-167)
Test Tools
• It is difficult to perform testing
economically without the aid of automated
tools
• Tool development and acquisition (p 167-178)
– A tool is a vehicle for performing a test
process
– A testing technique is a process for ensuring
that some aspect of an applications system or
unit functions properly
Test Tools (cont)
• Tool usage
– Automated regression testing
– Defect management
– Performance/load testing
– Manual
– Traceability
– Code coverage
– Test case management
– Common tools that are applicable to testing
• Guidelines
– Testers must have formal training on the tool
– Use of test tools should be incorporated into test processes so that tool use is
mandatory
– Access to an individual who can provide guidance on using the tool
Testers’ Competency
• Based on two criteria
– Skill sets possessed by that individual
– How those skills are applied to real-world test
situations
Skill Category 3
Managing the Test Project
• Test Administration
• Test Supervision
• Test Leadership
• Managing Change
Test Administration
• Test planning - Refer to skill category 4
• Customization of the test process
– Adding new test tasks
– Deleting some test tasks currently in the test process
– Adding or deleting test tools
– Supplementing skills of assigned testers to assure the tasks in the test
process can be executed correctly
• Budgeting – factors to take into consideration
– Size
– Requirements
– Expertise
– Tools
• Budgeting techniques
– Top-down estimation
– Expert judgment
– Bottom-up estimation
Test Administration (cont)
All goals should be SMART goals.
SMART stands for:
Specific
Measurable
Agreed upon
Realistic
Time frame
Test Administration (cont)
• Scheduling
– Calendar-based breakdown of tasks and deliverables
– Answers these questions
• What tasks will be done
• Who will do them
• When will they do them
– Work Breakdown Structure (WBS) groups test project components into
deliverables and accountable pieces
• Staffing
– Identify the needed skills, then acquire test project members
who possess those skills
• Test team approaches (p 190-192)
– Developers become the test team
– Independent IT test team
– Non-IT test team
– Combination test team
Test Supervision
• Communication skills
– Written and oral communication
– Listening
– Interviewing
– Analyzing
• Negotiation and complaint resolution skills (p 203-
206)
• Judgment – decision based on 3 criteria
– Facts
– Standards
– Experience
Test Supervision (cont)
• Providing constructive criticism
– Do it privately
– Have the facts
– Be prepared to help improve performance
– Be specific on expectations
– Follow a specific process in giving criticism (p 207-8)
• Project relationships
– Are defined
– Roles of each party are defined
– Importance of the relationship to the success of the project is
defined
– Influence that a party can have on software testing needs to be
defined
Test Supervision (cont)
• Motivation – four common motivators
– Personal challenge
– Respect
– Rewards
– Recognition
• Mentoring – following 3 areas
– Career counseling
– Work tasks
– Professional advancement
• Recognition
– At a formal IT meeting
– Group luncheons/group celebrations
– Tokens of appreciation
– Time off if completing a task involved excess hours
– Lunch with the boss
Test Leadership
• Chairing meetings – p 211-212
• Team building
– Team development
– Team member interaction
– Team ethics
– Team rewards
• Quality management organizational structure
encourages leadership (chart on p 216)
• Code of ethics (p 217-218)
Managing Change
• Software configuration management
– All project artifacts need to be managed
• Source code
• Requirements
• Analysis models
• Design models
• Test cases and procedures
• Automated test scripts
• User documentation
• Hardware and software configuration settings
• Other artifacts as needed
• Change management – it is a process. Testers need to
know two aspects of change
– Characteristics of the change so the test plan & data can be
updated
– Version in which that change will be implemented
• Version control
Skill Category 4
Test Planning
• Risk concepts and vocabulary
• Risks associated with software
development
• Risks associated with software testing
• Risk analysis
• Risk Management
• Prerequisites to test planning
• Create the test plan
Vocabulary

• Test cases are how testers validate that a software function or a structural attribute of software meets the software specifications
• Test data is information used to build a test case
• Test scripts are an online entry of test cases in
which the sequence of entering test cases and
the structure of the online entry system must be
validated, in addition to the expected results
from a single test case
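
A minimal sketch of how the three terms relate, using a hypothetical discount function that is not from the guide.

# The test case pairs test data (inputs) with an expected result; a test
# script would sequence many such cases through the system's entry points.

def discounted_price(price, percent):   # hypothetical function under test
    return round(price * (1 - percent / 100), 2)

test_cases = [                           # test data plus expected results
    {"price": 100.00, "percent": 10, "expected": 90.00},
    {"price": 59.99,  "percent": 0,  "expected": 59.99},
]

for case in test_cases:
    actual = discounted_price(case["price"], case["percent"])
    assert actual == case["expected"], f"{case} -> got {actual}"
print("all test cases passed")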
Risk Concepts
• Risk is the potential loss to an organization
• Risk analysis is an analysis of an organization’s information
resources, its existing controls, and its remaining organizational and
computer system vulnerabilities
• Threat is something capable of exploiting a vulnerability in the
security of a computer system or application
• Vulnerability is a design, implementation or operations flaw that
may be exploited by a threat to result in destruction or misuse of
equipment or data
• Control is anything that tends to cause the reduction of risk
Risks Associated with Software
Development
• Improper use of technology
• Repetition of errors
• Cascading of errors
• Illogical processing
• Inability to translate user needs into technical
requirements
• Inability to control technology
• Incorrect entry of data
• Concentration of data
• Inability to react quickly
• Inability to substantiate processing
Risks Associated with Software
Development (cont)
• Concentration of responsibilities
• Erroneous or falsified input data
• Misuse by authorized end users
• Uncontrolled system access
• Ineffective security and privacy practices for the
application
• Procedural errors during operations
• Program errors
• Operating system flaws
• Communications system failure
Detailed explanation on p 227-238
Risks Associated with Software
Testing
• Primary testing risks include:
– Not enough training/lack of test competency
– Us vs them mentality
– Lack of test tools
– Lack of management understanding and support of testing
– Lack of customer and user involvement
– Not enough schedule or budget for testing
– Over-reliance on independent testers
– Rapid change
– Testers are in a lose-lose situation
– Having to say “no”
– Test environment
– New technology
– New development process
Risks Associated with Software
Testing (cont)
• Premature release risk – releasing the
software into production under the
following conditions:
– The requirements were implemented
incorrectly
– The test plan has not been completed
– Defects uncovered in testing have not been
corrected
– The software released into production
contains defects
Risk Analysis
• Performing risk analysis during test
planning is a four-step process
– Form the risk analysis team
– Identify risks
– Estimate the magnitude of the risk
– Select testing priorities
Risk Management
• Risk appetite
– Defines the amount of loss management is willing to
accept for a given risk
• Risk reduction methods
– The formula to quantify risk is to multiply the
frequency of an undesirable occurrence by the loss
associated with that occurrence (see the sketch after this list)
• Contingency planning
– The role of testers is to evaluate the adequacy of the
contingency plans associated with the risk
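
A minimal sketch of the risk formula above; the risk names, frequencies, and loss figures are invented.

# risk = frequency of an undesirable occurrence x loss per occurrence
risks = [
    # (name, occurrences per year, loss per occurrence in dollars)
    ("order entry error", 250,      40.00),
    ("system outage",       2,  15_000.00),
    ("security breach",    0.1, 200_000.00),
]

# Rank by annualized risk so testing priorities follow the exposure.
for name, frequency, loss in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name}: annualized risk = ${frequency * loss:,.2f}")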
Prerequisites to Test Planning
• Test objectives include testing to assure
– Software development project objectives are met
– Meeting requirements
– Fit for use
• Acceptance criteria
– Have the user define the criteria
• Assumptions – document
• People issues
– Who should run the project
– Who can make decisions
– Which organizational group has authority to decide requirements
• Constraints
– Test staff size
– Test schedule
– Budget
Create the Test Plan
• The test plan describes how testing will be
accomplished – should take about 1/3 of
the total test effort
• Tests in the test plan should be:
– Repeatable
– Controllable
– Ensure adequate test coverage
Understanding the Characteristics
of the Software Being Developed
• Define what it means to meet the project objectives
• Understand the core business areas and processes
• Assess the severity of potential failures
• Identify the components for the system
• Assure requirements are testable
• Address implementation schedule issues
• Address interface and data exchange issues
• Evaluate contingency plans
• Identify vulnerable parts of the system and processes
operating outside the information resource management
area
Build the Test Plan
• Set test objectives
– Define each objective so that you can reference it by number
– Write the test objectives in a measurable statement
– Assign a priority to the objectives
– Define the acceptance criteria for each objective
• Develop the test matrix
– Define tests as required
– Define conceptual test cases to be entered as a test script
– Define verification tests
– Prepare the software test matrix
• Define test administration – identifies
– Schedule
– Milestones
– Resources needed to execute the test plan
Write the Test Plan
• Guidelines to writing the test plan
– Start early
– Keep the test plan flexible
– Review the test plan frequently
– Keep the test plan concise and readable
– Calculate the planning effort
– Spend the time to do a complete test plan
Test Plan Standard
• Test scope
• Test objectives
• Assumptions
• Risk analysis
• Test design
• Roles and responsibilities
• Test schedule and resources
• Test data management
• Test environment
• Communication approach
• Test tools
Details on p 262-268
Skill Category 5
Executing the Test Plan
• Test case design
• Building test cases
• Test coverage
• Performing tests
• Recording test results
• Defect management
Test Case Design
• Types of test cases
– Functional
• Design specific tests for testing code
– The goal is to test the specified behavior for each software feature,
including input and output
• Functional testing independent of the specification technique
– Functional testing derives test data from the features of the specs
• Functional testing based on the interface
– Input testing
– Equivalence partitioning (see the sketch after this list)
– Syntax checking
• Functional testing based on the function to be computed
– Special- value testing
– Output result coverage
• Functional testing dependent on the specification technique (p 272)
– Algebraic
– Axiomatic
– State machines
– Decision tables
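
A minimal sketch of equivalence partitioning, assuming a hypothetical age field where 18-65 is valid.

# One representative value per partition, plus the boundary values.
partitions = {
    "below minimum (invalid)": [17],
    "valid range":             [18, 40, 65],
    "above maximum (invalid)": [66],
}

def is_valid_age(age):                   # hypothetical function under test
    return 18 <= age <= 65

for partition, samples in partitions.items():
    for age in samples:
        print(f"{partition}: age={age} -> accepted={is_valid_age(age)}")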
Test Case Design (cont)
– Structural
• Structural analysis
– Complexity measures
– Data flow analysis
– Symbolic execution
• Structural testing
– Statement
– Branch
– Conditional
– Expression
– Path
Test Case Design (cont)
– Erroneous
• Statistical assessment – employs techniques to determine
the operational reliability of the program
• Error-based testing (p 276-278)
– Fault estimation
– Input testing
– Perturbation testing
– Fault-based testing: local extent, finite; global extent, finite; local extent, infinite; global extent, infinite
– Stress or volume testing
• Needs a tool that supplements test data
• Identify input data used by the program
• Identify data created by the program
• Challenge each data element for potential limitations
• Document limitations
Test Case Design (cont)
• Scripts (p 279-285)
– Determine testing levels
– Develop the scripts
– Execute the scripts
– Analyze the results
– Maintain the scripts
• Use cases
– Description of how a user or another system uses the
system being designed to perform a given task
– Build a system boundary diagram
– Define use cases (p 287-289)
Building Test Cases
• Process for building test cases
– Identify test resources
– Identify conditions to be tested
– Rank test conditions
– Select conditions for testing
– Determine correct results of processing
– Create test cases
– Document test conditions
– Conduct test
– Verify and correct
• Example of creating test cases for a payroll
application (p 291-293)
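
The book’s payroll example is on p 291-293; below is a separate, hypothetical sketch in the same spirit, pairing a test condition with input data and an expected result.

# All values and the gross_pay rule (overtime at 1.5x over 40 hours)
# are invented for illustration, not taken from the guide.
def gross_pay(hours, rate):
    overtime = max(hours - 40, 0)
    return (hours - overtime) * rate + overtime * rate * 1.5

payroll_cases = [
    {"condition": "regular 40-hour week",   "hours": 40, "rate": 20.0, "expected": 800.0},
    {"condition": "overtime over 40 hours", "hours": 45, "rate": 20.0, "expected": 950.0},
    {"condition": "zero hours worked",      "hours": 0,  "rate": 20.0, "expected": 0.0},
]

for case in payroll_cases:
    actual = gross_pay(case["hours"], case["rate"])
    status = "PASS" if actual == case["expected"] else "FAIL"
    print(f"{case['condition']}: expected {case['expected']}, got {actual} -> {status}")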
Test Coverage
• Statement coverage
• Branch coverage (statement vs. branch is sketched after this list)
• Basis path coverage
• Integration sub-tree coverage
• Modified decision coverage
• Global data coverage
• User-specified data coverage
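
A minimal sketch contrasting statement and branch coverage on a toy function.

def classify(x):
    result = "non-negative"
    if x < 0:
        result = "negative"
    return result

# classify(-1) alone executes every statement (100% statement coverage)
# but exercises only the True branch of "x < 0"; branch coverage also
# requires a test where the condition is False.
for x in (-1, 1):
    print(f"classify({x}) = {classify(x)}")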
Performing Tests
• Platforms
– Must be taken into consideration in the design of test
data & test scripts
– Decide which platforms are needed for test purposes
• Test cycle strategy
– Testers should determine the number and purpose of
the test cycles to be used during testing
• Use of tools in testing
– Test documentation
– Test drivers
– Automatic test systems and test languages
Performing Tests (cont)
• Perform tests
– Unit testing
– Integration testing
– System testing
• When is testing complete
– When the application will perform as expected in production
– Quality goals have been met
– Number of open defects and their severity level
– Risk associated with moving the application into production
– Risk of not moving forward
• General concerns
– Software is not in a testable mode for this test level
– There is inadequate time and resources
– Significant problems will not be uncovered during testing
Recording Test Results
• Problem deviation
– “What is” can be called the statement of condition
• Activities involved
• Procedures used to perform work
• Outputs/deliverables
• Inputs
• User/customers served
• Deficiencies noted
– “What should be” is called the criteria
• Problem effect
– Significance is judged by effect
– Can be stated in quantitative terms such as dollars, time,
units of production, number of procedures and processes, or
transactions
Recording Test Results (cont)
• Problem cause
– Is the underlying reason for the condition
• Use of test results – who should receive
– Developers
– End users
– Software project manager
– IT quality assurance
Defect Management
• Management
– Primary goal is to prevent defects
– Should be risk driven
– Integrated into the development process
– Automated
– Defect information should be used to improve
the process
– Imperfect or flawed processes cause most
defects
Defect Management (cont)
• Defect naming
– Name of defect
– Phase or activity where the defect occurred
– Category of the defect
• Missing
• Inaccurate
• Incomplete
• Inconsistent
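
A minimal sketch of a defect record carrying the naming scheme above; the field values are invented.

from dataclasses import dataclass

@dataclass
class Defect:
    name: str        # short name of the defect
    phase: str       # phase or activity where the defect occurred
    category: str    # missing | inaccurate | incomplete | inconsistent

defect = Defect(
    name="overtime rate not applied",
    phase="requirements",
    category="missing",
)
print(defect)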
The Defect Management Process

• Defect prevention
– Identify critical risks
– Estimate expected impact
– Minimize expected impact
– Methodology and standards
– Defensive design
– Defensive code
The Defect Management Process (cont)
• Deliverable baseline
– Identify key deliverables
– Define standards for each deliverable
• Defect discovery
– Find defect
– Record defect
– Report defect
– Acknowledge defect
• Defect resolution
– Prioritize fix
– Schedule fix
– Fix defect
– Report resolution
• Process improvement
