
QUALITY ASSURANCE

Project Name TEST PLAN
Draft Version 1.0
November 2009




REVISION HISTORY
Date         Version   Description     Author
11/12/2009   1.0       Initial Draft   Tester who prepared


PURPOSE
This Test Strategy Document describes the scope, approach, resources, and schedule of intended testing activities for the PSAT/AP Integration with EID & EDW Project. It identifies the strategy employed for test preparation and testing. Additionally, it provides environment definitions, types of tests performed, role definitions, and identification of associated risks. NOTE: As the project progresses, this Test Strategy may change.

INTENDED AUDIENCE
The PSAT/AP Integration with EID & EDW Project team members are the intended audience for this document.


1. INTRODUCTION
This test plan outlines the standard tests that should be applied to the PSAT/AP Integration with EID & EDW. It also outlines the strategy and approach that will be used to test the PSAT/AP Integration with EID & EDW project.
1.1. Test Scope

The scope of the work to be performed includes the following:
- Development and implementation of a strategic, or Master, Test Plan (this document).
- Development and implementation of a testing approach based on best practices.
- Establishment and implementation of processes for verifying and validating the application(s)' compliance with business requirements and Use Cases.
- Identification of all the resources required to support the project testing initiatives.
- The primary scope of the testing effort is to validate the back-end data provided by this project to the SDRS Retirement, AP On-line Score Reporting, and Cross Program Reporting projects.
- Testing efforts focus on validating the integration of PSAT (POS) student, admin, score reporting, and organizational information into the IODS (EID).
- Testing efforts focus on validating the integration of AP (APD) student, admin, score reporting, and organizational information into the IODS (EID).
- Testing efforts include validating the integration of PSAT data loaded into the IODS (EID) into the EDW as a base layer for subsequent EDW efforts.
- Testing efforts include validating the integration of AP data loaded into the IODS (EID) into the EDW as a base layer for subsequent EDW efforts.

1.2. Test Items

Testing will consist of several phases; each phase may or may not include testing of one or more of the following items:

List of All Test Items

Items To Be Tested
- Integration of PSAT (POS) into IODS (EID)
- Integration of AP (APD) into IODS (EID)
- Integration of PSAT data loaded into IODS (EID) into EDW
- Integration of AP data loaded into IODS (EID) into EDW

Items Not To Be Tested   If not, why not
{Usability}              {No usability spec}
{Reliability}            {No way to get meaningful uptime testing in QA}
{White-box testing}      {No white-box resource on the test team}

1.3. Test Objectives

The Test Objectives for this effort are:
- To validate the functionality defined as within scope in the Business Requirements, Appendices, Use Cases, Technical Specifications/Requirements, and supplemental information.
- To identify, report, and track through resolution all software problems encountered during the System Integration Test and User Acceptance Test.
- To test and certify that this project does not adversely affect the baseline functionality of existing applications and that those applications continue to work as they do today.
- To map the requirements to the test cases.
- To adhere to the College Board Corporate System Life Cycle Methodology (SLCM) throughout the testing process.
- To enforce a controlled test environment that simulates the production environment and preserves the integrity of the tests.
- To build and retain a test team that is knowledgeable about the PSAT/AP Integration with EID & EDW and its related projects.
- To provide management with regular updates on the status of the tests.

The POS1 (PSAT) database shall be one of the sources for the IODS.


2. ASSUMPTIONS AND RISKS

2.1. Assumptions

- Any changes in project schedule and resources may delay the QA delivery date.
- A stable QA environment exists. Any change to the QA environment must be approved by QA personnel.
- The versions of the operating system, application server software, and all service packs match production.
- Availability/conflict issues in the QA environment may delay the delivery date.
- All severity 1 and severity 2 defects are resolved prior to production release. Not all defects will be resolved prior to production release.
- All human resources scheduling will be based upon a five-day, 40-hour workweek.
- During the test process, all required data feeds will be available in the test environment.
- Change Control procedures are followed.
- All functional requirements are properly defined and detailed enough to meet testing needs.
- Testing will occur on the most current version of PSAT/AP in the test environment.
- The project team will be responsible for the timely resolution of all defects.
- Stakeholders will review each outstanding defect at the completion of the QA cycle against the go/no-go criteria.
2.2. Risks

The QA team's testing may be impeded by the following risks:
- Functionality changes implemented late in the development cycle will impact the testing cycle.
- Browser/OS/hardware combinations cannot be exhaustively tested.
- The test environment shares resources with other applications (e.g., processor power, memory, and disk space); in other words, another application's outage could take the QA instance down.
- Obtaining support for third-party tools may introduce delays.
2.3. Referenced Documents

The following documents are related to and/or referenced by this document:


Document and Description                                         Location
PSAT/AP Integration with EID & EDW Scope Statement Version 2.0   Z:\edmvob\IT Projects\PSAT-AP Integration into EID & EDW\Requirements

3. TEST STRATEGY

The test strategy consists of a series of different tests and testing types that will fully exercise the PSAT/AP Integration with EID & EDW system. The intent of these tests is to verify that all requirements are met, to uncover the system's limitations, and to measure its full capabilities. Business Requirements and/or Use Cases will be used as the basis for developing Test Cases, Test Scenarios, and Test Scripts. The scripts, scenarios, and cases can be reused for regression testing, and they can be modified to test new functionality in future releases. The following practices will be followed as part of the test strategy:
- Testers can perform testing from their regular workstations where possible. Test results must still be coordinated with others.
- Identified testing participants will receive instructions prior to the start of testing regarding the availability of the systems and environments they need to access.
- Test scripts/cases and scenarios will be prepared and shared with the rest of the team.
- Test participants will conduct the tests and document detailed results as testing progresses.
- Any defects will be logged directly into CQTM and the team notified so that the defects can be rectified efficiently and in a timely manner.
3.1. Test Cases

The fundamental purpose of the Test Case is to ensure the accurate mapping of requirements to test items. The following application(s)/software will be used to that end. The tools listed below provide the means by which the tester and business user agree on the accuracy and completeness of the testing of the system's requirements. The Test Case should also note the type of test or tests performed in order to most completely match the Use Case requirements.

Tool Name                            Purpose / Definition
Rational RequisitePro (ReqPro)       The tool used to create traceability relationships between business requirements/use cases and test cases.
Test Permutation Spreadsheet (TPS)   The tool used to provide a structure for identifying tests and to facilitate mapping back to the requirements source.


Here are some guidelines for creating Test Cases:
- Test Case to Use Case traceability should be defined to ensure complete and adequate coverage.
- Each Test Case should match its Use Case as closely as possible.
- Each Test Case should address specific functionality.
- Each Test Case should be developed during the Test Preparation phase.
- Test Cases should include test scenarios that are appropriate to the Use Case.
- Upon the completion of each Test Case, the test duration should be specified.
- Test data for each Test Case should be specified.
- Each tester should update Test Cases for use during test execution.

The following figure shows an example of a Test Case to be developed, a brief description, and the estimated length of time for developing the scenario. The estimations assume that X tester(s) are available for developing the scenarios. A complete listing of Test Cases can be found at {alternate location, link, appendix, etc.}

#   Type of Test   Test Case   Use Case Reference   Test Case Description   Priority   Expected Duration

3.2. Test Scenarios

Test Scenarios are collections of test conditions (e.g., security, permissions, customer, reference data, functional area).

Test Scenario steps will be specific and repeatable. They will include information such as setup parameters, input files/data, intermediate files/data, and output files/data. See the figure below. A complete listing of Test Scenarios can be found at {alternate location, link, appendix, etc.}

#   Test Scenario   Test Case Reference   Input Data   Action / Process   Output Data   Priority   Actual Duration

Here are some guidelines for creating Test Scenarios:
- Each Test Scenario should address specific functionality within a Test Case.
- Positive and negative test scenarios should be developed.
- Upon the completion of each Test Scenario, the test duration should be specified.
- Test data for each Test Scenario should be specified.
- Each tester should add/update the Test Scenario for use during test execution.

3.3. Test Scripts

Test Scripts are collections of test steps organized in the sequence in which they are performed, along with expected results and the verification process. Test Scripts are typically used in conjunction with an automated testing tool or to delineate a particularly complicated or detailed test scenario. Test Script steps will be specific and repeatable but, unlike Test Scenarios, contain detailed blocks labeled Test Step and Expected Results. See the figure below. A complete listing of Test Scripts can be found at {alternate location, link, appendix, etc.}

#   Test Script   Test Scenario Reference   Test Step   Expected Results

3.4. Test Data

The test team will derive test data from {list data sources here}. {Also list here the data creation strategy, if any.} {This section should include any need to refresh data.}

Data Types & Source Database   Data Creation Strategy   Responsible Party   Add'l Considerations

3.5. Test Approach

The PSAT/AP Integration with EID & EDW Phase 1.0 testing approach will include:
3.5.1. Unit Test

Applies to all application code, interface programs, database stored procedures, and other utility code. The objective is to demonstrate that each unit functions as designed.
- Developers will perform manual unit testing. Individual developers are responsible for unit testing their code. Team leads and architects will review.
- Development teams are responsible for spot-checking and enforcing practices and standards for coding and unit testing.
- Unit testing will take place continuously during construction. Unit test cases will be added whenever functionality is added, code is reorganized, or defects are fixed.
3.5.2. Smoke Test


Applies to code chunks or full code builds delivered from the Development Team. The objective is to demonstrate that a build intended for further testing is correctly installed on the target test environment and is working as expected; performance testing is not an objective.
- Shakeout tests will be executed by the team(s) designated as the responsible party.
- Shakeout tests will be selected from a set of string or system test cases prior to the shakeout of a build in the test environment, focusing on areas of anticipated vulnerability for the environment and code.
- Shakeout tests will be performed whenever a new build is installed in the Test, External Integration, Acceptance, and Production environments.
- Shakeout tests will use a limited number of regression test scripts built during string or system testing to confirm proper operation of a build in the target environment.
3.5.3. System Test

System Test will include validation and verification of, but will not be limited to, the following:

Data Completeness. Ensures that all expected data is loaded. One of the most basic tests of data completeness is to verify that all expected data loads into the data warehouse. This includes validating that all records, all fields, and the full contents of each field are loaded. Strategies that will be considered include:
  - Comparing record counts between source data, data loaded to the warehouse, and rejected records.
  - Comparing unique values of key fields between source data and data loaded to the warehouse. This is a valuable technique that points out a variety of possible data errors without doing a full validation on all fields.
  - Utilizing any data profiling tool that shows the range and value distributions of fields in a data set. This will be used during testing and in production to compare source and target data sets and point out any data anomalies from source systems that may be missed even when the data movement is correct.
  - Populating the full contents of each field to validate that no truncation occurs at any step in the process. For example, if the source data field is a string(30), make sure to test it with 30 characters.
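As an illustration, the count, key-comparison, and truncation checks above can be expressed as short SQL queries runnable from Toad against the Oracle test schemas. This is a minimal sketch only: the table names POS.STUDENT (source) and EID.STUDENT_DIM (target) and their columns are hypothetical stand-ins, not the project's actual data model.

  -- Hypothetical source/target objects; substitute names from the actual data model.
  -- 1. Compare record counts between source and target.
  SELECT (SELECT COUNT(*) FROM POS.STUDENT)     AS source_count,
         (SELECT COUNT(*) FROM EID.STUDENT_DIM) AS target_count
  FROM dual;

  -- 2. Key values present in the source but missing from the target.
  SELECT student_id FROM POS.STUDENT
  MINUS
  SELECT student_id FROM EID.STUDENT_DIM;

  -- 3. Truncation check: loaded value shorter than the source value.
  SELECT s.student_id
  FROM   POS.STUDENT s
  JOIN   EID.STUDENT_DIM t ON t.student_id = s.student_id
  WHERE  LENGTH(t.last_name) < LENGTH(s.last_name);

Non-empty results from the second and third queries indicate missing or truncated data.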

Data Transformation. Ensures that all data is transformed correctly according to business rules and/or design specifications. Validating that data is transformed correctly based on business rules can be the most complex part of testing an ETL application with significant transformation logic. One typical method of validation is picking some sample records and doing a "stare and compare" to validate the data transformations manually.


Automated data movement techniques that will be utilized include:
  - Creating a spreadsheet of scenarios of input data and expected results and validating these with the business customer/Subject Matter Expert (SME).
  - Creating test data that includes all scenarios. QA may need the help of an ETL developer to automate the process of populating data sets with the scenario spreadsheet, to allow for flexibility because scenarios will change.
  - Utilizing data profiling results to compare the range and distribution of values in each field between source and target data.
  - Validating correct processing of ETL-generated fields such as surrogate keys.
  - Validating that data types in the warehouse are as specified in the design and/or the data model.
  - Setting up data scenarios that test referential integrity between tables (for example, what happens when the data contains foreign key values not in the parent table).
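The last two techniques in particular reduce to short queries. The sketch below uses hypothetical EDW1.SCORE_FACT and EDW1.STUDENT_DIM tables as stand-ins for the real model; it shows an orphan-record check and a surrogate-key uniqueness check.

  -- Orphan check: fact rows whose foreign key has no match in the parent table.
  SELECT f.score_id, f.student_key
  FROM   EDW1.SCORE_FACT f
  LEFT JOIN EDW1.STUDENT_DIM d ON d.student_key = f.student_key
  WHERE  d.student_key IS NULL;

  -- Surrogate-key check: ETL-generated keys should be unique.
  SELECT student_key, COUNT(*) AS dup_count
  FROM   EDW1.STUDENT_DIM
  GROUP BY student_key
  HAVING COUNT(*) > 1;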

Data Quality. Ensures that the ETL application correctly rejects, substitutes default values for, corrects, or ignores and reports invalid data. To ensure success in testing data quality, QA will include as many data scenarios as possible. For example:
  - Reject the record if a certain decimal field has nonnumeric data.
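That reject rule can be probed with a query along the following lines; the staging table STG.SCORE_LOAD and its score column are illustrative placeholders only.

  -- Rows whose 'score' field is not a valid unsigned decimal; these should follow the reject path.
  SELECT *
  FROM   STG.SCORE_LOAD
  WHERE  NOT REGEXP_LIKE(TRIM(score), '^[0-9]+(\.[0-9]+)?$');

Any rows returned should appear in the ETL reject/error reporting rather than in the warehouse.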
3.6. Test Tools and Techniques

The idea is to create expected results and then compare them with the actual results. The test team has a number of generic tools to accomplish this. In addition, a number of custom scripts will have to be developed to test project-specific functionality such as data integration. The tools used primarily are VBScript, CQTM, Toad, and Rational ClearCase.
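One generic expected-versus-actual pattern, runnable from Toad, is a two-way MINUS diff: if both queries return no rows, the two sets match exactly. QA.EXPECTED_RESULTS and EDW1.ACTUAL_RESULTS are hypothetical, union-compatible tables used here for illustration.

  -- Rows expected but not produced.
  SELECT * FROM QA.EXPECTED_RESULTS
  MINUS
  SELECT * FROM EDW1.ACTUAL_RESULTS;

  -- Rows produced but not expected.
  SELECT * FROM EDW1.ACTUAL_RESULTS
  MINUS
  SELECT * FROM QA.EXPECTED_RESULTS;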


4. RESOURCES AND SCHEDULE

4.1. Resources and Responsibilities

This section presents the recommended allocation of QA resources for the test effort and their responsibilities.

Role: Project Manager
Responsibilities:
- Communication with the customer to agree on the scope of QA.
- Agreement on acceptance criteria with the customer prior to commencing System Test.

Role: Business Analyst
Responsibilities:
- Assist QA with the creation of a detailed test plan.
- Assist QA with the creation of detailed test cases.

Role: System Test Lead (Sherry Chen)
Responsibilities:
- Ensure that a detailed test plan is available for QA.
- Ensure that user IDs and passwords for all testers have been created and distributed prior to the start date of System Testing.
- Ensure that bugs identified during Functional Testing are logged in the System Test Cases spreadsheet located in ClearCase and that these issues are communicated to the development team on a timely basis.
- Ensure that testing takes place within agreed-upon timeframes.

Role: Tester(s) (Krishna Levaku)
Responsibilities:
- Execute test scripts/cases to ensure the application performs at an acceptable level.
- Document testing results.

4.2. Testing Schedule

Milestone Description      Deliverable                  Dependency   Duration   RUP Phase (if applies)   Responsible Party
Environment Preparation                                              1 day      Construction             IG
Create Test Plan           Test Plan                                 1 day      Elaboration              QA
Create Test Cases          Test Cases                                3 days     Elaboration              QA
Smoke Test
Interface Test
System Test                System Test Summary Report                5 days     Construction             QA
Regression Test
Performance Test
Failover & Recovery Test
Intrusion Test
User Acceptance Test


4.3. Requirements Traceability

All the requirements of the PSAT/AP Integration with EID & EDW project will be mapped to corresponding test cases in Microsoft Excel. No requirement will be left without a mapping to a test case.


5. TEST ENVIRONMENT
The build will be delivered to the X environment.
5.1. Build and Deployment Process

The project team receives an email from the X team once the build has been deployed.
5.2. Configuration Management

TBD. Configuration management is the responsibility of the CM team; the details are beyond the scope of the tester.
5.3. Hardware Specification

TBD. The client machine that will be used to test the application runs a 2.66 GHz Core 2 Duo with 2 GB of DDR2 667 MHz RAM, a 160 GB HDD, and a DVD R/W drive, among other components.

5.4. Software Specification

TBD. The project involves Oracle 11g, WebLogic, and Java. The OS is Windows XP for client machines and UNIX for the back-end servers.

System Component                    Description
EDW Database, Instance / Schema     EDW1.QA, EDW1.QA3 / EDW1
POS1 Database, Instance / Schema    POS1.QA, POS1.QA3 / PSAT
IODS Database, Instance / Schema    EID1.QA, EID1.QA3 / IODS
UNIX Server                         oqsgeninfl20.qa.collegeboard.cb

Informatica                         PowerCentre 8.5.1 client > qa_repo

6. TEST STATUS REPORTING


In the planning stages, test status will be reported via test status reports. Issues will be raised, addressed, escalated and resolved as needed. During test execution, defect/problem reports will be generated at specified intervals.
6.1. Defect Reporting

When a problem is discovered, it is logged in ClearQuest, where it will be reviewed and assigned to a developer for resolution. When a software fix has been completed, the Tech Lead / Project Manager must indicate in ClearQuest that the defect is ready for retest. This means that the fix has been done and is waiting to be migrated. Defects will be repaired in the development environment, unit/string tested, then promoted to the system test environment. At that point a fix can be system tested and then closed. Both the Development and the Test Teams will update the status of problems in ClearQuest. Problems will be classified along the following severities:
- Critical: A problem that brings test execution to a halt. The problem initiator will notify the Development Team Lead or Coordinator immediately of the specific problem encountered.
- Major: A problem that severely impedes the progress of the test but does not totally suspend its execution. This problem prohibits completion of a specific function of the test and needs to be resolved in a timely manner prior to implementation.
- Average: A problem that is not merely cosmetic in nature; it has an impact on specific functionality but does not severely impede the progress of the test.
- Minor: A problem that is cosmetic in nature and of low priority; it does not prohibit the execution or implementation of the release.

Priorities will initially be assigned by the QA team at the time of submitting the defects and may be reassigned a higher or lower priority by the Project Manager upon triage. The three priorities that will be assigned are High, Medium, and Low. All Emergency, Severity 1, and Severity 2 defects found in the current iteration will be resolved prior to implementing the next iteration of the PSAT/AP Integration with EID & EDW application in the QA environment.

6.2. Metrics

The QA team will produce the following metrics during the different test phases:

Execution Metrics
- Planned Test Cases executed
- Actual Test Cases executed
- Test Cases passed
- Test Cases failed
- Test Cases not executed / not executable
- Rate of scenario execution

Defect Metrics
- Defects opened, current period
- Defects closed, current period
- Mean time to close defect
- Rate of defect closure

Risk Assessment Metrics
- Defects outstanding by severity
- Defects outstanding by priority
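If defect records are exported from ClearQuest into a reporting table, the defect metrics reduce to simple aggregates. The sketch below assumes a hypothetical QA.DEFECT_EXTRACT table with opened_date, closed_date, and severity columns; it illustrates the calculations and is not the actual CQTM/ClearQuest schema.

  -- Mean time (in days) to close, and count of closures, for the current month.
  SELECT AVG(closed_date - opened_date) AS mean_days_to_close,
         COUNT(*)                       AS defects_closed
  FROM   QA.DEFECT_EXTRACT
  WHERE  closed_date >= TRUNC(SYSDATE, 'MM');

  -- Defects outstanding by severity.
  SELECT severity, COUNT(*) AS open_defects
  FROM   QA.DEFECT_EXTRACT
  WHERE  closed_date IS NULL
  GROUP BY severity;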
6.3. Change Control Process

When a software fix has been completed, the Development Manager or Lead must indicate in ClearQuest that the bug is ready for retest. This means that the fix has been applied and unit tested and is waiting to be deployed. Defects will be repaired in the development environment, then promoted to the system test environment. Once the fix is deployed to the test environment, the Tester is responsible for retesting and closing the problem when it is resolved. If the problem has not been resolved, the tester should update the bug with comments to that effect. The basic happy path is shown in the diagram below.


7. PRODUCTION READINESS

7.1. Delivery Plan

Once all QA test types have been satisfied, the PSAT/AP Integration with EID & EDW production readiness efforts will be able to commence. The start of this activity is the milestone that marks the successful completion of all testing activities defined to support the system development life cycle for the PSAT/AP Integration with EID & EDW project.
7.2. Pre-Deployment Checklist

TBD. The product shall be deployed to Production only upon signoff by QA in the form of a release readiness document.

7.3. Production Validation

TBD.


QA will be available to support the application on the morning of deployment to validate essential checkpoints during the process.
7.4. Post-Production Monitoring

TBD. The QA team will allocate a part-time resource to ensure that any issues found post-production are addressed in a timely manner.

8. APPENDIX

8.1. Terms and Abbreviations

Term   Definition or Reference
EDW    Enterprise Data Warehouse
IODS   Integrated Operational Data Store
CDC    Change Data Capture (a utility within the Informatica PowerExchange ETL tool)
ETL    Extraction, Transformation and Loading

8.2. Test Type Definitions

Below are some examples of test types with a brief description. The PSAT/AP Integration with EID & EDW project may use only a subset of these.

Algorithmic: Testing that validates that all calculations are accurate. Algorithms will be validated against requirements and will be tested with maximum, minimum, nominal, and erroneous input values.

Black-box: Testing based on what an application is supposed to do. Also called functional testing.


Boundary: Boundary tests confirm the minimum and maximum values that a field will accept. Boundary tests entail inputs far below, just below, at, just above, and far above the stated requirement for an input value.

Erroneous: Erroneous input tests check the processing of invalid inputs. Testers select values that test the range and types of input data by inputting values that are known to be invalid (e.g., alphabetic characters in a numeric field).

Integration: Testing that seeks to verify the proper functioning between and among groups of components.

Interface: Testing to verify the accurate exchange of data as well as to ensure that processing times are within an acceptable range. Interfaces may be internal as well as external.

Intrusion: Testing that applies to all application functionality but focuses on analysis of vulnerability to Internet-based penetration, security access controls, and data encryption techniques.

Load: Testing to ascertain the performance at prescribed user levels. Load testing is more appropriate when attempting to discover the performance signature of the system. Contrast with performance and stress testing.

Performance: Testing to ascertain the performance of the application as it appears to the end user (page response time) and its components (app server, database server, etc.) at varying user levels. This type of testing assumes system degradation; it merely determines whether the rate of degradation is acceptable.

Production Validation: A checklist of items to be done on the morning of deployment on the production environment but before the application is open to the public; a smoke test of major functionality.

Regression: Not a standalone test type, but testing performed for any change to a system component that can place that system and all associated and interfacing systems at risk. Change can have far-reaching effects, and regression testing validates that no previously approved function, application, system, or component has been compromised by a change. This type of testing may or may not use an automated tool.

Stress: Testing to push a component or a system to its breaking point; a means of ascertaining how many users the system can host before component X fails completely.

Smoke: Testing to demonstrate that a build intended for further testing is correctly installed on the target test environment and ready for test. This type of test is performed during the initial test phase.

System / Functional: Testing the behaviors, functions, and responses once the system is completely built and has passed integration testing. System tests include compatibility testing and navigation testing, which validates that all screens and navigation points can be reached for each specific role. System testing can include negative-path testing (i.e., testing of invalid data inputs and user actions).

Unit/Class: Testing individual classes/components/programs/modules or objects of a system; a basic test of the developed components of a system, validating the building blocks or foundation of the change.


User Acceptance: Testing in which testers execute production-like scenarios using formal test cases with specified inputs and expected outputs. This type of testing is NOT done with the intent of finding bugs; rather, it is done to demonstrate sufficiency and correctness.

9. DOCUMENT REVIEW STATUS

9.1. Reviewers

List of reviewers and their associated roles.

Participant Name   Role

9.2. Disposition

If the artifact has been reviewed and has successfully met the criteria specified for it, it should be marked as "Approved".

Approved / Changes Required
