
PTS# 13697

NASR Geographical Objectives Reporting

System Test Plan

Current Version: 0.1

Date Last Updated:

Last Updated By:

Author: Venu Bandaru

Date Created: 21 June 2010

Approved By:

Approval Date:



Revision History

Version Number | Date Updated | Author | Summary of Major Changes
0.1 | 21/06/2010 | Venu Bandaru | Initial Draft created



Table of Contents
1 Introduction
1.1 Objectives & Scope
1.2 Assumptions, Exclusions & Constraints
1.3 Test Readiness Review
1.4 System Test Entry Criteria
1.5 Schedules for Test Execution
2 Test Case Development
2.1 Test Case Execution
2.1.1 Execution Sequence
2.1.2 Functionalities in Scope
2.1.3 Data Acquisition
2.1.4 Test Case Mapping
2.1.5 Validation
2.1.6 Estimates for Test Case Preparation
3 Test Preparation
3.1 Test Environment
3.1.1 Hardware
3.1.2 Software
3.1.3 Tools
3.2 Prerequisite Training
3.3 Preparation Schedule
4 Test Execution
4.1 System Testing (ST)
4.2 Location
4.3 Testing Tasks
4.4 Risks and Contingencies
4.5 Test Resource
4.6 Test Schedule
4.6.1 Overview of System Testing Schedule
5 Acceptance Criteria
5.1 Pass/Fail Criteria
5.2 Suspension Criteria and Resumption Requirements
6 Test Tracking and Reporting
6.1 Recording Testing Events
6.2 Exception and Defect Handling
6.3 Test Progress Planning and Tracking
7 Attachments
7.1 Appendix 1 - Roles & Responsibilities

1 Introduction
The North American Sales Reporting (NASR) system allows U.S. General Motors corporate and
field personnel to view and run reports as well as run ad hoc queries of the data that is stored in
the Vehicle Sales North America (VSNA) database.
The objective of this project is to provide GM Sales & Marketing greater flexibility in managing
and reporting the sales objectives of their dealers (BACs). Currently, NASR obtains monthly
objectives from the sending vendor at a BAC/model level. This single geographical level (BAC)
is used within NASR as the base for reporting upper-level geographies (i.e. Region, Area,
Section, Zone) by summing the BAC level objectives. The GM business needs to manage the
BAC level objectives so that their sum may differ from, and in some cases exceed, the Area,
Zone, Region, and National objectives, while keeping those geographical objectives constant.
The existing process of rolling up BAC objectives does not provide the flexibility required to run
the business and needs to be modified. This “roll-up” process must be eliminated and replaced
by the computed objectives obtained from the Vendor for each level of geography (a minimal
sketch of this change follows the list below). The following are in scope of the project:
 Load 7 separate geography files by model, independent of the other geographical levels, at
any time of the month
 Block any level of objectives from being seen on reports at the model level
 Manage changes to BAC level objectives at any time of the month
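
For illustration, the following minimal Python sketch contrasts the roll-up being eliminated with
loading vendor-computed objectives per geography level. The file layout, field names and
blocking flag shown here are assumptions made for the sketch, not the actual NASR/VSNA
formats:

    # Minimal sketch only; file layout, field names and the blocking flag are
    # illustrative assumptions, not the actual NASR/VSNA formats.
    import csv

    GEO_LEVELS = ["National", "Region", "Section", "Area", "Zone", "GMMN", "BAC"]

    def load_vendor_objectives(path, level):
        """Load vendor-computed objectives for one geography level.

        Each of the 7 levels arrives in its own flat file and can be loaded
        independently, at any time of the month, rather than being derived
        by summing BAC-level rows.
        """
        objectives = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                objectives[(row["geo_code"], row["model"])] = {
                    "level": level,
                    "objective": int(row["objective"]),
                    # Blocking flag: suppress this row on model-level reports.
                    "blocked": row.get("block_flag", "N") == "Y",
                }
        return objectives

    def rolled_up_objective(bac_objectives, bac_codes, model):
        """Old behaviour (to be eliminated): an upper-level objective computed
        by summing the BAC-level objectives beneath it."""
        return sum(rec["objective"]
                   for (code, mdl), rec in bac_objectives.items()
                   if code in bac_codes and mdl == model)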

This System Test Plan will help the team track progress against its goals, identify and control
risks, and maximize the success of the System Test phase and, consequently, the success of
the whole project.

This Plan also lists the environments that will be used in the different phases of System Testing,
the resources, and their roles and responsibilities.

The System Test Plan documents the following:


 Schedule for developing the Test cases based on the Business and System Requirements.
 Schedule for planning and gathering Test data.
 Schedule for preparing the Test environment.
 Schedule for conducting the tests, reporting Test results, and re-testing as necessary.

The functionalities that will be tested as part of this project are summarized below at a high level:
 Sending vendor flat file format change
 Storage of Monthly Objective files by NASR
 NASR “Blocking” flag for Model and Geography level reporting
 NASR Mid-month BAC objective changes
 Reprocessing the 7 geographical levels at mid-month
 Summing historical objectives
 Data load Status screen updates
 NASR reporting changes based on the geographical levels and blocking settings, and the
footer change from “GMIA” to “SRDM”.

1.1 Objectives & Scope


The objective of System Testing is to test all the individual modules, routines and programs
which are in scope of the project. This will ensure that the components of the NASR
Geographical Objective Reporting solution are functioning as expected. The entire NASR
Geographical Objective Reporting solution is tested, together with interfaces to all relevant
legacy and external systems, for:
 Defect detection and removal.
 Ensuring that the system meets the approved requirements.
 Validating that the system works as expected by the user.
 Validating that the application can be utilized on all the recommended operating
system/browser configurations on which it is supposed to run.
 Verifying that the system as a whole meets the functional requirements.
 Verifying that the user interfaces meet the user interface requirements.

This System Test phase will be managed and executed by the Capgemini Testing team.
1.2 Assumptions, Exclusions & Constraints
Assumptions:
1. Development tasks are completed for the respective work streams as per the High Level
Schedule.
2. Unit and Component Integration Testing are completed.
3. All Defects from the Requirements Phase through the Coding Phase are fixed and resolved.
4. All code is centrally stored in PVCS.
5. All Test Environments are stable as expected.
6. System Testing will be performed in the Pre-Production environment.
7. Production data and necessary Login IDs are provided before System Testing in the Pre-
Production environment. (Note: pre-production data is essentially a copy of the NASR
production data.)
8. All necessary testing resources are available during testing.

Exclusions:
1. NASR reports and functionalities outside the scope of the requirements identified in the BRD
and SRS will not be tested. For example, reports generated on data retrieved from MADM,
Employee data, etc. are not impacted by this project and will not be tested in this phase.

Constraints:

1. Timely closure of all Component Integration Testing issues is necessary for the completion of
System Test as per schedule.
2. The Sending Vendor should send the file in a timely manner to the Pre-prod environment prior
to system integration testing.
3. The Hosting team should provide timely support to set up the Pre-prod environment.
4. There should be no delays from the CAB in moving code to Pre-prod.

1.3 Test Readiness Review


The System Test Readiness Review will be conducted prior to the start of the System Testing.
This review will verify the readiness of the testing environment, successful completion of the
previous test phase (Unit Testing) and approvals of the System Test Plans and Cases. Similar
Readiness reviews will be conducted prior to each testing phase.



1.4 System Test Entry Criteria
The following Entry and Exit Criteria will be used to enter and exit a Testing phase:

Entry Criteria:
 Ensure the availability of approved Test Plans.
 Ensure the availability of approved Test Cases.
 QA Test Lead review is available.
 Validate that all the Defects from the previous Testing phase have been fixed.
 Verify Pre-prod environment is available for testing as applicable.
 Validate that all hardware and software components in the environment are as specified in the
Test plan. If the Test plan is in error, a change control must be submitted.
 Confirm the application installed is the correct version for Testing.
 Identify if there are other activities in the environment which may impact testing. Ensure that
the testing schedule is adjusted if necessary to avoid a conflict of resources.
 Conduct a preliminary (Sanity) Test to ensure the installation is successful.
 Ensure the Data required for Testing is available.
 Confirm all user IDs to be used in Testing have been built and perform accurately.
 Obtain a sign-off from the Development Team to conduct Testing.
 Obtain change management approval for moving code to Pre-prod for testing.

Conduct Test and Exit Criteria:


 Conduct System and Integration Testing.
 Compare actual results to expected results.
 Investigate and resolve discrepancies/Defects.
 Retest the failed functionalities.
 Prepare the Test Results report.
 Present the Test Results report to the project team for approval.
 Ensure all defects are fixed. (In the event that not all defects can be fixed, the ST phase will
be exited only with the approval of the core project team and Project Stakeholders, including
Business and Sustain leads.)

Triage Process:
If the Development Manager feels that a defect is not valid, he should bring it to the attention of
the Test Manager. If the two do not agree, the defect will be brought to a triage meeting to
determine the proper course of action. The Development Manager, Test Manager and Program
Management should attend the regular triage meetings. Triage meetings will be scheduled to
start once testing and reviews of defects begin. Triage meetings are used to prioritize defects,
re-assign defects (if needed), and come to agreement on defects under dispute; they are not
meant to be used to discuss defect fixes. The Development Manager is responsible for setting
the agenda for this meeting.

1.5 Schedules for Test Execution


The Test execution schedule is covered in the NASR Project Schedule.

<Need to insert final version of Project Schedule>



2 Test Case Development



Activity | Reference
Specify System Test Cases | Refer 13697-ExpressProjectSchedule
Review and approve list of System Test Cases | Refer 13697-ExpressProjectSchedule

2.1 Test Case Execution


Test cases will be created to validate all the requirements and functionality described in the
scope of the project. These Test cases walk the Tester through the steps required to test the
functionality/requirements and document what the expected results should be.

System Testing includes the following:

Sanity Testing: The system will be allowed to run normally for a while to ensure normal
performance, without any abnormal errors/logs.

Regression Tests: System Testing includes running Regression tests, which are based on the
existing NASR Geographical Objective Reporting functionality from the last release.

Functional Tests: This covers all the functionalities developed as part of NASR Geographical
Objective Reporting project.

Data Conversion and Data Load Testing: This test will cover validation of the data in the DB
tables related to the 7 geographical level objectives (an illustrative check is sketched below).
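
As a minimal sketch of the kind of check these test cases perform, the Python fragment below
reconciles one geography level's vendor flat file against its loaded table; the table and column
names are assumptions, not the actual VSNA schema:

    # Illustrative data-load check; table and column names are assumed,
    # not taken from the actual VSNA schema.
    import csv

    def validate_objective_load(cursor, flat_file, level):
        """Reconcile row count and objective total between the vendor flat
        file and the loaded objectives table for one geography level."""
        with open(flat_file, newline="") as f:
            rows = list(csv.DictReader(f))
        file_count = len(rows)
        file_total = sum(int(r["objective"]) for r in rows)

        cursor.execute(
            "SELECT COUNT(*), COALESCE(SUM(objective), 0) "
            "FROM monthly_objectives WHERE geo_level = ?", (level,))
        db_count, db_total = cursor.fetchone()

        assert db_count == file_count, f"{level}: row count mismatch"
        assert db_total == file_total, f"{level}: objective total mismatch"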

System Integration Tests (SIT): SIT test cases include test steps covering external
components.

All test cases should be available in SourceForge and in TRoom, under:
Project: INE03_GMAD_NASR_Obj_AM, Documents > NASR Geographical Objective Reporting
> Plan and Define > Baselined

System testing includes the complete Test preparation procedure that is described in the
following sections.

2.1.1 Execution Sequence

The Test execution sequence describes the Test scenarios, Test cases and Test data that will be
used. The sequence is developed as follows.

Step 1: Identification of Test Scenarios and Test cases.


Step 2: Documenting the Test Cases from the Test Scenarios.
Step 3: Mapping of the Requirements and Use Cases with the Test Cases (RTM).
Step 4: Peer Review of the Test Cases. (Internal peer review within the supplier team +
acceptance review by GM)
Step 5: Updating the test cases as per Review comments.
Step 6: Obtaining Approval from the GM Project Manager after Review and re-work activities.

2.1.2 Functionalities in Scope


Following are the functionalities in scope for this project:
Business Requirements:



 Sending Vendor Process Changes
 NASR Storing Monthly Objective File(s)
 NASR Reporting Changes
 NASR Blocking Flag for Model and Geography Level Reporting
 NASR Mid-Month BAC Objective Changes
 Reprocessing the 7 Geographical Levels at Mid-Month
 Summing Historical Objectives
 Data Load Status Screen Updates

2.1.3 Data Acquisition


The following Test data is required in the testing environment:
 Flat files with Objective data for the 7 geographical levels from the Sending Vendor
 An existing copy of production data, copied into the pre-prod environment
 User ID and Password to log in to the system

National, Region, Section, Area, Zone, GMMN and BAC data will be loaded before the beginning
of System Testing; the Database Development team has taken responsibility for loading this
data. The necessary User IDs are a must for the beginning of System Testing; at least one ID for
each Role is needed for testing purposes. The Sustain team creates the login IDs on request.

2.1.4 Test Case Mapping


The Test cases will be mapped to the respective Use Cases and System Requirements in the
RTM to ensure that all requirements are adequately covered.

2.1.5 Validation
The System Test cases will be validated with the Test data available in the database for System
Testing. This validation requires database access for the testing team. If access is not available
to the testing team, the validation will be done with the help of the Development/Sustain team.

2.1.6 Estimates for Test Case Preparation

Test case preparation and review is one of the key drivers for the entire project. The current
approved version of the System Requirements Specification will be the reference document for
the Test cases.

The Capgemini Test team will prepare Test cases to ensure coverage of all the enhanced
features of NASR Geographical Objective Reporting. During Project Estimation, the Test case
preparation activity is estimated based on a set of data extracted from the Business
Requirements and System Requirements documents.

Estimation Approach:

 Understand the Requirements for NASR Geographical Objective Reporting in detail: the
Capgemini Test team actively reviews the various documents to understand the exact
Requirements.
 Classify the Requirements into simple, medium and complex. Understanding the
requirements prior to this activity is critical for the classification.
 Determine the effort required for Simple, Medium and Complex Requirements.
 Apply the effort uniformly across all the requirements (UC Level) to arrive at the Total Effort
required for preparing Test cases (a worked illustration follows).
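
For illustration only (hypothetical figures, not actual project estimates): if the requirements
classified from the SRS were 20 simple, 10 medium and 5 complex, with Test case preparation
effort of 2, 4 and 8 hours per requirement respectively, the Total Effort would be
20 x 2 + 10 x 4 + 5 x 8 = 120 hours.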

3 Test Preparation
3.1 Test Environment
Each component developed is required to undergo sufficient Testing in each phase. The team
needs to create and review Test cases and Test data as an entry point to a Testing phase, and
to produce successful Test results as an exit point from a Testing phase, before migrating to the
next phase of Testing.

3.1.1 Hardware

3.1.2 Software

The embedded spreadsheet "Hardware and Software Requirements.XLSX" contains the
Hardware and Software Requirements for Testing.

3.1.3 Tools

Source Forge is used for Defect management and PVCS for the code repository.

Source Forge URL:
https://coconet.capgemini.com/sf/tracker/do/listArtifacts/projects.in_os_NASR/tracker.defect_tracker

PVCS URL:
http://10.156.0.180:8080/GM2006_prod?jsp=login

3.2 Prerequisite Training

New testing resources will be trained on the application to bring them up to speed to run the
test cases.

3.3 Preparation Schedule

Following is the list of pre-requisites that have to be met before the Test Team can conduct
System Testing:

 Approved Requirements Specifications, System Requirements document and Design
document are the baselines for Testing. All mentioned items should be approved before the
testing phase begins.



 System Test cases (including updated RTM) should be completed, reviewed and approved
 Code Reviews of the implemented features should be completed prior to releasing the code
for Testing.
 Modules should be unit Tested by the Developers and unit Test logs handed over to QC
along with the Release Notes before System Testing phase begins.
 Test site setup and necessary connections to the Databases should be established.
 Complete creation of the Portal Test environment, including class files, Property files, DB
stored procedures, modified and newly added DB tables, and Configuration File updates.

Please refer to the Project Schedule, which can be found in Source Forge under "SDP-R10
Deliverables/Project Plan/Project Plan Schedule (NASR)", for the NASR Geographical Objective
Reporting planned schedule for Test preparation activities.

Following is the list of pre-requisites that have to be met before the Test team can conduct
System Testing (ST):

 Previous Testing completed with no issues outstanding (Unit, Regression).
 A System Test lead assigned for each of the in-scope NASR areas.
 System Test cases completed and reviewed.
 System Testing resources identified and scheduled for System Testing activities.
 Pre-prod site setup and necessary connections established.
 Test environments fully created.
 Test Data prepared and loaded.
 Tester access to Test environments and Test tools available (if necessary).

4 Test Execution

4.1 System Testing (ST)

Test Cases will be designed by the testing team according to the new requirements as per the
BRD and SRS.

System Testing will take place over a period in which Testers will undertake a dedicated effort to
Test the system against specific requirements. All system Testing must occur against
documented Test Cases.

System Test cases will be prepared by the Testing team in Capgemini. The Test Lead will review
the Test cases and manage the Testing team. The main responsibilities of the Test Lead are:

 Identifying the Testing resources for NASR.
 Coordinating Testers during the Testing Phase.
 Managing the progress of System Testing against the System Testing schedule.
 Ensuring Defects are logged and managed in a common shared location, preferably Source
Forge.
 Coordinating re-Testing of issues.
 Reporting daily System Testing progress to Management.
4.2 Location



The execution of System Testing will take place within the NASR Application (Pre-Prod
environment), using a direct link to NASR and navigating to the NASR Administration
functionality from there.

4.3 Testing Tasks


The Testing team shall test all the components/features developed for the new requirements
that came in for this release.

 Regression Testing – a set of regression Test cases based on existing NASR functionality.
 GUI and Functional Testing – Test cases that ensure the component functions as required.
 Negative Testing – Test cases that ensure the application handles invalid input properly.
 Boundary Value Testing – all text fields are tested for boundary values where the values are
available in the SRS (see the sketch after this list).
 Error Conditions – Test cases that ensure the component handles error conditions.
 Exception – Test cases that cover special logic and/or exceptions.
 Unexpected Data Input – Test cases that cover entry of invalid characters, for example a user
entering illegal characters in their user ID or unsupported characters in their address.
 Security Level Validation – Test cases will include the security checks for the privileges given
to the User. As per the Security Requirements, the system works on role-based access, and
data-level authentication is done based on dealer code and sales organization.
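
The sketch below illustrates the boundary value and negative cases described above, assuming
a hypothetical dealer (BAC) code field limited to 1-6 digits; the actual field limits come from the
SRS:

    # Hypothetical boundary-value and negative cases; the field and its
    # 1-6 digit limit are illustrative, not taken from the SRS.
    import unittest

    def accepts_dealer_code(value):
        """Stand-in for the application's validation of a dealer (BAC) code."""
        return value.isdigit() and 1 <= len(value) <= 6

    class DealerCodeBoundaryTests(unittest.TestCase):
        def test_boundary_values(self):
            self.assertTrue(accepts_dealer_code("1"))         # lower boundary
            self.assertTrue(accepts_dealer_code("123456"))    # upper boundary
            self.assertFalse(accepts_dealer_code(""))         # below lower
            self.assertFalse(accepts_dealer_code("1234567"))  # above upper

        def test_negative_input(self):
            self.assertFalse(accepts_dealer_code("AB$12"))    # illegal characters

    if __name__ == "__main__":
        unittest.main()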

A Penetration Test is a process that involves an active analysis of the system for any potential
vulnerabilities that may result from poor or improper system configuration, known and/or
unknown hardware or software flaws, or operational weaknesses in process or technical
countermeasures. The analysis is carried out from the position of a potential attacker, and can
involve active exploitation of security vulnerabilities. Any security issues that are found will be
presented to the system owner together with an assessment of their impact and, often, a
proposal for mitigation or a technical solution.
The HP Weblogic tool will be used to conduct the Web Scan for security testing.
Web application penetration testing refers to a set of services used to detect various security
issues with web applications.

The following tasks will be performed as part of NASR Geographical Objective Reporting Testing
activities:
 The Test Team will use the System Test cases to perform System Testing.
 The status of each Test case will be recorded in the Test case document.
 Defects will be raised in a shared defect log tool (in Source Forge) for all failed Test cases.
Each defect will be tagged with a priority. The priority levels that will be used are described in
the table below.

Priority Level | Description

P1 | Defects that will have very high visibility to the Customer (being 5 or more dealers) and/or
will have major impact to a given service such that it is viewed as unavailable by the end
Customer, and for which there is no workaround available. This Defect may also affect a critical
System, Service, or loss of access, which incapacitates an entire site or threatens to
compromise all sites.

P2 | Defects that will have high visibility to the Customer (being 5 or more dealers) and/or will
have high impact to a given service such that it is viewed as nearly unavailable by the end
Customer, but for which there is a proven workaround available. This Defect may also affect a
critical System, Service, or loss of access, which incapacitates an entire site or threatens to
compromise all sites.

P3 | Defects that will have limited visibility to the Customer and/or will have limited impact to a
given service such that it may be viewed as operational, but in a degraded mode, by the end
Customer, and for which there is a proven workaround available.

P4 | Defects that will have low visibility to the Customer and/or will have minor impact to a given
service, do not limit the end Customer in functionality, and for which there is a proven
workaround available. For example, loss of one server in a high-availability configuration with no
loss of performance.

 Fixed Defects will be verified by the Test team and closed in the shared defect log (in Source
Forge). The status of failed Test cases will be updated in the Test case document. In the case of
a wrong fix, the Defect will be reopened and reassigned to the Developer. This cycle will be
followed until the Defect is properly fixed and closed.
 At the end of Testing, the Test Lead will prepare a Test Report that provides details of the
Defects found during Testing. This will be reviewed by the Capgemini Development Manager
and Test Coordinator and delivered to the GM Project Manager.

4.4 Risks and Contingencies

All project risks and contingencies have been documented within a separate Issue and Risk log.
All issues are documented, prioritized, tracked and managed through the Issue & Risk
Management Process Flow.



1. Risk: Delay in the start of Testing may result in schedule delay of the project.
Mitigation Plan: The Capgemini Offshore/India lead will work closely with the Capgemini project
manager on the progress of the project and ensure that Testing starts on time.
Test Phase Impacted: System Testing

2. Risk: Delay in review and sign-off of the Test cases by GM.
Mitigation Plan: The Capgemini Offshore/India lead, with the help of the Capgemini project
manager, will work closely with the GM project manager to get the Test cases reviewed on time
as per the Schedule.
Test Phase Impacted: Unit Testing Results, System Testing

3. Risk: Any major changes to the Design/Data Model or requests for additional functionality
during the development phase will require a change in the effort, cost estimates and the overall
schedule.
Mitigation Plan: The Capgemini team will work with the GM Project Manager and follow the
change request procedure to incorporate the changes.
Test Phase Impacted: All

4. Risk: Non-availability of Test data.
Mitigation Plan: The Test Lead and Project Manager will work closely with the GM team on the
availability of Test data and/or Test environments in the Test planning phase to ensure that
environments and data will be available before Testing starts.
Test Phase Impacted: All
4.5 Test Resource
The following table lists the resources involved in various System Testing activities:

S.No. | Activity | Resource
1 | System Test Cases Design | Capgemini Test Team
2 | System Test Cases Review | Capgemini Test Team, Module Leads, GM Quality Coordinator
3 | System Testing (Functional Testing) | Capgemini Test Team
4 | Test Report and Metrics Report | Capgemini and GM Quality Coordinator
5 | Test Audit | Capgemini Test Lead, Capgemini Project Manager, GM Quality Coordinator and GM PM

4.6 Test Schedule


The Test schedule is documented in the NASR Geographical Objective Reporting Integrated
Project Schedule.

4.6.1 Overview of System Testing Schedule


 The System Testing process starts with the creation of the Test Plan and Test Cases.
 Once Test cases are prepared, they are reviewed.
 Once Test execution starts, defects are logged in SourceForge.
 The Test lead has to determine the number of cycles required for System Testing, depending
on the defects logged and reopened.
 The Test lead needs to prepare a summary report of the Test cases and the number of
defects from those Test cases, and report it to management.

Testing Milestone Schedule for NASR Geographical Objective Reporting

S.No. | Activity | Responsibility | Deliverables
1 | System Testing | Capgemini | 1. Test Strategy; 2. System Test Plan; 3. System Test Cases
Document; 4. System Test Results or Defect Summary Report



5 Acceptance Criteria
5.1 Pass/Fail Criteria
Each Test case/step will identify the correct Expected Results.
The user will execute the Test Cases and confirm the Expected Results or record the cause of
failure. The failure condition will be logged and reviewed by the project team for corrective
action.

The following will be the criteria for acceptance of the system:

 100% of the Test cases have been attempted.
 Execution of Test cases should be 100%. This will be true only if all the external interfaces
are completely available during Testing.
 All defect fixes have been Tested and verified.
 All reported Defects remaining open have been published and reviewed with the Project
Team, and the Core Project Team and Project Stakeholders have approved the resolution plan
for these Defects.
 The system behaves as per the specifications defined in the System Requirements
Document.

5.2 Suspension Criteria and Resumption Requirements

Testing will be suspended due to the following circumstances:

 Non-availability of the external sites needed to Test the interfaces of NASR Geographical
Objective Reporting to other sites.
 Unscheduled technical disruptions to the Testing Environments.
Testing will be resumed immediately upon the environment coming back online and will
continue from the point at which it had stopped.

Testing will also be suspended as follows:

 When a Tester escalates a Test defect to a Priority Level 1 (“catastrophic”) or Priority Level 2
(“Major functions disabled, workaround not available”), the project team will call a break in
Testing while an impact assessment is done.

Test Resumption:

Once a defect is resolved, the Project Team is notified. The Project Manager calls a meeting of
the development team, operations and the Test team. An action plan for implementing the fix in
the Test environment is established. A time for Test resumption is established.

Once the fix is successfully implemented, the Project Team is notified and informs the Test Team
that Testing can be re-started.



6 Test Tracking and Reporting

6.1 Recording Testing Events


 Test cases will be executed (System Testing) and updated with test results.
 Defects detected during the Testing phase will be logged in shared tool – Source Forge
and will be made available for the Developers for fixing. Fixed defects will be Re-Tested
and closed by the Test team.

6.2 Exception and Defect Handling


An unexpected Test event is something unplanned that occurs during Testing and could impact
the Test team's ability to successfully continue with Testing (e.g. Test data on which documented
Test results are based is accidentally deleted).

To handle unexpected Test events the following steps will be performed:


1. The Project team is notified immediately when an unexpected event occurs.
2. The Project team meets with the Test Developer, DBA, Project Manager, Test Team and 3rd
party vendors to discuss the impact.
3. An action plan to recover from or compensate for the event is identified and executed.

To handle Defect resolution, the following steps will be performed (a minimal lifecycle sketch
follows the list):

1. Once a defect is identified during Testing, the Tester will log the defect, tie it to the
appropriate Testing phase and assign a Severity and priority to the defect. System Testing
defects will be communicated by the Testers to the Developers through Defect Log. Test
lead will ensure that the Defect logged is a valid defect.
2. Those defects that must be fixed prior to deploying this release will be documented and
assigned an owner by the Application Test team based upon the Defect type assigned.
3. The Project Team will discuss the Error with the Defect owner and provide a copy of
documentation surrounding the Defect with a Screen Shot if it’s a Critical Defect.
4. The Defect owner will analyze the defect and establish a fix and an effort Estimate for
completing the fix.
5. The fix and the effort estimates will be reviewed with the Project team. The Project team
will evaluate and approve those fixes that must be incorporated into the second round of
Testing.
6. If an estimate for a fix exceeds the time available to meet the schedule, the issue will be
escalated to the Project Manager. The PM will either approve the fix and the change to the
project schedule, or reject the fix. If the PM decides to modify the project schedule based on a
defect, the appropriate stakeholders will be notified prior to the modification.
7. The application Test team will determine the configuration management impact of the
approved Testing change control. Document owners will be informed of the need for
updates.
8. Once a fix and its time estimates are accepted, the owner will complete the fix and Test it.
9. The owner will provide a fix impact statement to the various document owners so that
documentation may be updated to reflect system changes.
10. The application Test team will review the Test results and impact statements and approve
the fix for release into the next round of Testing.
11. After all of the required fixes are approved by the application Test team, the Project team
will either resume a suspended Testing round, or the next round of Testing begins.



12. Defects found during Testing will be tracked and documented for the consolidated Test
Results review report.
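
The resolution cycle above can be summarized as a small state model. The states and
transitions in this Python sketch follow the steps as described; they are not the Source Forge
tracker's actual workflow:

    # Sketch of the defect lifecycle described above; states and transitions
    # are illustrative, not the Source Forge tracker's actual workflow.
    VALID_TRANSITIONS = {
        "Open":     {"Fixed"},              # developer delivers a fix
        "Fixed":    {"Closed", "Reopened"}, # test team retests the fix
        "Reopened": {"Fixed"},              # a wrong fix goes back to the developer
        "Closed":   set(),                  # verified; the cycle ends
    }

    class Defect:
        def __init__(self, defect_id, priority):
            self.defect_id = defect_id
            self.priority = priority  # P1-P4 per the table in section 4.3
            self.state = "Open"

        def transition(self, new_state):
            if new_state not in VALID_TRANSITIONS[self.state]:
                raise ValueError(f"{self.state} -> {new_state} not allowed")
            self.state = new_state

    # The retest cycle repeats until the defect is properly fixed and closed:
    d = Defect("NASR-101", "P2")
    d.transition("Fixed"); d.transition("Reopened")  # wrong fix, reopened
    d.transition("Fixed"); d.transition("Closed")    # verified and closed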
6.3 Test Progress Planning and Tracking

A daily status meeting will be held with the testing teams (Project Manager, System Testing
team) to ensure Test progress is tracked as per schedule. These meetings will also be used by
the Test lead to provide Defect logs and to discuss Defects and Defect resolution.

A Test Report will be prepared at the end of each phase of Testing by Capgemini, using the
Defect tracking tool, SourceForge. It will contain module-wise Defect statistics with priority
levels. The report will also contain metrics specific to Testing, as defined below:

Metric | Definition
Defect Density Rate | Indicates the number of defects per Use Case point.
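
For example (hypothetical figures): 30 defects logged against functionality sized at 120 Use
Case points gives a Defect Density Rate of 30 / 120 = 0.25 defects per Use Case point.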

Informal Testing progress reports reflecting the number of Test cases planned, executed,
passed and failed, the number of defects opened and closed in the corresponding Testing cycle,
and the status of critical defects will be made available to GM in the form of email
communication or project updates at a pre-agreed frequency during the Testing cycle.



7 Attachments

7.1 Appendix 1 - Roles & Responsibilities

Type of Testing | Environment | Location | Primary Owner | Secondary Owner
Unit Testing | Development | Supplier Location | Capgemini | None
Integration Testing | Development | Capgemini | Capgemini | Not Applicable
Regression Testing | Pre-Prod | Supplier Location | Capgemini | Not Applicable
System Testing | Pre-Prod | Supplier Location | Capgemini | None
System Integration Testing | Pre-Prod | Supplier Location | Capgemini | None
UAT | Pre-Prod | Supplier Location | GM | Capgemini
Performance Testing | Pre-Prod | Supplier Location | Capgemini | None
Technical Launch | Production | Supplier Location | Capgemini/HP/GM | None
