NASR SystemTestPlan-v0.1
Approved By:
Approval Date:
This System Test Plan will help the team track progress against its goals, identify and control risks, and maximize the success of the System Test phase and, consequently, of the whole project.
This plan also lists the environments that will be used in the different phases of System Testing, the resources, and their roles and responsibilities.
The functionalities that will be tested as part of this project are summarized below at a high level:
Sending vendor flat file format change
Storage of Monthly Objective files by NASR
NASR “Blocking” flag for Model and Geography level reporting
NASR Mid-month BAC objective changes
Reprocessing the 7 geographical levels at mid-month
Summing historical objectives
Data load Status screen updates
NASR reporting changes based on the geographical levels and blocking settings, and the footer change from “GMIA” to “SRDM”.
This System Test phase will be managed and executed by the Capgemini Testing team.
1.2 Assumptions, Exclusions & Constraints
Assumptions:
1. The development tasks are completed for the respective work streams as per the High Level Schedule.
2. Unit and component integration testing are completed.
3. All defects raised from the Requirements phase through the Coding phase are fixed and closed.
4. All the code is centrally stored in PVCS.
5. All the test environments are stable.
6. System Testing will be performed in the Pre-Production environment.
7. Production data and necessary Login IDs are provided before System testing in Pre
production environment. (Note: pre-production data is essentially a copy of the NASR
production data).
8. All the necessary testing resources are available during testing.
Exclusions:
1. NASR reports and functionalities outside the scope of the requirements identified in the BRD and SRS will not be tested. For example, reports generated on data retrieved from MADM, employee data, etc. are not impacted by this project and will not be tested in this phase.
Constraints:
1. Timely closure of all Component Integration Testing issues is necessary for the completion of System Test as per the schedule.
2. The sending vendor should deliver the file to the Pre-prod environment in a timely manner, prior to system integration testing.
3. The Hosting team should provide timely support to set up the Pre-prod environment.
4. There should be no delays from the CAB in moving code to Pre-prod.
Entry Criteria:
Ensure the availability of approved Test Plans.
Ensure the availability of approved Test Cases.
QA Test Lead review is available.
Validate that all the Defects from the previous Testing phase have been fixed.
Verify Pre-prod environment is available for testing as applicable.
Validate that all hardware and software components in the environment are as specified in the Test Plan. If the Test Plan is in error, a change control must be submitted.
Confirm the application installed is the correct version for Testing.
Identify whether there are other activities in the environment that may impact testing, and ensure that the testing schedule is adjusted if necessary to avoid resource conflicts.
Conduct a preliminary (sanity) test to ensure the installation is successful.
Ensure Data required for Testing is available.
Confirm all user IDs to be used in testing have been built and perform accurately.
Obtain a sign-off from the Development Team to conduct Testing.
Obtain change management approval for moving code to Pre-prod for testing.
Triage Process
If the Development Manager believes that a defect is not valid, they should bring it to the attention of the Test Manager. If the two do not agree, the defect will be brought to a triage meeting to determine the proper course of action. The Development Manager, Test Manager, and Program Management should attend the regular triage meetings, which will be scheduled to start once testing and defect reviews begin. Triage meetings are used to prioritize defects, reassign defects (if needed), and reach agreement on disputed defects; they are not meant for discussing defect fixes. The Development Manager is in charge of setting the agenda for this meeting.
Sanity Testing: The system will be allowed to run normally for a period to confirm normal performance, without any abnormal errors or log entries.
Regression Tests: System Testing includes running regression tests based on the functionality of the last NASR Geographical Objective Reporting release.
Functional Tests: These cover all the functionalities developed as part of the NASR Geographical Objective Reporting project.
Data Conversion and Data Load Testing: This test covers the validation of data in the database tables related to the 7 geographical level objectives; a sketch of such a check appears after this list.
System Integration Tests (SIT): SIT test cases include test steps covering external components.
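As an illustration of the data load validation above, the following is a minimal sketch in Python. The table name OBJECTIVES, the column GEO_LEVEL, and the use of pyodbc are assumptions for illustration only; the actual NASR schema and connection details would come from the SRS and the environment setup.

    import pyodbc  # assumes an ODBC driver is configured for the pre-prod database

    # The seven geographical levels loaded before System Testing begins.
    GEO_LEVELS = ["National", "Region", "Section", "Area", "Zone", "GMMN", "BAC"]

    def validate_objective_load(conn_str, table="OBJECTIVES"):
        """Check that each geographical level has objective rows loaded.
        Table and column names are illustrative placeholders, not the real schema."""
        conn = pyodbc.connect(conn_str)
        cursor = conn.cursor()
        for level in GEO_LEVELS:
            cursor.execute("SELECT COUNT(*) FROM " + table + " WHERE GEO_LEVEL = ?", level)
            count = cursor.fetchone()[0]
            print(level, ":", count, "rows loaded -", "PASS" if count > 0 else "FAIL")
        conn.close()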
System Testing includes the complete test preparation procedure described in the following sections.
The test execution sequence describes the test scenarios, test cases, and test data that will be used. The sequences are developed as follows.
National, Region, Section, Area, Zone, GMMN, and BAC data will be loaded before the beginning of System Testing; the Database Development team has taken responsibility for loading this data. The necessary user IDs are a prerequisite for the start of System Testing, with at least one ID for each role needed for testing purposes. The Sustain team creates the login IDs upon request.
2.1.5 Validation
The system test cases will be validated against the test data available in the database for System Testing. This validation requires that the testing team have database access; if such access is not available, the validation will be done with the help of the Development/Sustain team.
Test case review and preparation is one of the key drivers for the entire project. The current approved version of the System Requirements Specification will be the reference document for the test cases.
The Capgemini Test team will write test cases to ensure coverage of all the enhanced features of NASR Geographical Objective Reporting. During project estimation, the test case preparation activity is estimated based on a set of data extracted from the Business Requirements and System Requirements documents.
Estimation Approach:
3 Test Preparation
3.1 Test Environment
Each component developed is required to undergo sufficient testing at each phase. The team needs to create and review test cases and test data as the entry point to a testing phase, and to obtain successful test results as the exit point before migrating to the next phase of testing.
3.1.1 Hardware
3.1.2 Software
The spreadsheet below contains the hardware and software requirements for testing:
Hardware and Software Requirements.XLSX
3.1.3 Tools
SourceForge for defect management and PVCS for the code repository are the tools used.
For PVCS:
http://10.156.0.180:8080/GM2006_prod?jsp=login
New testing resources will be trained on the application to bring them up to speed so that they can run the test cases.
The following pre-requisites must be met before the Test team can conduct System Testing:
Please refer to the Project Schedule, which can be found in SourceForge under “SDP-R10 Deliverables/Project Plan/Project Plan Schedule (NASR)”, for the planned schedule of NASR Geographical Objective Reporting test preparation activities.
4 Test Execution
Test cases will be designed by the testing team according to the new requirements in the BRD and SRS.
System Testing will take place over a period in which testers will make a dedicated effort to test the system against specific requirements. All System Testing must occur against documented test cases.
System test cases will be prepared by the Capgemini Testing team. The Test Lead will review the test cases and manage the testing team. The test cases will cover the following areas:
Regression Testing – A set of regression test cases based on existing NASR functionality.
GUI and Functional Testing – Test cases that ensure the component functions as required.
Negative Testing – Test cases to ensure that the application handles invalid input properly.
Boundary Value Testing – All text fields are tested for boundary values where the values are available in the SRS (see the sketch after this list).
Error Conditions – Test cases that ensure the component handles error conditions.
Exception – Test cases that cover special logic and/or exceptions.
Unexpected Data Input – Test cases covering entry of invalid characters, for example a user entering illegal characters in their user ID or unsupported characters in their address.
Security Level Validation – Test cases will include checks of the security and the privileges given to the user. As per the security requirements, the system works on role-based access, and data-level authentication is done based on dealer code and sales organization.
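As an illustration of the boundary value and unexpected data input checks above, here is a minimal pytest sketch. The validate_user_id function and the 8-character limit are hypothetical stand-ins; the real field rules and lengths would come from the SRS.

    import re
    import pytest

    MAX_USER_ID_LEN = 8  # hypothetical boundary; the real limit would come from the SRS

    def validate_user_id(user_id):
        """Illustrative stand-in for the application's user ID validation."""
        if not 1 <= len(user_id) <= MAX_USER_ID_LEN:
            return False
        return re.fullmatch(r"[A-Za-z0-9]+", user_id) is not None

    @pytest.mark.parametrize("user_id, expected", [
        ("A", True),                           # lower boundary: minimum length
        ("A" * MAX_USER_ID_LEN, True),         # upper boundary: maximum length
        ("", False),                           # below the lower boundary
        ("A" * (MAX_USER_ID_LEN + 1), False),  # above the upper boundary
        ("user;DROP", False),                  # negative test: illegal characters
    ])
    def test_user_id_boundaries(user_id, expected):
        assert validate_user_id(user_id) == expected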
A penetration test is a process that involves an active analysis of the system for potential vulnerabilities that may result from poor or improper system configuration, known and/or unknown hardware or software flaws, or operational weaknesses in process or technical countermeasures. The analysis is carried out from the position of a potential attacker and can involve active exploitation of security vulnerabilities. Any security issues that are found will be presented to the system owner together with an assessment of their impact and, often, a proposal for mitigation or a technical solution.
The HP Weblogic tool will be used to conduct the Web Scan for security testing.
Web application penetration testing refers to a set of services used to detect various security issues with web applications.
The following tasks will be performed as part of the NASR Geographical Objective Reporting testing activities:
The Test team will use the system test cases to perform System Testing.
The status of each test case will be recorded in the test case document.
Defects will be raised for all failed test cases in a shared defect log tool (in SourceForge).
Each defect will be tagged with a priority. The priority levels that will be used are listed in the table below.
Fixed defects will be verified by the Test team and closed in the shared SourceForge folder.
The status of failed test cases will be updated in the test case document. In case of an incorrect fix, the defect will be reopened and reassigned to the developer. This cycle will be repeated until the defect is properly fixed and closed.
At the end of testing, the Test Lead will prepare a Test Report that provides details about the defects found during testing. This report will be reviewed by the Capgemini Development Manager and Test Coordinator and delivered to the GM Project Manager.
All project risks and contingencies are documented in a separate Issue and Risk log. All issues are documented, prioritized, tracked, and managed through the Issue & Risk Management Process Flow.
S. No.  Activity                              Resource
1       System Test Cases Design              Capgemini Test Team
2       System Test Cases Review              Capgemini Test Team, Module Leads, GM Quality Coordinator
3       System Testing (Functional Testing)   Capgemini Test Team
4       Test Report and Metrics Report        Capgemini and GM Quality Coordinator
5       Test Audit                            Capgemini Test Lead, Capgemini Project Manager, GM Quality Coordinator and GM PM
Non-availability of the external sites needed to test the interfaces of NASR Geographical Objective Reporting to other sites.
Unscheduled technical disruptions to the testing environments.
Testing will resume immediately once the environment comes back online and will continue from the point at which it stopped.
When a tester escalates a test defect to Priority Level 1 (“catastrophic”) or Priority Level 2 (“major functions disabled, workaround not available”), the project team will call a break in testing while an impact assessment is performed.
Test Resumption:
Once a defect is resolved, the Project Team is notified. The Project Manager calls a meeting of the development team, operations, and the Test team. An action plan for implementing the fix in the test environment is established, and a time for test resumption is set.
Once the fix is successfully implemented, the Project Team is notified and informs the Test Team that testing can be restarted.
1. Once a defect is identified during testing, the tester will log the defect, tie it to the appropriate testing phase, and assign a severity and priority to it. System Testing defects will be communicated by the testers to the developers through the Defect Log. The Test Lead will ensure that each logged defect is a valid defect.
2. Defects that must be fixed prior to deploying this release will be documented and assigned an owner by the Application Test team based on the defect type assigned.
3. The Project Team will discuss the error with the defect owner and provide a copy of the documentation surrounding the defect, with a screenshot if it is a critical defect.
4. The Defect owner will analyze the defect and establish a fix and an effort Estimate for
completing the fix.
5. The fix and the effort estimates will be reviewed with the Project team. The Project team
will evaluate and approve those fixes that must be incorporated into the second round of
Testing.
6. If an estimate for a fix exceeds the time available to meet the schedule, the issue will be escalated to the Project Manager. The PM will either approve the fix and the change to the project schedule, or reject the fix. If the PM decides to modify the project schedule based on a defect, the appropriate stakeholders will be notified prior to the modification.
7. The application Test team will determine the configuration management impact of the
approved Testing change control. Document owners will be informed of the need for
updates.
8. Once a fix and its time estimate are accepted, the owner will complete the fix and test it.
9. The owner will provide a fix impact statement to the various document owners so that
documentation may be updated to reflect system changes.
10. The application Test team will review the Test results and impact statements and approve
the fix for release into the next round of Testing.
11. After all of the required fixes are approved by the application Test team, the Project team
will either resume a suspended Testing round, or the next round of Testing begins.
A daily status meeting will be held with the testing teams (Project Manager and System Testing team) to ensure test progress is tracked as per the schedule. These meetings will also be used by the Test Lead to provide defect logs and to discuss defects and defect resolution.
A Test Report will be prepared at the end of each phase of testing by Capgemini, using the defect tracking tool, SourceForge. It will contain module-wise defect statistics with priority levels. The report will also contain the testing-specific metrics listed below:
Metric                Definition
Defect Density Rate   Indicates the number of defects per use case point.
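For example, if 12 defects were found against a project sized at 40 use case points, the defect density rate would be 12 / 40 = 0.3 defects per use case point (illustrative figures only).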
Informal testing progress reports reflecting the number of test cases planned, executed, passed, and failed, the number of defects opened and closed in the corresponding testing cycle, and the status of critical defects will be made available to GM in the form of email communication or project updates at a pre-agreed frequency during the testing cycle.
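As a simple illustration of how these counts could be tallied from the test case document, here is a minimal Python sketch; the status vocabulary ("Passed", "Failed", "Not Run") and the record layout are assumptions for illustration, not the project's actual format.

    from collections import Counter

    # Hypothetical excerpt of statuses recorded in the test case document.
    test_cases = [
        {"id": "TC-01", "status": "Passed"},
        {"id": "TC-02", "status": "Failed"},
        {"id": "TC-03", "status": "Not Run"},
    ]

    counts = Counter(tc["status"] for tc in test_cases)
    planned = len(test_cases)
    executed = counts["Passed"] + counts["Failed"]
    print("Planned:", planned, "Executed:", executed,
          "Passed:", counts["Passed"], "Failed:", counts["Failed"])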