Testing Approach


The proposed testing process will follow a detailed test planning and execution cycle based on the System Test Life Cycle (STLC). A brief overview of the test planning activities is provided below.

Baselining of Artifacts


The following artifacts shall be baselined by the QA Architect, in agreement with the BA and the identified stakeholders, before the QA phase starts:

1. Requirements Document.
2. Architecture and Design Documents.
3. End to End Use Cases.
4. Module / Sub-System level interfaces.
5. Security Considerations.

Test Strategy & Test Planning


For the test strategy and planning phases, a systematic approach shall be followed that helps identify all the test scenarios, increasing their effectiveness in detecting defects.

[Process flow: User Scenario Analysis -> Application Scenario Analysis -> Use Cases Modelling -> Data Coupling & Control Coupling Analysis -> Succession Analysis -> Test Data Identification (Multi-tenancy) -> Final Sign-Off]
• User Scenario Analysis: Study the different types of users operating on every subsystem and perform User Behavior Analysis to understand all the actions that shall be performed by different users. A user can be a human interacting with the subsystems, another subsystem interacting with them, or infrastructure that can affect the state or behavior of the system.

• Application Scenario Analysis: Study the different modes in which a subsystem can be configured and operated within the overall ecosystem. Associate modes / configurations with different users and identify all the combinations. This provides a detailed analysis of the different configurations of the subsystems for every user. This type of analysis is particularly important when the systems are multi-tenant.

• Use Cases Modelling: Identify all use cases for every user and application scenario, and document all the use cases as test scenarios.

• Data Coupling and Control Coupling Analysis: Study the architecture and design and identify all the interfaces at the subsystem level. Model the entire solution at a black-box level to understand the external data input into and output from the solution. Based on this, analyze the flow of data across internal subsystems and the way each data point affects the state / operation of the subsystem. This ensures that data integrity is maintained and that each subsystem is robust enough to rule out denial-of-service (DoS) or data-loss possibilities; a sketch of such a check is given after this list.

• Succession Analysis: After the above analyses are done, the team shall analyze the existing test strategies, test scenarios and test data to assess the test sufficiency achieved in the existing live versions. If any major gaps are identified, the test team shall provide details of the gaps along with a plan to close them. This shall be considered technical debt and taken up as a separate phase outside the current project testing scope.

• This analysis shall specifically focus on the change modes of all features (change-independent, change-impacted, impacting or impacted by other features), and assess feature maturity, failure impacts, etc., to identify the new set of test cases without any redundancy.

• Test Data Identification (Special Focus on Multi-Tenancy Features): This shall include the study of tenant-specific data and of the provisions in the solution that ensure data protection and prevent data corruption. It shall also focus on identifying the most optimal combination of data and configurations that provides high data coverage; a sketch of such combination enumeration is given after this list.

• Test Cases Writing: Document all the test cases to be used for execution, based on the use cases or test scenarios identified, and associate them with the configurations and data identified in the previous steps.

• Test Execution & Defect Reporting: Execute the test cases as per the identified plan on the version identified for testing. Log any defects found in the process and track them as per the Defect Management Guidelines set by the stakeholders.

• Final Sign-Off: Execute all the identified acceptance test cases on the target environment as agreed between the stakeholders and the Happiest Minds team. Obtain sign-off based on meeting the agreed PASS criteria.
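
As an illustration of the Data Coupling and Control Coupling Analysis described above, the following is a minimal sketch, in Python, of a data-coupling completeness check over a dictionary-based interface map. The subsystem names and data points are hypothetical placeholders, not taken from the actual solution.

# Illustrative data-coupling completeness check.
# Subsystem and data-point names are hypothetical, not from the solution.

# Each subsystem declares the data points it consumes and produces.
interfaces = {
    "device_gateway": {"consumes": {"device_telemetry"}, "produces": {"normalized_events"}},
    "rules_engine":   {"consumes": {"normalized_events"}, "produces": {"alerts"}},
    "web_portal":     {"consumes": {"alerts"},            "produces": {"admin_reports"}},
}

external_inputs = {"device_telemetry"}   # data entering the solution (black-box view)
external_outputs = {"admin_reports"}     # data leaving the solution

all_consumed = set().union(*(s["consumes"] for s in interfaces.values()))
all_produced = set().union(*(s["produces"] for s in interfaces.values()))

# Every external input must be consumed by at least one subsystem.
unhandled_inputs = external_inputs - all_consumed
# Every produced data point must be consumed internally or exposed externally.
orphan_outputs = all_produced - all_consumed - external_outputs

print("Unhandled external inputs:", unhandled_inputs or "none")
print("Orphan data points:", orphan_outputs or "none")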
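
As an illustration of Test Data Identification for multi-tenancy, the following is a minimal sketch of enumerating tenant / configuration / data combinations and filtering out invalid ones. The tenant names, roles, isolation modes and the constraint are assumptions for illustration; in practice a pairwise-combination tool would typically reduce the set further.

# Illustrative enumeration of tenant / configuration / data combinations.
# Tenant names, roles and configuration values are hypothetical.
from itertools import product

tenants = ["tenant_a", "tenant_b"]
roles = ["admin", "operator", "viewer"]
data_isolation = ["shared_schema", "schema_per_tenant"]

def is_valid(tenant, role, isolation):
    # Example constraint: viewer scenarios are only run against the shared-schema setup.
    return not (role == "viewer" and isolation == "schema_per_tenant")

combinations = [
    {"tenant": t, "role": r, "isolation": i}
    for t, r, i in product(tenants, roles, data_isolation)
    if is_valid(t, r, i)
]

for combo in combinations:
    print(combo)
print(len(combinations), "combinations selected for test data preparation")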

Testing Levels:
The overall testing will be carried out at the following levels:
• API Testing
• Sprint Level Testing
• Component / Sub System Level Testing
• E2E Integration Testing
• User Acceptance Testing

API Testing
This testing focuses on the validation of API data using Postman, to verify that the APIs are working as per the requirements.
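
As an illustration of the kind of check described above, the following is a minimal sketch written in Python with the requests library and a pytest-style test function, rather than in Postman itself; the base URL, endpoint and response fields are assumptions, not the actual API contract.

# Illustrative API check, equivalent in intent to a Postman test.
# The base URL, endpoint and response fields are assumptions for the sketch.
import requests

BASE_URL = "https://qa.example.com/api/v1"  # hypothetical QA environment URL

def test_get_device_returns_expected_fields():
    # Call a hypothetical device endpoint and verify status and response fields.
    response = requests.get(BASE_URL + "/devices/123", timeout=10)
    assert response.status_code == 200
    body = response.json()
    assert "deviceId" in body   # fields are illustrative, per assumed requirements
    assert "status" in body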

Sprint Level Testing


This testing level focuses on testing the stories taken up for implementation in a sprint.
Every story will be analyzed by the QA team for its impact on already existing functionality, as part of Feature Interaction Analysis (FIA), and for its impact on other stories in the same sprint, as part of Story Interaction Analysis (SIA).
Test scenarios, test data and test cases will be derived at a story level, and these shall be executed against the Transfer-to-Test (T2T) build.
The story-level test cases and related execution metrics are traced to a feature in the "Feature and Requirement Catalog" that is maintained by the QA team for E2E traceability of Features to Requirements -> Stories -> Test Cases.
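
As an illustration of this E2E traceability, the following is a minimal sketch of a Feature -> Requirements -> Stories -> Test Cases mapping with a simple completeness check; all identifiers are hypothetical placeholders, not entries from the actual "Feature and Requirement Catalog".

# Illustrative traceability structure: Feature -> Requirements -> Stories -> Test Cases.
# All identifiers are hypothetical placeholders.
catalog = {
    "FEAT-01": {
        "requirements": ["REQ-101"],
        "stories": {
            "STORY-11": ["TC-111", "TC-112"],
            "STORY-12": ["TC-121"],
        },
    },
}

# Completeness check: every story in the catalog must trace to at least one test case.
for feature, entry in catalog.items():
    for story, test_cases in entry["stories"].items():
        status = f"{len(test_cases)} test case(s) traced" if test_cases else "no test cases traced"
        print(f"{feature} / {story}: {status}")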

Component / Sub System Level Testing


This testing level focuses on the validation of a component / module / sub system and its compatibility with all the other components / modules / sub systems as part of the overall solution. This testing is done by an independent integration tester outside the sprint. The focus of this level is to ensure that the sub system is considered in a holistic manner with respect to E2E integration scenarios, rather than with a narrow focus on a single story.
The objective is to ensure that the sub systems conform to all the requirements (functional and non-functional), the interfaces, and the input and output sanitization requirements.
The following activities are carried out in this testing phase:
• Installation and configuration of the sub system from the repository, following the Developer Guide / other documentation
• Validation of all the interfaces exposed, for compatibility with other sub systems, in the QA and Pre-Production environments
• Selective regression of the test cases for the component, identified from the sprint test cases
• Detailed analysis of the triggers for tests on the different components, with the test logic exercised and validated. Where no interfaces are available to test a component / sub system independently, multiple components / sub systems will be clubbed together and treated as a single sub system
• Test analysis, design and execution of component-level test cases. This is needed as the stories are generally sliced vertically, focused on end-user functionality rather than component functionality

E2E Integration Testing


This testing level focuses on the E2E use cases, which will be arrived at in discussion with the Business Analyst and the customer stakeholders. The focus is at an overall solution level, for example: a new device is registered by a customer in the Web Portal, the device starts emitting data, and the information is visible to an admin interested in viewing the details for a particular account.
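
The example E2E use case above could be captured as an ordered test skeleton, as in the minimal sketch below; all helper functions and identifiers are placeholders for the actual portal, device and reporting interactions.

# Skeleton of the E2E use case described above, expressed as ordered test steps.
# All helper functions and identifiers are placeholders.

def register_device_in_portal(account_id, device_id):
    raise NotImplementedError("placeholder: drive the Web Portal registration flow")

def wait_for_device_data(device_id, timeout_s=60):
    raise NotImplementedError("placeholder: poll until the device starts emitting data")

def admin_can_view_device_details(account_id, device_id):
    raise NotImplementedError("placeholder: query the admin view for the account")

def test_e2e_device_registration_to_admin_visibility():
    account_id, device_id = "ACC-001", "DEV-001"        # illustrative identifiers
    register_device_in_portal(account_id, device_id)    # step 1: customer registers the device
    wait_for_device_data(device_id)                      # step 2: device starts emitting data
    assert admin_can_view_device_details(account_id, device_id)  # step 3: admin can view details
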
User Acceptance Testing
This testing level focuses on acceptance testing with the stakeholders, on successful completion of which the identified artifacts are handed over to the stakeholders.
UAT test cases will be identified at the Story, EPIC, Component and E2E User Scenario levels and will be shared with the stakeholders. The final UAT test cases are agreed upon between the Happiest Minds and stakeholder teams.
UAT will be performed in the Pre-Production environment with actual devices or simulators, as agreed between the Happiest Minds and stakeholder teams within the scope of the project.

Review Test Results


Test results shall be reviewed in Test Execution Status meetings, with a focus on test execution progress, defects found, assessment of defect severity and reporting of issues.

Changes in Baselined Artifacts


Any changes to the baselined artifacts that are referred to in the Test Strategy document shall be treated as a CR and shall be approved by the sponsors.
Based on the nature of the changes, the impact on the Test Plan and Test Case documents shall be assessed and the documents shall be updated. The changes in the QA artifacts have to undergo the same approval process as followed for the approval of this document.
Changes in Test Strategy and Test Plan
The changes in the Test Strategy and Test Plan have to undergo the same approval process as followed for the approval of this document.
Changes in User Acceptance Test Case Document
Any changes to the baselined User Acceptance Test Case document shall be approved by the UAT Approval team identified from the Happiest Minds and stakeholder teams.
The changes can be in the nature of addition / modification / removal of test types or test cases, changes in test data, etc.
