Manual Part 2
Project Level documents (prepared by the Tester):
1. Test Procedure/Design
2. Test Script/Execution document (Test Proof)
3. Defect Report
Test Policy-
The Test Policy document is prepared by the Testing Head (TH).
The Test Policy document defines the objective of testing (e.g., how much revenue can be generated from testing).
The Test Policy is a Company Level document.
Test Strategy-
In the Test Strategy stage, the Test Strategist (TS) & PM work together.
The Test Strategy defines the strategy/approach for the project (e.g., Automation: Java language + Selenium tool).
The Test Strategy is a Company Level document.
Test Methodology-
In the Test Methodology stage, the PM works.
The Test Methodology defines which testing is performed in the different stages/environments.
In the Test Methodology stage, the PM prepares the TRM (Test Responsibility Matrix).
Test Plan-
In the Test Plan stage, the Test Lead works.
The Test Plan is a Project Level document.
The Test Plan defines resources (Testers/Environments), resource allocation, job allocation, estimation, etc.
Test Summary Report / Test Closure-
In the Test Summary Report: for every US - TCD, TCE, and the number of defects found.
In Test Closure: the final report against the module, prepared by the Test Lead.
Testing Process-
BRS → SRS/FRS/CRS → Development → (Install Build) → Testing
Test Plan-
The Test Plan is prepared by the Test Lead.
While preparing the Test Plan, these parameters are the focus:
1. Resource allocation
2. Job allocation
3. Estimation
The main purpose of the Test Plan is to decide the start date and end date of testing.
Interview question –
SRS/FRS (BA) → Use Case/User Story (BA) → Test Scenario (Tester) → Test Cases (Tester)
SRS/FRS-
SRS is defined as the Software Requirement Specification; FRS is the Functional Requirement Specification.
The SRS defines the functional requirements to be developed & the system requirements that will be used.
The SRS is prepared by the BA.
The SRS is derived from the BRS.
The SRS contains:
1. Functional requirements
2. Functional flow diagrams
3. Use cases
4. Screenshots
Ex. An SRS document is prepared per module (e.g., an SRS may contain 25 use cases).
Use Case-
A use case defines a specific single requirement.
In Agile, use cases are called User Stories.
Use cases are prepared by the BA.
Use cases are derived from the SRS.
A use case contains:
1. Description - details about the use case/user story
2. Acceptance criteria - the do's & don'ts of the US
Ex. Paytm - FASTag - Use Case: FASTag for 4-wheeler vehicles.
Test Scenario-
Ex. Login page:
Username-
Password-
Submit
Test scenarios-
1. Verify the username text box by passing a mobile no. on the login page
2. Verify the username text box by passing an email id on the login page
3. Verify the password text box by passing a combination of 4 to 6 characters on the login page
4. Verify the Submit button functionality on the login page
Test Cases-
Ex. Test scenario - Verify the username text box by passing an email id on the login page
Ex. Test cases-
1. Verify the username text box by passing a @gmail.com id on the login page
2. Verify the username text box by passing a @yahoo.com id on the login page
3. Verify the username text box by passing a @hotmail.com id on the login page
4. Verify the username text box by passing an @outlook.com id on the login page
5. Verify the username text box by passing a @rediffmail.com id on the login page
6. Verify the username text box by passing a @company.com id on the login page
7. Verify the username text box by passing an invalid @gmail.com id on the login page
8. Verify the username text box by passing an invalid @yahoo.com/@hotmail.com id on the login page
9. Verify the username text box by passing a null/blank email id on the login page
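The email test cases above only vary the domain; a minimal sketch of the kind of well-formedness check the username field might implement (the regex and function name are illustrative assumptions, not from the notes):

```python
import re

# Accepts any "local@domain.tld" shape; a real application would have
# its own, stricter policy.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def is_valid_email(value: str) -> bool:
    """Return True for a well-formed email id, False for invalid/blank."""
    return bool(value) and EMAIL_RE.match(value) is not None

print(is_valid_email("user@gmail.com"))  # -> True
print(is_valid_email("user@"))           # -> False (invalid id)
print(is_valid_email(""))                # -> False (null/blank)
```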
Ex. Test scenario - Verify the password text box by passing a combination of 4 to 6 characters on the login page
Test cases -
1. Verify by passing the characters "Abcdef" into the password text box on the login page
2. Verify by passing the characters "ABcdef" into the password text box on the login page
3. Verify by passing the characters "ABCdef" into the password text box on the login page
4. Verify by passing the characters "ABCDef" into the password text box on the login page
5. Verify by passing the characters "abcdEF" into the password text box on the login page
6. Verify by passing the characters "abCDef" into the password text box on the login page
7. Verify by passing the characters "@abcde" into the password text box on the login page
8. Verify by passing the characters "abcd@" into the password text box on the login page
9. Verify by passing the characters "abc@de" into the password text box on the login page
10. Verify by passing the characters "ab@" into the password text box on the login page
11. Verify by passing the characters "abcdefsgrhf@" into the password text box on the login page
12. Verify by passing a null/blank value into the password text box on the login page
Test Cases-
In my project, we write the test cases in an Excel sheet.
Excel sheet fields:
1. Test Case ID
2. Priority
3. Reference
4. Test Scenario/Title
5. Pre-requisite/Pre-condition
6. Test Data
7. Test Case
8. Test Case Steps
9. Expected Result
10. Comments/Suggestions
11. Actual Result
12. Pass/Fail Criteria
13. Defect ID
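The Excel fields above map naturally onto a CSV/sheet row; a minimal sketch using Python's standard csv module (field names from the list above; the sample values are hypothetical):

```python
import csv
import io

# Column headers taken from the Excel field list in the notes.
FIELDS = ["Test Case ID", "Priority", "Reference", "Test Scenario/Title",
          "Pre-requisite", "Test Data", "Test Case", "Test Case Steps",
          "Expected Result", "Comments", "Actual Result",
          "Pass/Fail Criteria", "Defect ID"]

# One hypothetical row; unfilled columns are left blank.
row = {"Test Case ID": "TC-001", "Priority": "P1",
       "Test Scenario/Title": "Login - username accepts mobile no.",
       "Pass/Fail Criteria": "Pass"}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({f: row.get(f, "") for f in FIELDS})
print(buf.getvalue())
```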
Test scenario - Verify by passing a mobile no. into the Username text box of the VCTC login page
Test cases-
1. Verify by passing a 9-series mobile no. of a student & staff into the Username text box of the VCTC login page
2. Verify by passing an 8-series mobile no. of a student & staff into the Username text box of the VCTC login page
3. Verify by passing a 7-series mobile no. of a student & staff into the Username text box of the VCTC login page
4. Verify by passing a 6-series mobile no. of a student & staff into the Username text box of the VCTC login page
5. Verify by passing a 9-digit mobile no. into the Username text box of the VCTC login page
6. Verify by passing an 11-digit mobile no. into the Username text box of the VCTC login page
7. Verify by passing another mobile no. of a student & staff that does not belong to the VCTC domain into the Username text box of the VCTC login page
8. Verify by passing a null/blank value into the Username text box of the VCTC login page
9. Verify by passing characters/strings into the Username text box of the VCTC login page
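The mobile-number cases above boil down to one rule: exactly 10 digits, starting with 6, 7, 8, or 9. A minimal sketch of that check (whether the number belongs to a VCTC student/staff would need a real lookup, so that part is omitted):

```python
def is_valid_mobile(value: str) -> bool:
    """Indian mobile series: exactly 10 digits, first digit 6, 7, 8, or 9."""
    return len(value) == 10 and value.isdigit() and value[0] in "6789"

print(is_valid_mobile("9876543210"))  # -> True (9-series, 10 digits)
print(is_valid_mobile("987654321"))   # -> False (9 digits)
print(is_valid_mobile("98765432101")) # -> False (11 digits)
print(is_valid_mobile("abcdefghij"))  # -> False (characters/string)
print(is_valid_mobile(""))            # -> False (null/blank)
```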
Test scenario - Verify by passing a combination of 6 to 8 characters into the Password text box of the VCTC login page
Test cases-
1. Verify by passing a 6-character combination into the password text box; a 6-character combination should be accepted in the Password field
2. Verify by passing a 7-character combination into the password text box; a 7-character combination should be accepted in the Password field
3. Verify by passing an 8-character combination into the password text box; an 8-character combination should be accepted in the Password field
4. Verify by passing a 5-character combination into the password text box; a 5-character combination should not be accepted in the Password field, with the error message "enter valid data"
5. Verify by passing a 9-character combination into the password text box; a 9-character combination should not be accepted in the Password field, with the error message "enter valid data"
6. Verify by passing a null/blank value into the password text box; a null/blank value should not be accepted in the Password field, with the error message "enter valid data"
7. Verify by passing a 6-character combination of all special characters into the password text box; it should not be accepted in the Password field, with the error message "enter valid data"
8. Verify by passing a 6-character combination of all capital letters into the password text box; it should not be accepted in the Password field, with the error message "enter valid data"
9. Verify by passing a 6-character combination of all small letters into the password text box; it should not be accepted in the Password field, with the error message "enter valid data"
10. Verify by passing a 6-character combination of all numbers into the password text box; it should not be accepted in the Password field, with the error message "enter valid data"
11. Verify by passing a 6-character combination containing numbers, capital & small letters into the password text box; this combination should not be accepted in the Password field, with the error message "enter valid data"
12. Verify by passing a 6-character combination containing special characters, capital & small letters into the password text box; this combination should not be accepted in the Password field, with the error message "enter valid data"
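The password cases above encode a rule: 6 to 8 characters, rejected when drawn entirely from a single character class. A minimal sketch of that subset of the rule (the notes also reject some mixed combinations, e.g. cases 11-12, for which a real implementation would need the exact policy; this sketch covers only the unambiguous cases 1-10):

```python
import string

def is_valid_password(pw: str) -> bool:
    """6-8 characters, not built entirely from one character class."""
    if not 6 <= len(pw) <= 8:
        return False
    classes = [str.isupper, str.islower, str.isdigit,
               lambda c: c in string.punctuation]
    # Reject all-caps, all-small, all-digit, all-special passwords.
    for cls in classes:
        if all(cls(c) for c in pw):
            return False
    return True

print(is_valid_password("Abc@1234"))  # -> True (8 chars, mixed classes)
print(is_valid_password("abcdef"))    # -> False (all small letters)
print(is_valid_password("Abc@1"))     # -> False (5 characters)
```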
Test case practice-
Go to: https://artoftesting.com/chair
Write the test cases for a PEN-
2. Peer Review-
When a tester writes test cases, a senior tester/colleague/Test Lead reviews them; this review is called a peer review.
The tester sends a mail to the senior tester/colleague/Test Lead for the peer review.
3. Internal Review-
When a tester writes test cases, these test cases are reviewed by the BA.
The tester sends a mail to the BA for the internal review.
The tester sets up a meeting with the BA for the internal review.
4. External Review-
When a tester writes test cases, these test cases are reviewed by the Client/UAT team.
The external review meeting is set up by the Test Lead.
The Test Lead starts the meeting & each individual tester walks through their test cases.
The external meeting involves: Test Lead, Client/UAT Team, Tester, Developer.
Traceability Matrix-
The traceability matrix is maintained against the test cases & defects for a US/requirement.
The traceability matrix defines the linking or mapping of the test cases & defects with the US.
Traceability matrices are of 2 types:
1. Forward Traceability Matrix
2. Backward Traceability Matrix
1. Forward Traceability Matrix-
Test cases are linked or mapped with the US/requirement.
We write the test cases in an Excel sheet, and we link or attach this Excel sheet to the US; this is called the forward traceability matrix.
The forward traceability matrix is prepared (test cases linked or mapped with the US) in JIRA/HPALM.
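The forward/backward distinction can be sketched as a pair of mappings (the US and test-case IDs are hypothetical; in practice this lives in JIRA/HPALM or the Excel sheet, not in code):

```python
# Forward traceability: requirement/US -> test cases that cover it.
forward = {
    "US-101": ["TC-001", "TC-002", "TC-003"],
    "US-102": ["TC-004"],
}

# Backward traceability inverts the mapping: test case -> requirement/US.
backward = {tc: us for us, tcs in forward.items() for tc in tcs}

print(backward["TC-004"])  # -> US-102
```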
Defect Life Cycle-
New (Tester) → Open (Developer) → Fixed (Developer) → Closed / Re-open (Tester)
Other status: Deferred
New- When the tester finds an error in the application, he raises a defect. The defect is raised in JIRA/HPALM and the tester mails the developer through JIRA/HPALM.
The tester marks the defect status as New in JIRA/HPALM.
Open- When the developer is working on or analysing the defect, the developer marks the defect status as Open.
Fixed- When the developer finds that the defect is valid, he fixes the defect and marks the defect status as Fixed.
The developer mails the tester through JIRA/HPALM.
1. Closed- On fixed defects, the tester does re-testing and regression testing. If the defect has been fixed, the tester marks the defect status as Closed.
The tester mails the developer through JIRA/HPALM.
2. Re-open- On fixed defects, the tester does re-testing and regression testing. If the defect has not been fixed, the tester marks the defect status as Re-open.
The tester mails the developer through JIRA/HPALM.
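The life cycle above can be sketched as a small state machine (status names from the notes; the exact transition table is an assumption):

```python
# Allowed defect status transitions, derived from the life cycle above.
TRANSITIONS = {
    "New": {"Open"},
    "Open": {"Fixed", "Deferred"},
    "Fixed": {"Closed", "Re-open"},
    "Re-open": {"Open"},
}

def move(status: str, new_status: str) -> str:
    """Return the new status, or raise if the transition is not allowed."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"Cannot move defect from {status} to {new_status}")
    return new_status

status = move("New", "Open")     # developer starts analysing the defect
status = move(status, "Fixed")   # developer fixes the defect
status = move(status, "Closed")  # tester re-tests and closes it
print(status)  # -> Closed
```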
Duplicate Defects-
A duplicate defect is nothing but the same/similar defect raised again.
Ex. Paytm - BSNL mobile no. is not accepted for any state - Defect ID 1002
Ex. Paytm - BSNL mobile no. is not accepted for Maharashtra state - Defect ID 1003 (duplicate defect)
Ex. Paytm - BSNL mobile no. is not accepted for MP state - Defect ID 1004 (duplicate defect)
Reproducible Defects-
Reproducible/known defects are defects which are reproduced again and again in testing.
Reproducible defects occur due to:
1. Environment problems, i.e. some functionality is not supported in the SIT environment
2. Database connectivity problems
3. The deployment process (developer deployment into the SIT environment)
Bug release- The application is deployed or delivered to the client with known defects. The priority & severity of such defects should be very low.
Reports-
In my project, we have the following reports:
1. Test case execution/Test proof - Tester
2. Defect report/Bug report - Tester
3. Test Summary Report - Test Lead (the tester helps the Test Lead)
Interview questions-
1. What is a Test Plan? Have you been involved in Test Plan preparation?
2. What are a test scenario & test cases?
3. What are a User Story & a Use Case?
4. Which parameters do you consider while writing test cases?
5. Write the test cases for an object (Pen) - scenario-based test cases
6. What are the fields present while writing the test cases?
7. What are the different types of review? And how do you conduct them?
8. If you get suggestions in a test case review, how do you take care of them?
9. What is a traceability matrix and where do you prepare it?
10. What is the defect life cycle?
11. What are duplicate & reproducible defects?
12. What is a show-stopper defect?
13. What are the different reports you prepare in your project?
5. What problems have you faced in your last project? OR What problems have you faced in your testing career?
Answer- In testing:
If we have less domain knowledge about the project
If we have fewer resources in the project
If we have environment problems
If the developer does not communicate properly
If the client makes frequent changes in the requirements
If we have less test data for testing
If the developer does not inform us while doing a deployment
6. How many test cases will you write per day & how many will you execute?
Answer- Test case design totally depends on the US or functionality.
On average, I write 25 to 30 test cases per day.
On average, I execute 15 to 20 test cases per day.
On average, I write 4 to 5 test scripts per day in automation.
7. How many test cases have you written in your last project?
Answer- I have written many test cases in my last project.
I don't know the exact number, but I remember having test case numbers from 145 to 265.
1 year = 24 to 25 sprints
1 sprint = 3 to 4 US
1 US = 25 to 30 test cases
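The per-year arithmetic above can be checked quickly (low and high ends of the stated ranges):

```python
# Test cases per year = sprints/year x user stories/sprint x test cases/US.
low = 24 * 3 * 25    # 1800 test cases per year at the low end
high = 25 * 4 * 30   # 3000 test cases per year at the high end
print(low, high)
```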
8. How many defects will you raise per day OR per US?
Answer- It totally depends on the test case execution and the developer's coding.
If the developer makes more mistakes or errors in coding, then we get more defects.
If the developer makes fewer mistakes or errors in coding, then we get fewer defects.
On average, I raise 4 to 5 defects per US/day.
9. What is your approach if a defect is found in SIT & not found in Dev?
Answer- When the tester finds a defect in SIT, we raise/create the defect in JIRA/HPALM with a screenshot & mail the developer.
If the developer says it is not a defect, he will get on a call with the tester, asking us to showcase the defect, which is not present in the Dev environment.
So on the call, the tester showcases the defect in the SIT environment.
Defects occur in SIT due to:
1. The deployment process
2. Cache & cookies in the browser - the tester clears the cache & cookies and re-tests, OR tests the application in an Incognito window
11. How many test cases are present in your regression suite?
Answer- Manual test module: 30 US
1 US = 25 to 30 TCD
Overall total manual test cases = 30 × 30 = 900 TCD
Automation regression suite = 180 to 200 test cases
12. Where do we get the logs after getting an error/defect/issue in the application? OR Where do we get the logs?
Answer- In my project, while doing testing, if we get an error in the application, then we get the logs in a workbook (a Teams channel workbook of errors against the project).
If we open the workbook, we can identify where the problem is in the application/functionality.