Document a guideline for the QA improvement process
Phases to rollout
1. Define the process
2. Give training about the process
3. Participate in sprint planning meeting to help QA
a. How to actively participate in understanding the sprint deliverable items
b. How to define the scope and identify the impact areas
c. How to create test executions under a test plan, based on the number of test cycles run on the environment
test execution n1
test execution n2
test execution n3
...
I hope you are charged up for the new week ahead. Let's start with the Definition of Done (DoD). Yeah, I
know it doesn't sound that exciting, but trust me, with a good DoD you will be able to accomplish
more in the same amount of time.
But first, let's hear what Scrum Guide has to say about DoD:
The Definition of Done is a formal description of the state of the Increment when it meets the
quality measures required for the product.
If a Product Backlog item does not meet the Definition of Done, it cannot be released or even
presented at the Sprint Review.
If the Definition of Done for an increment is part of the standards of the organization, all
Scrum Teams must follow it as a minimum.
To fulfil these requirements, some of the POs and SMs volunteered to put together a standard DoD for
DPL. This DoD was then reviewed and agreed upon by all POs and SMs. Now it is being presented to you to
incorporate into your projects.
PO & SM - Please review this DoD with your team and get it incorporated into your projects within this
week (e.g. each story may only be marked done if all these items are checked). If a project requires
more stringent rules, then please go ahead and add those requirements to the list. You must not,
however, remove any item from the standard DoD.
QA Improvement - Path to Success
What is QA?
• Knowledge
• Processes
1. Working Software
2. No Surprises
3. No Bugs
Execution: Package Delivery
1. Backlog Management
2. Quality Assurance
3. Production Support
Talha has some reports he used to send to the C-level and Executive stakeholders.
The PO is responsible for submitting team allocation data to the finance department in the last week of each month so
that the finance team can submit the invoice in time. Here is the template:
https://dplit-my.sharepoint.com/:x:/p/waleed_r/EbBwJRrK01VEksFm2MwfbfMBh6DYqjvigNigJMzqWbTNjw?e=eJjeIz
Target: ZERO bugs should be reported on UAT
- Use a mature deployment method to ensure successful deployment on UAT via CI/CD
- DB schema/data
- Web/App deployment
- Challenges
- 90% (3)
- 70%-90% (1)
1. What is it? What will the output be for a given input? -> The acceptance criteria of a
user story:
   1. Environment/Precondition
   2. Input value
   3. Output value/message
   4. Where to check it
2. How to write it?
3. Where to write it?
4. How to manage it?
5. Test report
6. Pass/Fail
Challenges to writing Test Cases?
1. Time constraint
2. Plan its fix – [PO, QA, DEV], final approval by the Customer
3. DEV
4. QA
5. Participate in backlog grooming sessions and define use cases at the end of the
meeting
6. Need to spend time on the last deployment, for which no dedicated time is
allocated
How to write a Test Case?
1. Step 1: Test Case ID - Test cases should all bear unique IDs to represent them. In
most cases, following a naming convention for these IDs helps with organization,
clarity, and understanding.
2. Step 2: Test Description - This description should detail what unit, feature, or
function is being tested, or what is being verified.
4. Step 4: Test Data - This covers the variables and their values in the test case.
In the example of an email login, it would be the username and password for the
account.
6. Step 6: Expected Result - This indicates the result expected after the test case
steps are executed. Upon entering the right login information, the expected result
would be a successful login.
7. Step 7: Pass/Fail

Challenges
1. Time constraint
2. Load
3. Resource shortage
4. Tooling
• Add the defect with steps to reproduce, supporting images, video, content, etc.
• Log the bug and assign the sprint and the team.
• Process completed.
Test cases can measure many different aspects of code. The steps involved may also be intended to
induce a Fail result, as opposed to a positive expected result, such as when a user inputs the wrong
password on a login screen.
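The login example in the steps above can be sketched as a small automated check. The `login` function below is a hypothetical stand-in for the system under test; the test case IDs, names, and credentials are illustrative assumptions, not from a real project.

```python
# Minimal sketch of the login test case described above.
# login() is a hypothetical stand-in for the system under test;
# all names and credentials here are illustrative assumptions.

def login(username: str, password: str) -> bool:
    """Pretend authentication endpoint."""
    return username == "user@example.com" and password == "s3cret"

# TC-LOGIN-001 (unique Test Case ID): valid test data -> the
# expected result is a successful login.
def test_login_success():
    assert login("user@example.com", "s3cret")

# TC-LOGIN-002: a wrong password is test data chosen to induce a
# Fail result from the system (login rejected), so the test passes.
def test_login_wrong_password():
    assert not login("user@example.com", "wrong")

test_login_success()
test_login_wrong_password()
```

Under a runner such as pytest the two functions would be collected automatically, and the Pass/Fail outcome (Step 7) falls out of whether the assertions hold.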
A messaging system can be judged on its performance in four aspects—scalability, availability, latency,
and throughput. These factors are often at odds with each other, and the architect often needs to figure
out which aspect to compromise to improve the others:
• Scalability: This is how the system is able to handle increases in load without noticeable
degradation of the other two factors, latency or availability. Here, a load can mean things such as
the number of topics, consumers, producers, messages/sec, or average message size.
• Availability: In a distributed system, a variety of problems can occur at a unit level (servers,
disks, networks, and so on). The system's availability is a measure of how resilient the system is
to these failures so that it is available to end users.
• Latency: This is how much time it takes for a message to get to a consumer from a producer.
• Throughput: This is how many messages can be processed per second by the messaging system.
A classic tradeoff is between latency and throughput. To optimize throughput, we can batch messages
and process them together. But this has a very negative effect on latency.
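The batching tradeoff can be shown with a toy cost model (this is not any real broker's API; the cost constants and function name are invented for the sketch):

```python
# Toy model of the latency/throughput tradeoff described above:
# batching amortizes the fixed per-send overhead (better throughput),
# but the first message in a batch waits for the batch to fill
# (worse latency). The cost constants are made-up assumptions.

PER_SEND_OVERHEAD_MS = 5   # assumed fixed cost of one network send
PER_MESSAGE_COST_MS = 1    # assumed cost to handle one message

def send_stats(num_messages: int, batch_size: int):
    """Return (total_time_ms, worst_case_message_wait_ms)."""
    batches = -(-num_messages // batch_size)  # ceiling division
    total = batches * PER_SEND_OVERHEAD_MS + num_messages * PER_MESSAGE_COST_MS
    # The first message of a full batch waits for batch_size - 1 peers,
    # then pays the send overhead itself.
    worst_wait = (batch_size - 1) * PER_MESSAGE_COST_MS + PER_SEND_OVERHEAD_MS
    return total, worst_wait

print(send_stats(1000, 1))    # (6000, 5): slow overall, low latency
print(send_stats(1000, 100))  # (1050, 104): fast overall, higher latency
```

Real brokers expose this same knob, e.g. producer-side batch size and linger settings, and tuning them is exactly this throughput-versus-latency choice.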
A. Production Defect Fixing Process (Severity 3):
1. LOGGING Process: Support Coordinator, TAT: Same day
   i. Receive the issue via any communication channel (Email/Call/Text/Client
   Portal/MS-Teams/Onsite discussion, etc.)
   ii. A support engineer will review the issue and request the support coordinator
   to open a Call ID for the reported issue with the correct severity, category, and
   subject
   iii. The support coordinator will assign the Call ID to a support engineer for the L2
   investigation
10. Perform RCA of the code bug and take the necessary action to resolve it