CMS MP Project - Quality Assurance Test Plan - 2009-09-16
Initial Approval

By CMS: ____________________________ Date: ____________

By: Joseph Mercier, Development Project Manager Date: ____________
Revision History

Date    Reason    CMS    Mariner
Charlotte-Mecklenburg Schools (CMS) – Managing for Performance (MP) Project Quality Assurance Plan
There is also code used to administer and maintain the Managing for Performance solution.
Since the CMS MP Project’s system is not life-critical, the target defect removal rate of 95% is
considered reasonable, giving users the best balance of development time while not exposing
them to an inordinately high defect rate.
¹ Courtesy of Steve McConnell, Software Development, August 1996, "Software Quality at Top Speed"
d. Completeness
e. Consistency (no contradictions)
f. Uniqueness
g. Timeliness
3. Reviewing the completed application components for conformance to approved business
and technical requirements (AKA User Acceptance Testing (UAT)).
Figure 2 below presents a standard view of testing and quality assurance related events. The
remainder of this document adapts this view to the Managing for Performance project's specific
needs.
Overview
In order to achieve the gains in development time and a 95% defect removal rate, quality must
be built into the application under construction. It cannot be "inspected" in. For this reason,
developers responsible for developing components² must perform certain review and testing
activities that support the quality objectives for the CMS MP project. There are five activities
required during the design and coding of components for the CMS MP solution: design reviews,
code reviews, unit testing, curing defects, and retesting.
² For the purpose of this document, the definition of component includes front-end and middle-tier
components and is extended to include database objects as well (i.e., stored procedures,
triggers, etc.)
Figure 3. Design/Review/Code/Review/Test
Design Review
The design review should be performed before the unit test and any other design documents
are created. The Application Architect should review the documents/artifacts produced for
adherence to the design guidelines set forth in the CMS MP Project Application Architecture
Document and any other supporting documents. The Application Architect will also ensure that
the design will meet the requirements of the system.
The user interface (portal/report) developer should design the user interface collaboratively
following the user experience guidelines established for the project. The initial design can be
done on paper or a white board. Users should be involved in the review of the design(s) where
practical, and should provide feedback where possible on any initial prototypes and their
usability as they are created. More detail on usability testing is described below.
Code Review
After successful execution of the unit test, the application component (unit of code) is submitted
to the Application Architect for a formal inspection. The Application Architect and/or
representatives of CMS will ensure that the developer followed the coding standards established
in CMS MP programming standards documents and style guides. They will also verify that the
code is well documented, maintainable, relatively error-free, and observes appropriate coding
structures. After all review and unit testing activities are satisfactorily completed, the component
(unit of code) will be placed under change control.
Testing
Unit Testing
Developers responsible for creating components must anticipate the required unit testing. This
serves two major purposes. First, it assures the Application Architect that the developer
understands the desired behaviors of the component that the developer is responsible for.
Second, it ensures that the expected behaviors are formally identified and documented (in this
case, coded) without bias. Without formal unit test definition, it is likely that a developer will
simply test for the behaviors that the developer created and not for the behaviors required.
The unit test should confirm that the component:
- Maps back to the stated requirements (use cases, use case scenarios, use case
  supplements, architecture document and design documents relating to this component)
- Is sufficiently incremental to be readily tested (code to pass a small test)
- Properly handles the anticipated error conditions
- Considers all inputs, outputs, boundaries, algorithms, interactions, transactions &
  terminations with that component (unit) of the solution
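For database components such as stored procedures, these checks can often be scripted in T-SQL itself. The sketch below is illustrative only: the procedure name and parameter values are hypothetical, and a real unit test would cover each documented input, boundary, and error condition.

```sql
-- Hypothetical unit test sketch for a stored procedure component.
-- Procedure name and parameter values are illustrative, not the project's.
DECLARE @rows int;

-- Normal case: a known school/staff pair should return at least one row.
EXEC DW.GetTeacherAssessmentSummary @SchoolID = 100, @StaffID = 5001;
SET @rows = @@ROWCOUNT;
IF @rows > 0
    PRINT '<<< PASS: normal case returned rows >>>'
ELSE
    PRINT '<<< FAIL: normal case returned no rows >>>'

-- Boundary case: a nonexistent staff ID should return zero rows, not fail.
EXEC DW.GetTeacherAssessmentSummary @SchoolID = 100, @StaffID = -1;
IF @@ROWCOUNT = 0
    PRINT '<<< PASS: boundary case returned no rows >>>'
ELSE
    PRINT '<<< FAIL: boundary case unexpectedly returned rows >>>'
```

The PASS/FAIL output follows the same PRINT convention used by the verification scripts in the appendices, so results can be scanned the same way.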
Curing Defects
The results of the unit testing should be inspected and defects should be addressed. Once
completed, the developer will rerun the unit test(s). This cycle continues until the unit testing
executes without error.
Overview
The requirements of the CMS MP application will be met through the collaboration of many
components. Developers will be assembling or integrating unit tested components into more
complex groupings that will require a higher level of testing. These complex groupings (or,
assemblies) will normally be represented by a window (or suite of windows) designed to
accomplish the goals of one or more use cases. Figure 2 demonstrates the workflow of the
system integration testing activities.
Figure 1
All development products will be subjected to a system integration test prior to deployment to
Production and user acceptance. Besides reducing the number of defects that find their way
into production, this testing activity's primary purpose is to confirm that the identified and
approved requirements are observable in the final product.
System integration testing will be applied to the four key segments of the target MP solution:
1. The Central Hub & Data Warehouse
2. The Performance Reports
3. The Performance Portals
4. Performance Planning Applications
Tests will be defined for each of these segments. Appendices B through E of this document will
contain the details of the use cases, scenarios and queries that have been or will be applied to
the components of deliverable product for each segment. These appendices will be updated /
augmented during the course of each Sprint to address that Sprint’s components of deliverable
product and any component that has been changed.
There are three activities associated with the system integration testing process: system
integration test plan creation, system integration testing, and curing defects found during system
integration testing.
A subset of scenarios³ that represents the behaviors associated with each delivered Sprint⁴ of
the CMS MP application will be selected. Using these selected scenarios, a scenario-based
system integration test plan will be constructed that will trace CMS MP transactions across use
case boundaries. Much like the unit test plan, though not as granular, this plan will describe the
scenarios and the goals. It will script the activities that a user would perform to accomplish the
scenario goals. The plan will also include a description of the expected behaviors. It may also
include execution of diagnostic aids that will allow the development team to probe for proper
execution of transparent behaviors such as changes to the database.
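In practice, such a diagnostic aid can be as simple as a pair of queries run before and after the scripted scenario to confirm the expected database change. A hedged sketch (the table comes from the Sprint deliverables; the scenario step itself is a placeholder):

```sql
-- Diagnostic aid sketch: confirm a scenario visibly changed the database
-- by comparing row counts before and after the scripted steps are run.
DECLARE @before int, @after int;
SELECT @before = COUNT(1) FROM K12CentralHub.Hub.Students;
-- ... execute the scripted scenario steps here ...
SELECT @after = COUNT(1) FROM K12CentralHub.Hub.Students;
PRINT 'Rows added by scenario: ' + CAST(@after - @before AS varchar(12));
```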
Once a component of deliverable product for a Sprint⁴ is considered ready for deployment, the
development team will move the work products from the development (Sandbox) environment to
the Quality Assurance (QA) environment at CMS. The movement of these work products will be
governed by the appropriate application deployment plan, thereby also system testing the
implementation plan itself. Once deployed to the QA environment, the development team will
execute the system integration test portion of this plan appropriate for the Sprint deliverable
product(s) being tested. Defects will be recorded into the CMS MP version control and defect
tracking system (Team Foundation Server). Defects that prevent the development team from
continuing through system test will be cured. The system test will then be re-executed.
Otherwise, the development team will attempt to work through the entire test. Upon completion,
defects will be cured. This cycle will continue until all defects are removed. Serious defects that
involve a significant impact (to be defined) must be presented to the CMS MP Change Control
Board for approval prior to resolution.
Cure Defects
For each defect identified, individual defect reports will be recorded in the CMS MP Defect
Tracking System and assigned to a developer for correction. After all defects are cured, they
will be recorded as such in the defect tracking system. The segment of the system integration
test plan related to that component will be re-executed until no defects are detected.

³ See CMS Use Case Analysis – Use Case Scenarios document (if available).
⁴ Sprint is defined as any grouping of executable development products that will be placed into
production for the CMS users.
The project manager (or his designee) assigns defects to development team members for
correction. The developer will have access to all quality assurance documents that identify the
nature of the defect, the time-to-cure estimates, and priorities. The developer will make the
required corrections, seeking guidance if necessary. When completed, the developer will update
the CMS MP version control and defect tracking system (Team Foundation Server)
appropriately.
Quality Assurance
Figure 3
There are three primary quality assurance activities associated with the user acceptance
process: user acceptance test plan creation, user acceptance testing, and curing defects found
during user acceptance testing.
Using the selected scenarios, the team will construct a scenario-based acceptance test plan
that will trace CMS MP transactions across use case boundaries.
Curing Defects
The project manager (or his designate) will assign defects to development team members for
correction. The developer will have access to all quality assurance documents that identify the
nature of the defect, the time-to-cure estimates, and priorities. The developer will make the
required corrections, seeking guidance if necessary. When completed, the developer will update
the CMS MP Defect Tracking System appropriately.
Defect Tracking
Defect tracking will begin during system integration testing. Defects will be tracked using
Microsoft Excel spreadsheets or Microsoft’s Team Foundation Server software. The following
information will be captured:
Defects will be entered as they are discovered. Required entries will be the description, date
detected, detected by, phase created, phase detected and the severity level. The owner is
considered the person responsible for curing the defect. The owner has the ability to delegate
the task to another.
Additionally, the owner must record the severity level, the phase created, and the estimated
hours to correct the defect. After a defect is considered cured, the owner must record the
date completed, completed by, and hours to correct for the defect. Once a completed date is
entered, the defect is considered corrected.
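Where Team Foundation Server is not used, the same fields could be captured in a simple tracking table. The layout below is a hypothetical sketch of those fields, not the project's actual schema:

```sql
-- Hypothetical defect-tracking table capturing the fields listed above.
CREATE TABLE dbo.Defect
(
    DefectID       int IDENTITY(1,1) PRIMARY KEY,
    [Description]  varchar(1000) NOT NULL,
    DateDetected   datetime      NOT NULL,
    DetectedBy     varchar(50)   NOT NULL,
    PhaseCreated   varchar(30)   NOT NULL,
    PhaseDetected  varchar(30)   NOT NULL,
    SeverityLevel  tinyint       NOT NULL,
    [Owner]        varchar(50)   NULL,  -- person responsible for the cure
    EstimatedHours decimal(6,1)  NULL,
    DateCompleted  datetime      NULL,  -- once set, the defect is corrected
    CompletedBy    varchar(50)   NULL,
    HoursToCorrect decimal(6,1)  NULL
);
```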
Application Area: ___ <Subject Area 1> ___ <Subject Area 5>
___ <Subject Area 2> ___ <Subject Area 6>
___ <Subject Area 3> ___ <Subject Area 7>
___ <Subject Area 4> ___ <Subject Area 8>
Defect or ___ Defect (error found in existing application look & feel or processing)
Enhancement?: ___ Enhancement (request for new feature or functionality)
Comments: _________________________________________________________
_________________________________________________________
_________________________________________________________
Summary: Using SQL 2008 SSIS, the ETL will connect to the Oracle source for ESIS, pull
data from ADMASSIST.STUDENTS, and load it to the SQL 2008 data warehouse table
K12DW.DW.DimStudents.

Actors: ETL, SQL 2008 database K12DW, Oracle source for ESIS
Preconditions/Assumptions:
1. Connectivity to Oracle
2. ADMASSIST.Students has data
3. Connectivity to SQL 2008 CMS MP databases
4. Managing for Performance databases and tables have been created
a. K12CentralHub.Hub.Students
b. K12Import.Import.DimStudents
c. K12DWStage.DW.DimStudentsStage
d. K12DW.DW.DimStudents
e. Pamlico database fully created and populated
f. K12CentralHub.Annex fully created and annex tables loaded
5. ValueMappingHandleCode variable set to 0 so that records that don't match
a valid value mapping will still be processed.
Triggers
1. Control package for Hub kicks off package CMS_Load_Hub_Students
Alternative paths
1. Data errors to Pamlico database table
Post conditions
1. Hub SSIS CMS_Hub_ETL_Load_Students completes successfully.
Charlotte-Mecklenburg Schools – Managing for Performance Project Quality Assurance Plan
Business Rules:
1. StudentSISID = ESIS Student pupil_number
2. If last name changes, old record present and new record with new
last name exists.
3. If demographic changes, old record present and new record with
change exists.
4. If TownofResidence changes, old record present and new record
exists.
5. If NativeClanTribeName changes, old and new records exist.
6. If NativeClanTribeNumber changes, old and new records exist.
7. If Immigrant status changes, old and new records exist.
8. If language code changes, old and new records exist.
9. If language type code changes, old and new records exist.
10. If marital status changes, old and new records exist.
11. If number of dependents change, old and new records exist.
12. If Migratory status changes, old and new records exist.
13. ETL metadata fields should be populated appropriately.
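Business rules 2 through 12 all describe the same type-2 change-tracking behavior: a change to a tracked attribute must preserve the old row and add a new one. One coarse way to probe this, sketched here on the assumption that StudentSISID is the natural key of the dimension, is to look for students carrying more than one row version:

```sql
-- Sketch: students whose tracked attributes changed should appear more
-- than once in the dimension (the old row plus the new row).
SELECT StudentSISID, COUNT(1) AS RowVersions
FROM K12DW.DW.DimStudents
GROUP BY StudentSISID
HAVING COUNT(1) > 1;
```

Individual rules can then be spot-checked by comparing the attribute values across the returned row versions for a handful of students.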
----------------------------
--Preconditions/Assumptions:
-- Assumptions
--1. Connectivity to Oracle
--2. ADMASSIST.Students has data - SOURCE IMPORT QUERY for ORACLE ESIS
-----------------------------------------------------------------------
--3. Connectivity to SQL 2008 CMS MP databases
--4. Managing for Performance databases and tables have been created
--a. K12CentralHub.Hub.Students
Use K12CentralHub
go
IF OBJECT_ID('[Hub].[Students]') IS NOT NULL
PRINT '<<< TABLE [Hub].[Students] exists >>>'
ELSE
PRINT '<<< TABLE [Hub].[Students] does not exist >>>'
--b. K12Import.Import.DimStudents
Use K12DWImport
go
IF OBJECT_ID('[Import].[DimStudents]') IS NOT NULL
PRINT '<<< TABLE [Import].[DimStudents] exists >>>'
ELSE
PRINT '<<< TABLE [Import].[DimStudents] does not exist >>>'
--c. K12DWStage.DW.DimStudentsStage
Use K12DWStage
go
IF OBJECT_ID('[DW].[DimStudentsStage]') IS NOT NULL
PRINT '<<< TABLE [DW].[DimStudentsStage] exists >>>'
ELSE
PRINT '<<< TABLE [DW].[DimStudentsStage] does not exist >>>'
--d. K12DW.DW.DimStudents
Use K12DW
go
IF OBJECT_ID('[DW].[DimStudents]') IS NOT NULL
PRINT '<<< TABLE [DW].[DimStudents] exists >>>'
ELSE
PRINT '<<< TABLE [DW].[DimStudents] does not exist >>>'
-----------------------------------------------------------------------
--e. Pamlico database fully created and populated
Use Pamlico
go
IF OBJECT_ID('[ETLAudit].[DataErrorSeverity]') IS NOT NULL
PRINT '<<< TABLE [ETLAudit].[DataErrorSeverity] exists >>>'
ELSE
PRINT '<<< TABLE [ETLAudit].[DataErrorSeverity] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLContainerExecution]') IS NOT NULL
PRINT '<<< TABLE [ETLAudit].[ETLContainerExecution] exists >>>'
ELSE
PRINT '<<< TABLE [ETLAudit].[ETLContainerExecution] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLControl]') IS NOT NULL
PRINT '<<< TABLE [ETLAudit].[ETLControl] exists >>>'
ELSE
PRINT '<<< TABLE [ETLAudit].[ETLControl] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLDataError]') IS NOT NULL
PRINT '<<< TABLE [ETLAudit].[ETLDataError] exists >>>'
ELSE
PRINT '<<< TABLE [ETLAudit].[ETLDataError] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLPackageExecution]') IS NOT NULL
PRINT '<<< TABLE [ETLAudit].[ETLPackageExecution] exists >>>'
ELSE
PRINT '<<< TABLE [ETLAudit].[ETLPackageExecution] does not exist >>>'
--f. K12CentralHub.Annex fully created and annex tables loaded
--5. ValueMappingHandleCode variable set to 0 so that records that don't
--   match a valid value mapping will still be processed.
select * from K12DWStage.ETL.SSISConfigurations
where configurationFilter = 'ValueMappingErrorHandleFlag'
-----------------------------------------------------------------------
--Triggers
--1. Control package for Hub kicks off package CMS_Load_Hub_Students
-----------------------------------------------------------------------
--Basic Course of Events:
--1. Hub SSIS CMS_Hub_ETL_Load_Students runs to load data to
--   K12CentralHub.Hub.Students
--2. Import SSIS CMS_PerformanceDw_ETL_Import_DimStudents runs to load
--   data to K12Import.Import.DimStudents
--3. Stage SSIS CMS_PerformanceDw_ETL_Transform_DimStudents runs to
--   load data to K12DWStage.DW.DimStudentsStage
--4. DW SSIS CMS_PerformanceDw_ETL_Load_DimStudents runs to load data
--   to K12DW.DW.DimStudents
-----------------------------------------------------------------------
--Alternative paths
--1. Data errors to Pamlico database table
select * from Pamlico.ETLAudit.ETLDataError where TaskName in
('SEQ Extract Students','DFL Import DimStudent',
 'DFL Transform DimStudent','DFL Load DimStudent')
-----------------------------------------------------------------------
--Post conditions
--1. Hub SSIS CMS_Hub_ETL_Load_Students completes successfully.
--2. Data is loaded into K12CentralHub.Hub.Students
Select count(1) from K12CentralHub.Hub.Students -- 238883
,M.WITHDRAW_DATE
,M.REGISTERED_INDICATOR
,M.GRADE
,M.EMAIL_ADDRESS
,M.PHONE
,(SELECT MUNICIPALITY_DESC FROM HUB.ESIS_MUNICIPALITIES WHERE
MUNICIPALITY = M.MUNICIPALITY) AS LocationCity
,M.BIRTH_DATE
,M.STREET_NAME
,M.STREET_NUMBER
,M.APARTMENT
,M.POSTAL_CODE
,M.PROVINCE
,M.MAILING_ADDRESS
,M.SCH_ENTRY_DATE_AT_COUNTRY
,M.CREATE_DATE
,M.RELEASE_OF_INFO
,M.INDIAN_ANCESTRY_CODE
,(SELECT TRIBAL_DESC FROM HUB.ESIS_TRIBAL_CODES WHERE TRIBAL_CODE =
INDIAN_ANCESTRY_CODE) AS NativeClanTribeName
,(SELECT TOP 1 MIGRANT_WORKER FROM Hub.PARENTS AS PS
WHERE PS.PUPIL_NUMBER = M.PUPIL_NUMBER
AND MIGRANT_WORKER = N'Y') AS MIGRANT_WORKER
,M.COUNTRY_OF_CITIZENSHIP
,M.COUNTRY_OF_BIRTH
,M.LANGUAGE_CODE_MOST_USED
,M.FIRST_LANGUAGE_CODE
,M.LANGUAGE_CODE
,(SELECT LANGUAGE_DESC FROM Hub.LANGUAGE_CODES WHERE LANGUAGE_CODE =
M.LANGUAGE_CODE) AS LANGUAGE_DESC
,0 AS NumberofDependents
,(SELECT TOP 1 Calendar_Date FROM HUB.CMS_Calendar
WHERE School_Day = 180
AND School_Year = (SELECT CR_SCHOOL_YEAR / 10000 + 10000 *
(CR_SCHOOL_YEAR / 10000 -1)
FROM HUB.CR_YEAR_PARAMS)) AS LastDay
('CMS_Hub_ETL_Load_Students','CMS_PerformanceDW_ETL_Import_DimStudent','
CMS_PerformanceDW_ETL_Transform_DimStudent','CMS_PerformanceDW_ETL_Load_
DimStudent')
--11. Select source data for 5 students and compare to target data for 5
--    students.
select * from k12dw.dw.dimstudent where StudentSISID in
('8588913','8299579','8239395','8172849','8647934')
Goal: Display EOG assessment scores and growth values for all class periods
taught in the assessed course by a given teacher for the given grade level.

Summary: Uses SQL 2008 tables housed in the data warehouse K12DW, accessed
exclusively through stored procedures.
Preconditions/Assumptions:
1. Connectivity to K12DW server;
2. The stored procedures are current and available;
3. Connectivity to SQL 2008 CMS MP databases;
4. Managing for Performance databases and tables are populated.
Alternative Paths: Should there be no data returned, the report will display a message
indicating that it found no records that match the selected criteria.
Business Rules: The user may view only his or her own data.
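This rule is typically enforced inside the stored procedures themselves by filtering every query on the identity of the requesting user; the parameter below is an assumption about how that identity is passed in:

```sql
-- Hypothetical sketch of the "own data only" rule: report procedures
-- restrict the fact query to the StaffID supplied for the current user.
SELECT fsa.AssessmentID, fsa.StudentID, fsa.DateID
FROM DW.FactStudentAssessment fsa
WHERE fsa.StaffID = @StaffID;  -- @StaffID comes from the report session
```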
USE [K12DW]
GO
SET QUOTED_IDENTIFIER ON
GO
/***********************************************************
Purpose: returns class counts by period and course along
with average expected scale score, average EOG scale score.
Change Log:
Date Developer Change Description
20090507 Duane Simes Initial Version
***********************************************************/
AS
declare @tests table
(
SchoolID int,
StaffID int,
TestID int,
DateID int
)
insert into @tests
select distinct
EnrolledSchoolID,
StaffID,
AssessmentID,
DateID
from DW.FactStudentAssessment fsa
where EnrolledSchoolID = @SchoolID
and StaffID = @StaffID
and AssessmentID = @AssessmentID
and exists
(
select null
from DW.DimDate dd
where fsa.DateID = dd.DateID
and AcademicYear = @AcademicYear
)
select
CourseID,
Period,
CourseTitle,
Section,
StudentCount,
avg([ExpectedScore]/StudentCount) as [ExpectedScore],
avg([ActualScore]/StudentCount) as [ActualScore],
avg([ExpectedGrowth]/StudentCount) as [ExpectedGrowth],
avg([ActualGrowth]/StudentCount) as [ActualGrowth],
GrowthCount
FROM
(
SELECT
Period,
CourseID,
CourseTitle,
NULL AS SECTION,
COUNT( DISTINCT StudentID) AS StudentCount,
SUM(ExpectedScaleScore) AS [ExpectedScore],
SUM(ScaleScore) AS [ActualScore] ,
SUM(ExpectedGrowth) AS [ExpectedGrowth],
SUM(Growth) AS [ActualGrowth],
SUM (CASE
WHEN growth > 0 AND ScenarioCode = ''G''
THEN 1
ELSE 0 END) AS GrowthCount
FROM
(
SELECT
cl.ClassPeriod AS Period,
co.CourseID,
co.CourseTitle,
sa.AssessmentID,
sa.StudentID,
DimScenario.ScenarioCode,
CASE
WHEN DimScenario.ScenarioCode = ''ES''
THEN sa.ScaleScore
ELSE 0.00
END AS ExpectedScaleScore
,CASE
WHEN DimScenario.ScenarioCode = ''R'' THEN
sa.ScaleScore
ELSE 0.00
END AS ScaleScore
,CASE
WHEN DimScenario.ScenarioCode = ''EG''
THEN sa.ScaleScore
ELSE 0.00
END AS ExpectedGrowth
,CASE
WHEN DimScenario.ScenarioCode = ''G'' THEN
sa.ScaleScore
ELSE 0.00
END AS Growth
FROM
@fact sa
JOIN DW.DimScenario
ON sa.ScenarioID =
DW.DimScenario.ScenarioID
JOIN DW.DimClass cl
ON sa.ClassID = cl.ClassID
JOIN DW.DimStudent
ON sa.StudentID =
DW.DimStudent.StudentID
JOIN DW.DimCourse co
ON sa.CourseID = co.CourseID
) AS Driver
GROUP BY
CourseID,
Period,
CourseTitle
) AS eog
GROUP BY
CourseID,
Period,
CourseTitle,
Section,
StudentCount,
GrowthCount
ORDER BY CourseID, Period
'
END
GO