CMS MP Project - Quality Assurance Test Plan - 2009-09-16


Charlotte-Mecklenburg Schools (CMS)

Managing for Performance Project


Quality Assurance & Test Plan

Distribution: Jay Parker, Michael Davis, Susan Johnson, Robert Avossa, Ann Clark
Prepared By: Joseph Mercier, Patricia Hale, Derek Sanderson, Gary Miller
Status: Draft
Initial Release Date:
Latest Revision Date:

©2009 Mariner, ALL RIGHTS RESERVED.


CONTENTS OF THIS DOCUMENT ARE PROPRIETARY AND MAY NOT BE DISTRIBUTED WITHOUT
THE PRIOR WRITTEN CONSENT OF MARINER AND CMS.

Initial Approval

By CMS

Robert Avossa Date


Chief Accountability Officer

Ann Clark Date


Chief Academic Officer

Susan Johnson Date


Chief Information Officer

By Mariner
Joseph Mercier Date
Development Project Manager


Revision History
Date Reason CMS Mariner


Contents

Purpose of this Document
Overview of the Managing for Performance Solution
Goals of Quality Assurance & Test Planning
Quality Assurance & Test Practices
   Quality Assurance & Test Practices Overview
   Figure 2. Standard Test Model
   Reviews and Testing
      Overview
      Design and Code Reviews
      Testing
      Unit Testing
      Curing Defects
   System Integration Testing
      Overview
      System Integration Test Plan Creation
      System Integration Testing
      Cure Defects
   Quality Assurance
      User Interface (Portal/Report) Final Review
   User Acceptance Testing
      User Acceptance Test Plan Creation
      User Acceptance Test Plan Execution
      Curing Defects
   Defect Tracking
Appendix A – Defect Report Form
Appendix B – Use Case, Scenarios & Queries for ETL System Integration Testing
Appendix C – Use Case, Scenarios & Queries for Reports System Integration Testing
Appendix D – Use Case, Scenarios & Queries for Portals System Integration Testing
Appendix E – Use Case, Scenarios & Queries for Performance Planning Application Systems Integration Testing


Purpose of this Document


The purpose of this document is to define the level of quality required of Charlotte-Mecklenburg
Schools' (CMS's) Managing for Performance (MP) Project and to outline a quality assurance and
test plan that will ensure that all deliverables meet that quality standard.

Overview of the Managing for Performance Solution


The project solution decomposes into four key segments:
1. The Central Hub & Data Warehouse
2. The Performance Reports
3. The Performance Portals
4. Performance Planning Applications
For quality assurance and testing purposes, the Central Hub and Data Warehouse comprises
the data table structures (dimensions, facts), the application code that imports/extracts,
transforms, and loads data into the Central Hub and Data Warehouse tables, and the data that
populate the tables. The Performance Reports comprise code that accesses data from the Data
Warehouse and aggregates and formats the data for specific display to end users. The
Performance Portals comprise code that controls access to and organizes Performance
Reports, other components of information, and links to other sources/locations of information to
facilitate end user access and comprehension. The Performance Planning Applications
comprise code that aggregates and displays performance analytical information (KPIs, metrics)
for evaluation and planning at the student, teacher, school, and district levels.

There is also code used to administer and maintain the Managing for Performance solution.

Goals of Quality Assurance & Test Planning


Research suggests that, as a general rule, projects with low defect rates also have the shortest
schedules (Jones 1991). After surveying approximately 4,000 software projects, Capers Jones
reported that poor quality was one of the most common reasons for schedule overruns. He also
reported that poor quality is implicated in close to half of all canceled projects. A survey
performed by SEI (Software Engineering Institute) found that more than 60 percent of
organizations assessed suffered from inadequate quality assurance.
When a software product has too many defects, developers spend more time fixing problems
than they spent building the product in the first place. If an adequate amount of time is spent
making sure that we do our work correctly the first time, not only will we achieve the shortest
development schedule, we will also provide our users with a low-defect product, minimizing
operational disruptions.
As shown in the chart below, a 95% defect removal rate achieves the greatest improvement in
development time. Diminishing returns set in when system requirements dictate a defect
removal rate greater than 95%; such rates are appropriate for life-critical systems such as heart
monitors or life-support systems.


Figure 1. Defect Rate vs. Development Time
(Chart courtesy of Steve McConnell, Software Development, August 1996, "Software Quality at Top Speed")

Since the CMS MP Project’s system is not life-critical, the target defect removal rate of 95% is
considered reasonable, giving users the best balance of development time while not exposing
them to an inordinately high defect rate.

Quality Assurance & Test Practices


Quality Assurance & Test Practices Overview
The purpose of testing and quality events is to ensure that the product being created fully
meets the requirements and expectations of the stakeholders and is readily supportable over
time.

Testing focuses on defect-driven events in order to:


1. Validate that the code:
a. Meets requirements
b. Doesn’t break other functionality
c. Handles exception conditions
2. Produce verifiable results
3. Provide for regression testing (to verify changes didn’t affect other tested code)
4. Verify system, database, and user-oriented performance
5. Support the smooth transition/migration from a preceding environment to the succeeding
environment.
Quality assurance focuses on conformance/compliance events by:
1. Reviewing code for:
a. Compliance with guidelines and standards
b. Optimization of the coded function(s)
c. Effectiveness in delivering the coded function(s)
2. Reviewing data for:
a. Granularity
b. Accuracy
c. Clarity

d. Completeness
e. Consistency (no contradictions)
f. Uniqueness
g. Timeliness
3. Reviewing the completed application components for conformance to approved business
and technical requirements (i.e., User Acceptance Testing (UAT)).
Figure 2 below presents a standard view of testing and quality assurance related events. The
remainder of this document adapts this view to the Managing for Performance project's specific
needs.

Figure 2. Standard Test Model

Reviews and Testing

Overview
In order to achieve the gains in development time and a 95% defect removal rate, quality must
be built into the application under construction. It cannot be "inspected" in. For this reason,
developers responsible for developing components2 must perform certain review and testing
activities that support the quality objectives for the CMS MP project. There are five activities
required during the design and coding of components for the CMS MP solution: design reviews,
code reviews, unit testing, curing defects, and retesting.

2 For the purpose of this document, the definition of component includes front-end and middle-tier components and is extended to include database objects as well (i.e., stored procedures, triggers, etc.)

Figure 3. Design/Review/Code/Review/Test

Design and Code Reviews


Design and code reviews will be performed to ensure that quality is built into the application.
Design reviews are essential to ensuring that the proper solution is created to elegantly meet
the requirements. Code reviews ensure the quality of the source code, making maintenance
easier. Both reviews include a verification that design and code conform with guidelines and
standards.

Design Review
The design review should be performed before the unit test and any other design documents
are created. The Application Architect should review the documents/artifacts produced for
adherence to the design guidelines set forth in the CMS MP Project Application Architecture
Document and any other supporting documents. The Application Architect will also ensure that
the design will meet the requirements of the system.
The user interface (portal/report) developer should design the user interface collaboratively,
following the user experience guidelines established for the project. The initial design can be
done on paper or a whiteboard. Users should be involved in the review of the design(s) where
practical, and should, where possible, provide feedback on initial prototypes and their usability
as they are created. Usability testing is described in more detail below.

Code Review
After successful execution of the unit test, the application component (unit of code) is submitted
to the Application Architect for a formal inspection. The Application Architect and/or
representatives of CMS will ensure that the developer followed the coding standards established
in CMS MP programming standards documents and style guides. They will also verify that the
code is well documented, maintainable, relatively error-free, and observes appropriate coding
structures. After all review and unit testing activities are satisfactorily completed, the component
(unit of code) will be placed under change control.

Testing

Unit Testing
Developers responsible for creating components must anticipate the required unit testing. This
serves two major purposes. First, it assures the Application Architect that the developer
understands the desired behaviors of the component that the developer is responsible for.
Second, it ensures that the expected behaviors are formally identified and documented (in this
case, coded) without bias. Without formal unit test definition, it is likely that a developer will
simply test for the behaviors that the developer created and not for the behaviors required.

Unit testing should demonstrate that the unit of code (see the sketch after this list):


• Maps back to the stated requirements (use cases, use case scenarios, use case
  supplements, architecture document, and design documents relating to this component)
• Is sufficiently incremental to be readily tested (code to pass a small test)
• Properly handles the anticipated error conditions
• Considers all inputs, outputs, boundaries, algorithms, interactions, transactions &
  terminations with that component (unit) of the solution
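For database components, a unit test can take the form of a script that checks an expected
behavior and an anticipated error condition and prints a pass/fail result. The following is a
minimal, hypothetical sketch only: the table FactSchoolEnrollment is taken from the use case
list in Appendix B, but the column name SchoolID and the key values are illustrative
assumptions, not confirmed CMS MP schema.

-- Hypothetical unit-test sketch for a database component.
-- Table column and key values are illustrative assumptions.
USE K12DW
go
DECLARE @rows int

-- Expected behavior: a known school should have enrollment fact rows.
SELECT @rows = COUNT(1) FROM DW.FactSchoolEnrollment WHERE SchoolID = 600503
IF @rows > 0
    PRINT '<<< PASS: enrollment rows found for known school >>>'
ELSE
    PRINT '<<< FAIL: no enrollment rows for known school >>>'

-- Anticipated error condition: an invalid key should return zero rows, not fail.
SELECT @rows = COUNT(1) FROM DW.FactSchoolEnrollment WHERE SchoolID = -1
IF @rows = 0
    PRINT '<<< PASS: invalid key handled (zero rows) >>>'
ELSE
    PRINT '<<< FAIL: unexpected rows returned for invalid key >>>'
go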

Curing Defects
The results of the unit testing should be inspected and defects should be addressed. Once
completed, the developer will rerun the unit test(s). This cycle continues until the unit testing
executes without error.

System Integration Testing

Overview
The requirements of the CMS MP application will be met through the collaboration of many
components. Developers will be assembling or integrating unit tested components into more
complex groupings that will require a higher level of testing. These complex groupings (or
assemblies) will normally be represented by a window (or suite of windows) designed to
accomplish the goals of one or more use cases. Figure 2 demonstrates the workflow of the
system integration testing activities.


All development products will be subjected to a system integration test prior to deployment to
Production and user acceptance. Besides reducing the number of defects that find their way
into production, this testing activity's primary purpose is to confirm that the identified and
approved requirements are observable in the final product.

System integration testing will be applied to the four key segments of the target MP solution:
1. The Central Hub & Data Warehouse
2. The Performance Reports
3. The Performance Portals
4. Performance Planning Applications
Tests will be defined for each of these segments. Appendices B through E of this document
contain the details of the use cases, scenarios, and queries that have been or will be applied to
the components of deliverable product for each segment. These appendices will be updated and
augmented during the course of each Sprint to address that Sprint's components of deliverable
product and any component that has been changed.

There are three activities associated with the system integration testing process: system
integration test plan creation, system integration testing, and curing defects found during system
integration testing.


System Integration Test Plan Creation


Developers responsible for assembling components of deliverable product into more complex
integrated assemblies during each Sprint of the MP project must first create a system
integration test plan. This plan will identify the nature of the collaboration and a description of
the observable and transparent behaviors. Creating the plan during component development
also serves to ensure that the developer understands the objectives during construction. The
system integration test plan must be submitted to the Application Architect or project manager
prior to the components of deliverable product for a Sprint being deployed to the QA
environment.

A subset of scenarios3 that represents the behaviors associated with each delivered Sprint4 of
the CMS MP application will be selected. Using these selected scenarios, a scenario-based
system integration test plan will be constructed that will trace CMS MP transactions across use
case boundaries. Much like the unit test plan, though not as granular, this plan will describe the
scenarios and the goals. It will script the activities that a user would perform to accomplish the
scenario goals. The plan will also include a description of the expected behaviors. It may also
include execution of diagnostic aids that will allow the development team to probe for proper
execution of transparent behaviors, such as changes to the database.
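As an illustration of such a diagnostic aid, the sketch below snapshots a fact table's row count
before and after a scripted scenario is executed. The table name is taken from this plan's
appendices, but the actual probe used for any given scenario would be defined in that
scenario's test plan.

-- Hypothetical diagnostic probe for a transparent behavior (a database change).
-- Capture the count before the scenario is executed...
select count(1) as RowsBefore from K12DW.DW.FactStudentAssessment
-- ...the tester performs the scripted scenario steps...
-- ...then capture the count afterward and compare; the difference should match
-- the number of records the scenario was expected to create.
select count(1) as RowsAfter from K12DW.DW.FactStudentAssessment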

System Integration Testing


After the development of the components of deliverable product for a Sprint is completed and
the components have been deployed to the QA environment, each segment of the system
integration test plan must be executed, preferably by a team member other than a component's
developer. The tester must sign off on each observed behavior, noting any aberrant behaviors
identified. Any component of deliverable product exhibiting aberrant behavior(s) will be returned
to the original developer for modification. Defects will be recorded into the defect tracking
application.

Once a component of deliverable product for a Sprint4 is considered ready for deployment, the
development team will move the work products from the development (Sandbox) environment to
the Quality Assurance (QA) environment at CMS. The movement of these work products will be
governed by the appropriate application deployment plan; the deployment itself thus serves as
a system test of that plan. Once deployed to the QA environment, the development team will
execute the system integration test portion of this plan appropriate for the Sprint deliverable
product(s) being tested. Defects will be recorded into the CMS MP version control and defect
tracking system (Team Foundation Server). Defects that prevent the development team from
continuing through system test will be cured, and the system test will then be re-executed.
Otherwise, the development team will attempt to work through the entire test. Upon completion,
defects will be cured. This cycle will continue until all defects are removed. Serious defects that
involve a significant impact (to be defined) must be presented to the CMS MP Change Control
Board for approval prior to resolution.

Cure Defects
For each defect identified, individual defect reports will be recorded in the CMS MP Defect
Tracking System and assigned to a developer for correction. After all defects are cured, they

3 See CMS Use Case Analysis – Use Case Scenarios document (if available).
4 Sprint is defined as any grouping of executable development products that will be placed into production for the CMS users.

will be recorded as such in the defect tracking system. The segment of the system integration
test plan related to that component will be re-executed until no defects are detected.

The project manager (or his designee) assigns defects to development team members for
correction. The developer will have access to all quality assurance documents that identify the
nature of the defect, the time-to-cure estimates, and priorities. The developer will make the
required corrections, seeking guidance if necessary. When completed, the developer will update
the CMS MP version control and defect tracking system (Team Foundation Server)
appropriately.

Quality Assurance

User Interface (Portal/Report) Final Review


Once the interface has been implemented and unit and system integration tested, the user
interface developer should run the application in the test environment and confirm that it
behaves as designed. Screenshots of all suspicious aspects of the user interface should be
captured. The user interface developer should use these to convey user interface issues and
instructions for curing them. The screenshots can be incorporated into a document as
necessary and annotated electronically, or printed for handwritten comments.

User Acceptance Testing


After the components of deliverable product for a Sprint of the CMS MP system are system
tested, they will be submitted to the users for user acceptance testing. Like the system
integration test, user acceptance testing will be scenario-based; it will also validate that each
key approved requirement has been satisfied by a feature/function within the deliverable
product(s). The primary difference is that the users will define the scenarios and prepare the
plan. This will be completed while the Sprint deliverable products are under construction. Upon
completion of system integration testing and deployment to Production, the users will then be
asked to execute their plan.


There are three primary quality assurance activities associated with the user acceptance
process: user acceptance test plan creation, user acceptance testing, and curing defects found
during user acceptance testing.

User Acceptance Test Plan Creation


The user acceptance test plan is very similar to the system integration test plan, though it is
much more focused on observable behaviors. It also validates that key approved requirements
are specifically satisfied by a feature/function within the delivered product. Like the system
integration test plan, a subset of scenarios representing the behaviors associated with the
components of deliverable product of the CMS MP application delivered at the end of each
Sprint will be selected by the end users who will conduct acceptance testing. Using these
selected scenarios, they will construct a scenario-based acceptance test plan that will trace
CMS MP transactions across use case boundaries.

User Acceptance Test Plan Execution


The development team will deploy the candidate application to the Production environment
following successful system integration testing in the QA environment. The end user
acceptance test team will then execute their acceptance test plan. Defects will be recorded into
the CMS MP Defect Tracking System. Defects that prevent the user acceptance test team from
continuing through acceptance testing will be cured, and the acceptance test will then be
re-executed. Otherwise, the user acceptance test team will attempt to work through the entire
test. Upon completion, defects will be cured. This cycle will continue until all defects are
removed. Serious defects that involve a significant impact (to be defined) must be presented to
the CMS MP Change Control Board prior to resolution.

Curing Defects
The project manager (or his designee) will assign defects to development team members for
correction. The developer will have access to all quality assurance documents that identify the
nature of the defect, the time-to-cure estimates, and priorities. The developer will make the
required corrections, seeking guidance if necessary. When completed, the developer will update
the CMS MP Defect Tracking System appropriately.

Defect Tracking
Defect tracking will begin during system integration testing. Defects will be tracked using
Microsoft Excel spreadsheets or Microsoft’s Team Foundation Server software. The following
information will be captured:

Data Element             Description
Defect ID                A unique identifier for each defect, sequentially assigned beginning with 1
Defect description       A brief description of the defect
Date detected            Date the defect was first identified
Phase detected           Phase in which the defect was identified: Analysis, Design, Construction, System Test, User Acceptance Test, Transition, or Post Delivery
Phase created            Phase in which the defect was introduced into the application: Analysis, Design, Construction, System Test, User Acceptance Test, Transition, or Post Delivery
Severity                 Level 1 - 4, where 1 = Cosmetic and 4 = Critical
Detected by              The person who identified the defect
Estimated hours to cure  The estimated number of developer hours required to cure the defect
Notes                    Any pertinent information about the defect that should be recorded and shared
Assigned to              The developer responsible for curing the defect
Date corrected           The date the defect is cured
Corrected by             The developer who cured the defect
Hours to correct         The actual number of developer hours required to cure the defect

Defects will be entered as they are discovered. Required entries will be the description, date
detected, detected by, phase created, phase detected, and the severity level. The owner is
considered the person responsible for curing the defect; the owner has the ability to delegate
the task to another.
Additionally, the owner must record the severity level, the phase created, and the estimated
hours to cure the defect. After a defect is considered cured by the owner, he must record the
date corrected, corrected by, and hours to correct for the defect. Once a corrected date is
entered, the defect is considered corrected.
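The data elements above map naturally onto a single tracking table. The following DDL is a
minimal sketch only: the project tracks defects in Excel or Team Foundation Server, and the
table and column names here are illustrative assumptions, not an actual project artifact.

-- Minimal sketch of a defect-tracking table mirroring the data elements above.
-- Illustrative only; actual tracking is in Excel or Team Foundation Server.
CREATE TABLE dbo.DefectLog
(
    DefectID             int IDENTITY(1,1) PRIMARY KEY,  -- sequential, beginning with 1
    DefectDescription    varchar(500)  NOT NULL,
    DateDetected         datetime      NOT NULL,
    PhaseDetected        varchar(30)   NOT NULL,  -- Analysis through Post Delivery
    PhaseCreated         varchar(30)   NOT NULL,
    Severity             tinyint       NOT NULL CHECK (Severity BETWEEN 1 AND 4),  -- 1 = Cosmetic, 4 = Critical
    DetectedBy           varchar(100)  NOT NULL,
    EstimatedHoursToCure decimal(6,1)  NULL,
    Notes                varchar(1000) NULL,
    AssignedTo           varchar(100)  NULL,
    DateCorrected        datetime      NULL,  -- once entered, the defect is considered corrected
    CorrectedBy          varchar(100)  NULL,
    HoursToCorrect       decimal(6,1)  NULL
)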


Appendix A – Defect Report Form

CMS MP Defect / Enhancement Report Form


Brief Description: _________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
____________________________

Application Area: ___ <Subject Area 1> ___ <Subject Area 5>
___ <Subject Area 2> ___ <Subject Area 6>
___ <Subject Area 3> ___ <Subject Area 7>
___ <Subject Area 4> ___ <Subject Area 8>

Reference #: _____________________ (for internal use)

Date Reported: _____________________

Reported By: _____________________

Defect or Enhancement?:
             ___ Defect (error found in existing application look & feel or processing)
             ___ Enhancement (request for new feature or functionality)

Importance:  ___ Severe (causing information to be lost, unable to use application)
             ___ Major (no data loss, but issue needs to be addressed quickly)
             ___ Minor (able to work around the problem, can wait reasonable time for fix)
             ___ Cosmetic (easy to work around problem, can be fixed in the future)

Comments: _________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________
_________________________________________________________

Appendix B – Use Case, Scenarios & Queries for ETL System Integration Testing

Use Case ID  Use Case Name  Primary Actor  Scope  Complexity  Priority
1 Load DimAssessment ETL in High 1
2 Load DimAssessmentAdministered ETL in High 2
3 Load DimAssessmentLevel ETL in High 3
4 Load DimAttendanceInfo ETL in High 4
5 Load DimClass ETL in High 5
6 Load DimCommonProgram ETL in High 6
7 Load DimCourse ETL in High 7
8 Load DimDate ETL in High 8
9 Load DimDemographic ETL in High 9
10 Load DimEntryExitInfo ETL in High 10
11 Load DimIncident ETL in High 11
12 Load DimProgram ETL in High 12
13 Load DimRelatedAssessment ETL in High 13
14 Load DimScenario ETL in High 14
15 Load DimSchool ETL in High 15
16 Load DimServiceSetting ETL in High 16
17 Load DimSnapshotInfo ETL in High 17
18 Load DimStaff ETL in High 18
19 Load DimStudent ETL in High 19
20 Load FactAssessmentCourseMap ETL in High 20
21 Load FactSchoolEnrollment ETL in High 21
22 Load FactStudentAssessment ETL in High 22
23 Load FactStudentClassAssessment ETL in High 23
24 Load FactStudentClassEnrollment ETL in High 24
25 Load FactStudentCourseAssessment ETL in High 25

Use Case Name: #19 Load DimStudents


Version: 1
Date: 9/11/2009
Author: Pattie Hale

Goal: Load ESIS data from ADMASSIST.STUDENTS to K12DW.DW.DimStudents

Summary: Using SQL 2008 SSIS, the ETL will connect to the Oracle source for ESIS, pull data
from ADMASSIST.STUDENTS, and load it to the SQL 2008 data warehouse table
K12DW.DW.DimStudents.

Actors: ETL, SQL 2008 Database K12DW, Oracle source for ESIS

Preconditions/Assumptions:
1. Connectivity to Oracle
2. ADMASSIST.Students has data
3. Connectivity to SQL 2008 CMS MP databases
4. Managing for Performance databases and tables have been created
a. K12CentralHub.Hub.Students
b. K12Import.Import.DimStudents
c. K12DWStage.DW.DimStudentsStage
d. K12DW.DW.DimStudents
e. Pamlico database fully created and populated
f. K12CentralHub.Annex fully created and annex tables loaded
5. ValueMappingHandleCode variable set to 0 so that records that don't match to a valid
value mapping will still be processed.

Triggers
1. Control package for Hub kicks off package CMS_Load_Hub_Students

Basic Course of Events:
1. Hub SSIS CMS_Hub_ETL_Load_Students runs to load data to K12CentralHub.Hub.Students
2. Import SSIS CMS_PerformanceDw_ETL_Import_DimStudents runs to load data to K12Import.Import.DimStudents
3. Stage SSIS CMS_PerformanceDw_ETL_Transform_DimStudents runs to load data to K12DWStage.DW.DimStudentsStage
4. DW SSIS CMS_PerformanceDw_ETL_Load_DimStudents runs to load data to K12DW.DW.DimStudents

Alternative paths
1. Data errors to Pamlico database table

Post conditions
1. Hub SSIS CMS_Hub_ETL_Load_Students completes successfully.
2. Data is loaded into K12CentralHub.Hub.Students
3. Import SSIS CMS_PerformanceDw_ETL_Import_DimStudents completes successfully.
4. Data is loaded in K12DWImport.Import.DimStudents
5. Transform SSIS CMS_PerformanceDw_ETL_Transform_DimStudents completes successfully.
6. Data is loaded in K12DWStage.DW.DimStudentsStage
7. Load SSIS CMS_PerformanceDw_ETL_Load_DimStudents completes successfully.
8. Data is loaded in K12DW.DW.DimStudents
9. ETL logging records in Pamlico database
10. Source table record count should equal or approximately equal target table record count.
11. Select source data for 5 students and compare to target data for 5 students.

Business Rules (a spot-check query for the type-2 rules follows this list):
1. StudentSISID = ESIS Student pupil_number
2. If last name changes, old record present and new record with new last name exists.
3. If demographic changes, old record present and new record with change exists.
4. If TownofResidence changes, old record present and new record exists.
5. If NativeClanTribeName changes, old and new records exist.
6. If NativeClanTribeNumber changes, old and new records exist.
7. If Immigrant status changes, old and new records exist.
8. If language code changes, old and new records exist.
9. If language type code changes, old and new records exist.
10. If marital status changes, old and new records exist.
11. If number of dependents changes, old and new records exist.
12. If Migratory status changes, old and new records exist.
13. ETL metadata fields should be populated appropriately.
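Business rules 2 through 12 describe type-2 dimension behavior: a change should leave the old
record in place and add a new one. A spot-check along the following lines can verify that
behavior for the five test students used elsewhere in this plan, assuming (as the queries later in
this appendix do) that StudentSISID identifies a student across record versions:

-- Spot-check type-2 behavior: a student whose tracked attributes changed
-- should carry more than one record version in the dimension.
select StudentSISID, count(1) as RecordVersions
from k12dw.dw.dimstudent
where StudentSISID in ('8588913','8299579','8239395','8172849','8647934')
group by StudentSISID
having count(1) > 1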

SQL Examples for DimStudent load:

----------------------------
--Preconditions/Assumptions:
-- Assumptions
--1. Connectivity to Oracle
--2. ADMASSIST.Students has data - SOURCE IMPORT QUERY for ORACLE ESIS

select count(1) from ADMASSIST.STUDENTS -- 238903 (on 9/15/2009)

-----------------------------------------------------------------------
--2. Connectivity to SQL 2008 CMS MP databases
--4. Managing for Performance databases and tables have been created
--a. K12CentralHub.Hub.Students
Use K12CentralHub
go
IF OBJECT_ID('[Hub].[Students]') IS NOT NULL
    PRINT '<<< TABLE [Hub].[Students] exists >>>'
ELSE
    PRINT '<<< TABLE [Hub].[Students] does not exist >>>'

--b. K12Import.Import.DimStudents
Use K12DWImport
go
IF OBJECT_ID('[Import].[DimStudents]') IS NOT NULL
    PRINT '<<< TABLE [Import].[DimStudents] exists >>>'
ELSE
    PRINT '<<< TABLE [Import].[DimStudents] does not exist >>>'

--c. K12DWStage.DW.DimStudentsStage
Use K12DWStage
go
IF OBJECT_ID('[DW].[DimStudentsStage]') IS NOT NULL
    PRINT '<<< TABLE [DW].[DimStudentsStage] exists >>>'
ELSE
    PRINT '<<< TABLE [DW].[DimStudentsStage] does not exist >>>'

--d. K12DW.DW.DimStudents
Use K12DW
go
IF OBJECT_ID('[DW].[DimStudents]') IS NOT NULL
    PRINT '<<< TABLE [DW].[DimStudents] exists >>>'
ELSE
    PRINT '<<< TABLE [DW].[DimStudents] does not exist >>>'

-----------------------------------------------------------------------
--e. Pamlico database fully created and populated
Use Pamlico
go
IF OBJECT_ID('[ETLAudit].[DataErrorSeverity]') IS NOT NULL
    PRINT '<<< TABLE [ETLAudit].[DataErrorSeverity] exists >>>'
ELSE
    PRINT '<<< TABLE [ETLAudit].[DataErrorSeverity] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLContainerExecution]') IS NOT NULL
    PRINT '<<< TABLE [ETLAudit].[ETLContainerExecution] exists >>>'
ELSE
    PRINT '<<< TABLE [ETLAudit].[ETLContainerExecution] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLControl]') IS NOT NULL
    PRINT '<<< TABLE [ETLAudit].[ETLControl] exists >>>'
ELSE
    PRINT '<<< TABLE [ETLAudit].[ETLControl] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLDataError]') IS NOT NULL
    PRINT '<<< TABLE [ETLAudit].[ETLDataError] exists >>>'
ELSE
    PRINT '<<< TABLE [ETLAudit].[ETLDataError] does not exist >>>'
IF OBJECT_ID('[ETLAudit].[ETLPackageExecution]') IS NOT NULL
    PRINT '<<< TABLE [ETLAudit].[ETLPackageExecution] exists >>>'
ELSE
    PRINT '<<< TABLE [ETLAudit].[ETLPackageExecution] does not exist >>>'

--f. K12CentralHub.Annex fully created and annex tables loaded
--5. ValueMappingHandleCode variable set to 0 so that records that don't match to a valid value mapping will still be processed.
select * from K12DWStage.ETL.SSISConfigurations where configurationFilter = 'ValueMappingErrorHandleFlag'
-----------------------------------------------------------------------
--Triggers
--1. Control package for Hub kicks off package CMS_Load_Hub_Students

-----------------------------------------------------------------------
--Basic Course of Events:
--1. Hub SSIS CMS_Hub_ETL_Load_Students runs to load data to K12CentralHub.Hub.Students
--2. Import SSIS CMS_PerformanceDw_ETL_Import_DimStudents runs to load data to K12Import.Import.DimStudents
--3. Stage SSIS CMS_PerformanceDw_ETL_Transform_DimStudents runs to load data to K12DWStage.DW.DimStudentsStage
--4. DW SSIS CMS_PerformanceDw_ETL_Load_DimStudents runs to load data to K12DW.DW.DimStudents

-----------------------------------------------------------------------
--Alternative paths
--1. Data errors to Pamlico database table
select * from Pamlico.ETLAudit.ETLDataError where TaskName in
('SEQ Extract Students','DFL Import DimStudent','DFL Transform DimStudent','DFL Load DimStudent')
-----------------------------------------------------------------------
--Post conditions
--1. Hub SSIS CMS_Hub_ETL_Load_Students completes successfully.
--2. Data is loaded into K12CentralHub.Hub.Students
Select count(1) from K12CentralHub.Hub.Students -- 238883

--3. Import SSIS CMS_PerformanceDw_ETL_Import_DimStudents completes successfully.
--Import Source Query
Use K12CentralHub
go
SELECT
M.STUDENTS_ID
,M.PUPIL_NUMBER
,M.SEX
,M.MINORITY_CODE_1
,(SELECT LunchStatusCode FROM HUB.NUTRITION WHERE SchoolStudentID =
M.PUPIL_NUMBER) AS FreeReducedLunchCategoryCode
,(SELECT TOP 1 LEP_CODE FROM HUB.vStudentLEPStatus WHERE PUPIL_NUMBER =
M.PUPIL_NUMBER AND GETDATE() BETWEEN START_DATE AND End_Date ORDER BY ID
DESC) AS LEPStatus
,(SELECT TOP 1 ESL_CODE FROM HUB.vStudentESLStatus WHERE PUPIL_NUMBER =
M.PUPIL_NUMBER AND GETDATE() BETWEEN START_DATE AND End_Date ORDER BY ID
DESC) AS ESLStatus
,(SELECT TOP 1 CASE EXTERNAL_CODE WHEN N'MCV' THEN N'Yes' ELSE N'No' END
FROM HUB.PROGRAM_ASSIGNMENTS A INNER JOIN HUB.PROGRAM_TYPES
P
ON PROGRAM = PROGRAM_TYPE
WHERE PUPIL_NUMBER = M.PUPIL_NUMBER AND EXTERNAL_CODE = N'MCV'
AND (A.END_DATE IS NULL OR A.END_DATE > GetDate())) AS
HomelessnessStatus
,(SELECT TOP 1 IS_504 FROM HUB.STUDENT_PLANS WHERE PUPIL_NUMBER =
M.PUPIL_NUMBER AND GETDATE() BETWEEN START_DATE AND End_Date ORDER BY ID
DESC) AS Is504
,M.SURNAME
,M.FIRST_NAME
,M.SECOND_NAME
,M.LAST_NAME_SUFFIX
,(SELECT TOP 1 SURNAME FROM Hub.PARENTS AS PS
WHERE PS.PUPIL_NUMBER = M.PUPIL_NUMBER
AND PARENT_TYPE = 'B') AS StudentMaternalLastName

,M.WITHDRAW_DATE
,M.REGISTERED_INDICATOR
,M.GRADE
,M.EMAIL_ADDRESS
,M.PHONE
,(SELECT MUNICIPALITY_DESC FROM HUB.ESIS_MUNICIPALITIES WHERE
MUNICIPALITY = M.MUNICIPALITY) AS LocationCity
,M.BIRTH_DATE
,M.STREET_NAME
,M.STREET_NUMBER
,M.APARTMENT
,M.POSTAL_CODE
,M.PROVINCE
,M.MAILING_ADDRESS
,M.SCH_ENTRY_DATE_AT_COUNTRY
,M.CREATE_DATE
,M.RELEASE_OF_INFO
,M.INDIAN_ANCESTRY_CODE
,(SELECT TRIBAL_DESC FROM HUB.ESIS_TRIBAL_CODES WHERE TRIBAL_CODE =
INDIAN_ANCESTRY_CODE) AS NativeClanTribeName
,(SELECT TOP 1 MIGRANT_WORKER FROM Hub.PARENTS AS PS
WHERE PS.PUPIL_NUMBER = M.PUPIL_NUMBER
AND MIGRANT_WORKER = N'Y') AS MIGRANT_WORKER
,M.COUNTRY_OF_CITIZENSHIP
,M.COUNTRY_OF_BIRTH
,M.LANGUAGE_CODE_MOST_USED
,M.FIRST_LANGUAGE_CODE
,M.LANGUAGE_CODE
,(SELECT LANGUAGE_DESC FROM Hub.LANGUAGE_CODES WHERE LANGUAGE_CODE =
M.LANGUAGE_CODE) AS LANGUAGE_DESC
,0 AS NumberofDependents
,(SELECT TOP 1 Calendar_Date FROM HUB.CMS_Calendar
WHERE School_Day = 180
AND School_Year = (SELECT CR_SCHOOL_YEAR / 10000 + 10000 *
(CR_SCHOOL_YEAR / 10000 -1)
FROM HUB.CR_YEAR_PARAMS)) AS LastDay
FROM Hub.STUDENTS AS M WITH(NOLOCK)
WHERE M.ModifiedDate BETWEEN '2009-01-01 12:55:24' AND '2009-12-31 12:55:00'
and pupil_number in (8588913,8299579,8239395,8172849,8647934)

--4. Data is loaded in K12DWImport.Import.DimStudents


Select count(1) from k12dwimport.import.dimstudent -- 238883

--5. Transform SSIS CMS_PerformanceDw_ETL_Transform_DimStudents completes successfully
--6. Data is loaded in K12DWStage.DW.DimStudentsStage
Select count(1) from k12dwstage.dw.dimstudentstage -- 238883

--7. Load SSIS CMS_PerformanceDw_ETL_Load_DimStudents completes successfully.
--8. Data is loaded in K12DW.DW.DimStudents
Select count(1) from k12dw.dw.dimstudent -- 238884

--9. ETL logging records in Pamlico database


select * from Pamlico.ETLAudit.ETLContainerExecution where
ContainerInstance = 'SEQ Extract Students'
select * from Pamlico.ETLAudit.ETLPackageExecution where PackageInstance
= 'CMS_Hub_ETL_Load_Students'
select * from Pamlico.ETLAudit.ETLPackageExecution where PackageInstance
= 'CMS_PerformanceDW_ETL_Import_DimStudent'
select * from Pamlico.ETLAudit.ETLPackageExecution where PackageInstance
= 'CMS_PerformanceDW_ETL_Transform_DimStudent'
select * from Pamlico.ETLAudit.ETLPackageExecution where PackageInstance
= 'CMS_PerformanceDW_ETL_Load_DimStudent'
select * from Pamlico.ETLAudit.ETLSSISLog where packageName in
('CMS_Hub_ETL_Load_Students','CMS_PerformanceDW_ETL_Import_DimStudent','CMS_PerformanceDW_ETL_Transform_DimStudent','CMS_PerformanceDW_ETL_Load_DimStudent')

--10. Source table record count should equal or approximately equal target table record count.
-- esis ORACLE source
select count(1) from ADMASSIST.STUDENTS -- 238903 (on 9/15/2009)
--dw target
Select count(1) from k12dw.dw.dimstudent -- 238884

--11. Select source data for 5 students and compare to target data for 5 students.
select * from k12dw.dw.dimstudent where StudentSISID in
('8588913','8299579','8239395','8172849','8647934')
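The individual count checks above can also be collected into a single comparison across the
load pipeline; the following is a sketch using only tables already named in this appendix (the
Oracle source count must still be captured separately over the ESIS connection):

-- Consolidated record-count comparison across the load pipeline.
select 'Hub' as Stage, count(1) as Records from K12CentralHub.Hub.Students
union all
select 'Import', count(1) from k12dwimport.import.dimstudent
union all
select 'Stage', count(1) from k12dwstage.dw.dimstudentstage
union all
select 'DW', count(1) from k12dw.dw.dimstudent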

Appendix C – Use Case, Scenarios & Queries for Reports System Integration Testing

Use Case ID  Use Case Name  Primary Actor  Scope  Complexity  Priority


1 EOG by Period as Teacher SSRS in Low 1
2 EOG by Student as Teacher SSRS in Low 2
3 EOG as Teacher SSRS in Low 3
4 EOC by Period as Teacher SSRS in Low 4
5 EOC by Student as Teacher SSRS in Low 5
6 EOC as Teacher SSRS in Low 6
7 Student Summary as Teacher SSRS in Low 7
8 Overall First Sheet as Teacher SSRS in Low 8
9 Incoming Student Info as Teacher SSRS in Low 9
10 Student Attendance as Teacher SSRS in Low 10
11 EOG by Period as Principal SSRS in Low 11
12 EOG by Student as Principal SSRS in Low 12
13 EOG as Principal SSRS in Low 13
14 EOC by Period as Principal SSRS in Low 14
15 EOC by Student as Principal SSRS in Low 15
16 EOC as Principal SSRS in Low 16
17 Student Summary as Principal SSRS in Low 17
18 Overall First Sheet as Principal SSRS in Low 18
19 Incoming Student Info as Principal SSRS in Low 19
20 Student Attendance as Principal SSRS in Low 20
21 EOG Goals as Teacher SSRS in Low 21
22 EOG Objectives as Teacher SSRS in Low 22
23 EOC Goals as Teacher SSRS in Low 23
24 EOC Objectives as Teacher SSRS in Low 24
25 LEP as Teacher SSRS in Low 25
26 K-2 Assessments as Teacher SSRS in Low 26
27 DIBELS as Teacher SSRS in Low 27
28 EOG Goals as Principal SSRS in Low 28
29 EOG Objectives as Principal SSRS in Low 29
30 EOG Goals as Principal SSRS in Low 30
31 EOG Objectives as Principal SSRS in Low 31
32 Overall First Sheet as LC SSRS in Low 32
33 Academic Summary as LC SSRS in Low 33
34 EOC by Course as LC SSRS in Low 34
35 EOC Course by School as LC SSRS in Low 35
36 EOG by Grade as LC SSRS in Low 36
37 EOG Grade by School as LC SSRS in Low 37
38 AYP as LC SSRS in Low 28


39 AYP HS by Demographics as LC SSRS in Low 29
40 AYP El-MS by Demographics as LC SSRS in Low 20
41 Teacher Info as LC SSRS in Low 21
42 Teacher HS Detail as LC SSRS in Low 22
43 Teacher El-MS Detail as LC SSRS in Low 23
44 Non-Academic Summary as LC SSRS in Low 24
45 School Comparison Indices as LC SSRS in Low 25
46 Schools at a Glance as LC SSRS in Low 26
47 EOC by Course as District SSRS in Low 27
48 EOC Course by LC as District SSRS in Low 28
49 EOG by Grade as District SSRS in Low 29
50 EOG by LC as District SSRS in Low 27
51 AYP as District SSRS in Low 28
52 AYP by LC as District SSRS in Low 29
53 Teacher Info as District SSRS in Low 21
54 Teacher HS Detail as District SSRS in Low 22
55 Teacher El-MS Detail as District SSRS in Low 23
56 Non-Academic Summary as District SSRS in Low 24
57 School Comparison Indices as District SSRS in Low 25
58 Schools at a Glance as District SSRS in Low 26

Use Case Name: #1 EOG by Period


Version: 1
Date: 9/16/2009
Author: Gary Miller

Goal: Display EOG assessment scores and growth values for all class periods
taught in the assessed course by a given teacher for the given grade level.

Summary: Data is retrieved from SQL 2008 tables housed in the Data Warehouse K12DW,
accessed exclusively through stored procedures.

Actors: SQL 2008 Database K12DW, SQL 2008 Reporting Services.

Preconditions/Assumptions:
1. Connectivity to K12DW server;
2. The stored procedures are current and available;
3. Connectivity to SQL 2008 CMS MP databases;
4. Managing for Performance databases and tables are populated.

Triggers: Manually selected in SharePoint Portal or Report Manager.



Basic Course of Events:


1. The user selects the report to run from the user interface.
2. Before the report will run, the user must enter specific criteria using
the parameter controls located at the top of the Report Viewer.
3. Once all criteria required to run the report are set, the user clicks the
now-enabled button “View Report” in order to run it.
4. The contents of the report will begin to display as soon as any data
returns from the stored procedure.

Alternative Paths: Should there be no data returned, the report will display a message
indicating that it found no records that match the selected criteria.

Post Conditions: None.

Business Rules: The user may view only his or her own data.

SQL for spEOGbyPeriod:

USE [K12DW]
GO

/****** Object: StoredProcedure [dbo].[spEOGbyPeriod] Script Date: 09/16/2009 16:35:54 ******/
IF EXISTS (SELECT * FROM sys.objects WHERE object_id =
OBJECT_ID(N'[dbo].[spEOGbyPeriod]') AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[spEOGbyPeriod]
GO

USE [K12DW]
GO

/****** Object: StoredProcedure [dbo].[spEOGbyPeriod] Script Date: 09/16/2009 16:35:54 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id =
OBJECT_ID(N'[dbo].[spEOGbyPeriod]') AND type in (N'P', N'PC'))
BEGIN
EXEC dbo.sp_executesql @statement = N'CREATE procedure [dbo].[spEOGbyPeriod]
(@SchoolID int, @StaffID int, @AcademicYear bigint, @AssessmentID int)

/***********************************************************
Purpose: returns class counts by period and course along
with average expected scale score, average EOG scale score.

Change Log:
Date Developer Change Description
20090507 Duane Simes Initial Version
***********************************************************/
AS
declare @tests table
(
SchoolID int,
StaffID int,
TestID int,
DateID int
)
insert into @tests
select distinct
EnrolledSchoolID,
StaffID,
AssessmentID,
DateID
from DW.FactStudentAssessment fsa
where EnrolledSchoolID = @SchoolID
and StaffID = @StaffID
and AssessmentID = @AssessmentID
and exists
(
select null
from DW.DimDate dd
where fsa.DateID = dd.DateID
and AcademicYear = @AcademicYear
)

declare @fact table


(
CourseID int,
ClassID int,
AssessmentID int,
StudentID int,
ScenarioID int,
ScaleScore decimal(9,6)
)

insert into @fact


SELECT
CourseID,
ClassID,
AssessmentID,
StudentID,
ScenarioID,
ScaleScore
FROM
DW.FactStudentAssessment sa
inner join
@tests t
on sa.EnrolledSchoolID = t.SchoolID
AND sa.StaffID = t.StaffID
AND sa.AssessmentID = t.TestID
and sa.ScenarioID is not null

select
CourseID,
Period,
CourseTitle,
Section,
StudentCount,
avg([ExpectedScore]/StudentCount) as [ExpectedScore],
avg([ActualScore]/StudentCount) as [ActualScore],
avg([ExpectedGrowth]/StudentCount) as [ExpectedGrowth],
avg([ActualGrowth]/StudentCount) as [ActualGrowth],
GrowthCount
FROM
(
SELECT
Period,
CourseID,
CourseTitle,
NULL AS SECTION,
COUNT( DISTINCT StudentID) AS StudentCount,
SUM(ExpectedScaleScore) AS [ExpectedScore],
SUM(ScaleScore) AS [ActualScore] ,
SUM(ExpectedGrowth) AS [ExpectedGrowth],
SUM(Growth) AS [ActualGrowth],
SUM (CASE
WHEN growth > 0 AND ScenarioCode = ''G''
THEN 1
ELSE 0 END) AS GrowthCount
FROM
(
SELECT
cl.ClassPeriod AS Period,
co.CourseID,
co.CourseTitle,
sa.AssessmentID,
sa.StudentID,
DimScenario.ScenarioCode,
CASE
WHEN DimScenario.ScenarioCode = ''ES''
THEN sa.ScaleScore
ELSE 0.00
END AS ExpectedScaleScore
,CASE
WHEN DimScenario.ScenarioCode = ''R'' THEN
sa.ScaleScore
ELSE 0.00
END AS ScaleScore
,CASE
WHEN DimScenario.ScenarioCode = ''EG''
THEN sa.ScaleScore
ELSE 0.00
END AS ExpectedGrowth
,CASE
WHEN DimScenario.ScenarioCode = ''G'' THEN
sa.ScaleScore
ELSE 0.00
END AS Growth
FROM
@fact sa
JOIN DW.DimScenario
ON sa.ScenarioID =
DW.DimScenario.ScenarioID
JOIN DW.DimClass cl
ON sa.ClassID = cl.ClassID
JOIN DW.DimStudent
ON sa.StudentID =
DW.DimStudent.StudentID
JOIN DW.DimCourse co
ON sa.CourseID = co.CourseID
) AS Driver
GROUP BY
CourseID,
Period,
CourseTitle
) AS eog
GROUP BY
CourseID,
Period,
CourseTitle,
Section,
StudentCount,
GrowthCount
ORDER BY CourseID, Period
'
END
GO
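Once created, the procedure can be exercised directly as a smoke test during system
integration testing. The parameter values below are illustrative placeholders, not actual CMS
IDs; a tester would substitute values known to exist in the QA data:

-- Smoke-test invocation of spEOGbyPeriod (parameter values are illustrative).
USE K12DW
GO
EXEC dbo.spEOGbyPeriod
    @SchoolID     = 600503,  -- a school present in DW.FactStudentAssessment
    @StaffID      = 12345,   -- a teacher at that school
    @AcademicYear = 2009,    -- an academic year with EOG results
    @AssessmentID = 3        -- the EOG assessment under test
GO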

Appendix D – Use Case, Scenarios & Queries for Portals System Integration Testing

Appendix E – Use Case, Scenarios & Queries for Performance Planning Application Systems Integration Testing
