CDCAAS SPMP
Version 1.0
Team Crime Busters
Sonal Verma, Brian H. Park, Akhil Pathania, John Kraus
SWE 625 - Software Project Management
Professor Ken Nidiffer
Submission Date: December 8, 2008
Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Kenneth Nidiffer
Revisions
| Revision Number | Date | Section Name | Description | Section Updated | Remarks |
| 0.1 | 9/11/2008 | Preface and 1.1 Project Overview | First draft | Preface and 1.1 | Decided project name and team name |
| 0.1 | 9/13/2008 | Preface and 1.1 Project Overview | Update document based on team review | Preface, 1.1.1 and 1.1.2 | |
| | | Project Description | Added figures | All | |
| | | Project Description | Updated the package description | 1.1.1 | |
| | | Requirement Estimate | First draft | 5.3 | |
| | | Process Model | First draft | 2.1 | |
| | | Productivity | First draft | 3.4.4 | |
| | | Dependencies | First draft | 5.2 | |
| | | Reference Materials | First draft | 1.4 | |
| | | Complete | Final draft | | |
Preface
The following preface is a short narrative on the purpose of the Crime Data Collection, Aggregation and Assimilation System (CDCAAS) project. The CDCAAS will give police officers and investigators the ability to quickly and conveniently acquire and assimilate, from a variety of diverse electronic sources, the information needed to help solve or prevent crimes and to track down and apprehend criminals. CDCAAS will be a valuable tool for investigators seeking to connect the dots in their efforts to protect the public and bring wrongdoers to justice. CDCAAS is the first in a new series of application software suites for System XYZ. System XYZ will be the next major product line for our company, making this project and this Software Project Management Plan (SPMP) extremely important to the continued success of the organization. Furthermore, the CDCAAS project embodies our company's ambitious set of strategic redirection actions: strengthening our responsiveness to customers, increasing competitiveness by reducing costs, building market share, and fielding quality products that meet or exceed customer requirements. Our recent market study concluded that the features offered by the proposed set of software packages for the new System XYZ computer system will be crucial in achieving our goals for market penetration. This conclusion has been borne out by the 700 advance orders already received for the CDCAAS.
Abstract
This Software Project Management Plan (SPMP) contains detailed documentation of the technical and management aspects of the various stages in the development of the Crime Data Collection, Aggregation and Assimilation System (CDCAAS). The document adheres to the specification provided by IEEE Standard 1058.1. It serves as the controlling document for organizing and managing resources so that they deliver all the work required to complete the project within the defined scope, time, and cost constraints. The document addresses the principal challenges involved in managing the project: meeting project deadlines and integrating the inputs needed to meet the predefined objective of providing a cost-effective, efficient, and automated way of assimilating data about criminals from one place rather than from many different sources. This document will be subject to revision as the project progresses.
Project Charter
Sponsor

Project Sponsor Information
Name: Susan Smith
Title: Vice President, New Software and Systems Engineering

Business Need:
The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) will provide enhanced functionality for more efficient searches of the crime data stored in different kinds of biometric databases. The system shall be used by law enforcement agencies countrywide. The primary goal is to provide users with a unified tool that can work on different platforms, understand all secure formats, and communicate over a broad range of media without compromising the stringent level of security specific to government agencies and other corporations. Secondary goals include reducing operational cost and increasing profit margin by providing stable and reliable software.

Business Benefits:
- System XYZ will allow the corporation to become the leading provider of Crime Data Collection, Aggregation and Assimilation software at all levels of government and in the commercial sector
- Continued improvement in the quality of life
- Improved overall data integrity and speed of data access to optimize decision making
- Faster and higher case-closing rates through more efficient gathering and analysis of crime-related data

Table 1: Project Sponsor Information
Major Deliverables

Product(s) or Service(s):
- Data Management
- Analysis and Statistics

Schedule Constraints and Assumptions

Planned Start Date: January 28, 2008

Schedule Assumptions: CDCAAS will be developed over the next 24 months, with many intermediate product releases during that period. The alpha test is scheduled on or before January 28, 2010. The schedule allocation is as follows:
- 24 months: CDCAAS development
- 2 months: alpha test at Fairfax, VA (in-house)
- 2 months: beta test at Chelmsford, MA (customer site)
- 2 months: rollout to customer sites, including installation and training
- 6 months: maintenance
Table 2: Major Deliverables
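The charter's dates can be cross-checked: the planned start of January 28, 2008 plus the 24-month development allocation lands exactly on the January 28, 2010 alpha-test deadline. A minimal sketch in Python (dates taken from the charter; the month arithmetic is a plain calendar walk):

```python
from datetime import date

start = date(2008, 1, 28)   # planned start date from the charter
dev_months = 24             # CDCAAS development allocation

# Advance by whole calendar months; safe here because the 28th
# exists in every month of the year.
months = start.month - 1 + dev_months
alpha_deadline = date(start.year + months // 12, months % 12 + 1, start.day)

print(alpha_deadline)  # 2010-01-28, the "on or before Jan 28, 2010" alpha date
```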
Schedule Constraints:
Staff Months      Cost
 93.31            $2,985,839.13
 41.14             1,316,356.25
174.39             5,580,610.84
185.41             5,933,252.92
 41.11             1,315,393.59
 98.27             3,144,759.78
 41.13             1,316,050.46
135.15             4,324,790.21
 41.09             1,315,009.16
102.50             3,279,969.71
 76.54             2,449,435.62
--------------------------------
Total: 1,030.05   $32,961,467.66
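The two columns are internally consistent with a single labor rate: dividing any cost by its staff-month figure gives about $32,000 per staff month, i.e. the plan's $200/hour labor cost times an assumed 160 staff hours per staff month (the 160 h/SM factor is inferred from the figures, not stated in the plan). A quick consistency check:

```python
# Staff-month and cost columns of the application-package portion of Table 5.
staff_months = [93.31, 41.14, 174.39, 185.41, 41.11, 98.27,
                41.13, 135.15, 41.09, 102.50, 76.54]
costs = [2985839.13, 1316356.25, 5580610.84, 5933252.92, 1315393.59,
         3144759.78, 1316050.46, 4324790.21, 1315009.16, 3279969.71,
         2449435.62]

RATE = 200.00 * 160  # $200/hour x 160 staff hours per staff month (inferred)

for sm, cost in zip(staff_months, costs):
    # Displayed staff months are rounded to two decimals, so allow
    # up to 0.005 SM x $32,000 = $160 of rounding slack per row.
    assert abs(sm * RATE - cost) < 160

# Column totals agree with the printed 1,030.05 SM / $32,961,467.66
# to within a cent of rounding.
print(round(sum(staff_months), 2), round(sum(costs), 2))
```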
System Kernel Software (assuming Kernel size is 20K SLOC)
COCOMO man-month effort = 7.39 * (KSLOC) ^ 1.2
Total Effort (Staff Months) = 269.08
Total Effort (Staff Hours) = 43,052.70
Labor cost per hour = $200.00
Kernel Cost = $8,610,540.45
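The kernel figures follow directly from the COCOMO equation above. The staff-hours line implies a conversion of 160 staff hours per staff month (43,052.70 / 269.08; this factor is inferred from the figures, not stated in the plan). A sketch reproducing the calculation:

```python
KSLOC = 20          # assumed kernel size, 20K SLOC
HOURS_PER_SM = 160  # staff hours per staff month, inferred from 43,052.70 / 269.08
RATE = 200.00       # labor cost per hour

effort_sm = 7.39 * KSLOC ** 1.2         # COCOMO man-month effort equation
effort_hours = effort_sm * HOURS_PER_SM
kernel_cost = effort_hours * RATE

print(round(effort_sm, 2))   # 269.08 staff months
# kernel_cost lands within a few dollars of the table's $8,610,540.45;
# the small residue is rounding carried through the source spreadsheet.
print(round(kernel_cost, 2))
```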
System XYZ Hardware
Equation 2: Cost per unit * No. of Units
Cost per unit = $1,500
No. of Units = 700
Hardware Cost = $1,050,000.00
Subtotal = $42,622,008.12
Table 5: Initial Cost Estimate for CDCAAS
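The subtotal is simply the application-package total plus the kernel and hardware line items; summing the printed figures reproduces it to within one cent (the source spreadsheet evidently summed unrounded intermediate values):

```python
packages = 32_961_467.66  # application software packages total
kernel = 8_610_540.45     # System Kernel software
hardware = 1_050_000.00   # System XYZ hardware: 700 units x $1,500

subtotal = packages + kernel + hardware
print(f"${subtotal:,.2f}")  # $42,622,008.11 -- one cent under the printed
                            # $42,622,008.12 because the line items are rounded
```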
Project Scope
The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) project provides police agencies such as the Federal Bureau of Investigation (FBI), Interpol, the Department of Homeland Security, law enforcement organizations, and corporations with a secure, efficient, and reliable mechanism for acquiring detailed crime-solving information in a cost-effective and time-saving manner. All CDCAAS software will be designed, implemented, purchased, or subcontracted during the life of this project. The project will extend the existing System XYZ software packages using several methods of implementation: three packages will be Commercial-Off-The-Shelf (COTS) products, three will leverage reusable components as building blocks, and five will be custom designed and developed. One of the five custom packages will be outsourced to Ivan Industries.

All sub-phases of the project, including requirements, preliminary and detailed design, code, and unit testing, will be conducted during the CDCAAS project. Subsequently, the CDCAAS project will conduct alpha and beta tests and lead the rollout of the software into production. Installation and user training will be provided by the CDCAAS project. Additionally, the project will provide 24-hour help-desk service and support for a six-month maintenance period after completing installations at each site.

Exclusions to the project include any connectivity or supporting hardware installations and delivery necessary for the CDCAAS software to function. Other notable exclusions include multi-language translation packages, the ability to edit CDCAAS documents using other software, staffing support for Ivan Industries, and extended support and maintenance plans.
CDCAAS Hardware (Notebook Configuration):
- Two Universal 2009-B microprocessors, 3.4 GHz
- 20.1 inch display
- A three-button mouse pointing device
- 4 GB of main memory (SDRAM)
- 16 MB video RAM
- 320 GB primary hard drive
- DVD/R/RW and CD-RW combo drive
- A 100 MB ZIP drive
- Integrated 2.0 MP camera
- Printer port
- Asynchronous port
- Integrated 802.11g wireless LAN
- Four USB ports
- 1394/FireWire connector
- A LAN interface card
- A 56,000 bps capable fax/modem
- Anti-virus/security suite
- Video card
- Speakers
- Headphones
- Laser printer
- 9-cell lithium-ion battery
- MP3 30 GB audio/video player
- A bar code scanner
- Personal Assistant Device (416 MHz, 1 GB RAM)
- 48-bit color flatbed scanner (2400 dpi optical resolution)
- System ABC enhanced keyboard
- A CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21 inch color display)
- Port replicator
- A stand for the CRT monitor
- Power connector
- Digital camcorder
- Internal speakers
- Wireless digital phone with voice mail messaging and internet service

CDCAAS Software Packages:
- CDCAAS Database Management System
- CDCAAS Spreadsheet
- CDCAAS Requirements and Configuration Management
- CDCAAS Secure Communication
- CDCAAS Graphics Presentation
- CDCAAS Word Processor
- CDCAAS Project Management
- CDCAAS GPS Navigation
- CDCAAS Compiler
- CDCAAS Debugger & Test
- CDCAAS Electronic Inventory and Tracking
Table of Contents
Revisions ..... ii
Preface ..... iii
Abstract ..... iv
Project Charter ..... v
Initial Cost Estimate for CDCAAS ..... vii
Project Scope ..... ix
Table of Contents ..... xi
Table of Figures ..... xiv
Index of Tables ..... xvii
1. Introduction ..... 1
1.1. Project Overview ..... 1
1.1.1. Project Description ..... 1
1.1.2. Product Summary ..... 10
1.2 Project Deliverables ..... 15
1.2.1 Software Applications, Computer Software Configuration Item (CSCI) ..... 15
1.2.2 Delivery Locations and Quantities ..... 16
1.2.3 Documentation ..... 17
1.2.4 Delivery Customer Acceptance ..... 17
1.3 Evolution of the Software Project Management Plan ..... 18
1.4 Reference Materials ..... 19
1.5 Definitions and Acronyms ..... 19
1.5.1 Definitions ..... 19
1.5.2 Acronyms ..... 22
2. Project Organization ..... 27
2.1 Process Model ..... 27
2.1.1 Process Milestones ..... 27
2.1.2 Baselines ..... 30
2.1.3 Reviews ..... 30
2.2 Organizational Structure ..... 33
2.3 Organization Boundaries and Interfaces ..... 40
2.4 Project Responsibilities ..... 42
2.4.1 Project Manager ..... 42
2.4.2 Assistant Project Manager ..... 42
2.4.3 Chief Software Developer ..... 42
2.4.4 Administrative Assistant ..... 42
2.4.5 System Engineer/Analyst ..... 42
2.4.6 Requirements Analysts ..... 42
2.4.7 Technical Team Leader ..... 42
2.4.8 Software Developers ..... 42
2.4.9 Testers ..... 43
2.4.10 Help Desk Technician ..... 43
2.4.11 Project Specialists ..... 43
3. Managerial Process ..... 57
3.1 Management Objectives & Priorities ..... 57
3.1.1 Goals & Objectives ..... 57
3.1 Management Objectives & Priorities ..... 58
3.1.1 Goals & Objectives ..... 58
3.1.2 Management Priorities ..... 60
3.1.2 Management Priorities ..... 61
3.2 Assumptions, Dependencies and Constraints ..... 62
3.2.1 Assumptions ..... 63
3.2.2 Dependencies ..... 63
3.2.3 Constraints ..... 63
3.2 Assumptions, Dependencies and Constraints ..... 64
3.2.1 Assumptions ..... 64
3.2.2 Dependencies ..... 64
3.2.3 Constraints ..... 65
3.3 Risk Management ..... 65
3.4 Monitoring and Controlling Mechanism ..... 67
3.4.1 Schedule ..... 67
3.4.2 Budget ..... 69
3.4.3 Quality Assurance ..... 70
3.4.4 Productivity ..... 72
3.4.6 Measures ..... 76
3.5 Staffing Plan ..... 79
3.5.1 Obtaining ..... 79
3.5.2 Training ..... 79
3.5.3 Retaining ..... 79
3.5.4 Phasing out of personnel ..... 80
3.5.5 Staff Expertise ..... 96
4. Technical Process ..... 98
4.1 Methods, Tools & Techniques ..... 98
4.2 Software Documentation ..... 101
4.3 Project Support Functions ..... 105
4.3.1 Configuration Management ..... 105
4.3.2 Quality Assurance ..... 107
4.3.3 Verification and Validation ..... 108
4.3.4 Test Evaluation ..... 109
4.3.4 Test and Evaluation ..... 109
5. Work Packages, Schedule, and Budget ..... 110
5.1 Work Packages ..... 110
5.1.1 Work Breakdown Structure ..... 110
5.1.2 Work Package Specifications ..... 116
5.2 Dependencies ..... 118
5.3 Resource Requirement ..... 136
5.3 Resource Estimate ..... 145
5.4 Budget and Resource Allocation ..... 159
5.5 Schedule ..... 162
Additional Components ..... 172
1.1. Subcontracting Process ..... 172
1.1.1 Selection of Subcontractors ..... 172
1.1.2. Coordinating with Subcontractors ..... 174
1.1.3. Integrating with Subcontractors ..... 174
1.1.4. Controlling Subcontractors ..... 175
2.1. Security Considerations ..... 175
3.1. Training Plans ..... 176
4.1. Alpha & Beta Test Plan ..... 176
4.1.1. Alpha Testing ..... 176
4.1.2. Beta Testing ..... 178
5.1. Installation & Training Plans ..... 178
6.1. Post Deployment Support Procedures ..... 179
Index ..... 180
Appendix I ..... 1
Detailed Resource Estimate Spreadsheet (23 columns plus intermediate calculations), attached ..... 1
Appendix II ..... 1
Detailed Resource Estimate Spreadsheet with formulas revealed, attached ..... 1
Appendix III ..... 1
Work Package Specifications ..... 1
Configuration Management ..... 1
Communication ..... 2
Graphic Presentation ..... 3
Word Processing ..... 4
Project Management ..... 5
Global Positioning System (GPS) ..... 6
Compile/Link/Runtime ..... 7
Language Independent Debugging and Testing ..... 8
Electronic Inventory and Tracking ..... 9
Binder Back Cover ..... 10
Table of Figures
Figure 1: Operational Architecture ..... 3
Figure 2: Network Architecture ..... 4
Figure 3: Product Technical Architecture ..... 5
Figure 4: Software Architecture ..... 6
Figure 5: Notebook Hardware Architecture ..... 7
Figure 6: Standalone Environment ..... 8
Figure 7: Mobile Client/Server Architecture Only ..... 8
Figure 8: Client/Server Architecture with Central Location ..... 9
Figure 9: Activities, Benchmarks and Success Indicators ..... 30
Figure 10: CDCAAS Organization Chart ..... 34
Figure 11: Software Division Organization Chart ..... 35
Figure 12: CDCAAS Project Organization Chart ..... 36
Figure 13: Analysis Design and Development Team ..... 38
Figure 14: CDCAAS Project Team Structure ..... 39
Figure 15: Program Manager Organizational Interfaces ..... 40
Figure 16: Project Manager Organizational Interfaces ..... 41
Figure 17: Responsibility Matrix Summary ..... 44
Figure 18: Total Project Responsibility Matrix ..... 45
Figure 19: Database Management System Package Responsibility Matrix ..... 46
Figure 20: Compiler Package Staffing ..... 81
Figure 21: GPS Navigation Package Staffing ..... 83
Figure 22: Graphics Package Staffing ..... 87
Figure 23: Total Project Staffing Chart ..... 89
Figure 24: Total Project Staffing by Package ..... 91
Figure 26: Document Production Process Flow Chart ..... 104
Figure 27: Work Breakdown Structure (Overall) ..... 110
Figure 28: Work Breakdown Structure (Secure Communication) ..... 111
Figure 29: WBS GPS Navigation ..... 111
Figure 30: WBS Database Management System ..... 112
Figure 31: WBS Spreadsheet ..... 113
Figure 32: Dependencies (Part 1) ..... 118
Figure 33: Dependencies (Part 2) ..... 119
Figure 34: Dependencies (Part 3) ..... 120
Figure 35: Dependencies (Part 4) ..... 121
Figure 36: Dependencies (Part 5) ..... 122
Figure 37: Dependencies (Part 6) ..... 123
Figure 38: Dependencies (Part 7) ..... 124
Figure 39: Dependencies (Part 8) ..... 125
Figure 40: Dependencies (Part 9) ..... 126
Figure 41: Phase Distribution (Project Overall) ..... 127
Figure 42: Phase Distribution (Project Plans & Requirements) ..... 127
Figure 43: Phase Distribution (Project Programming) ..... 128
Figure 44: Phase Distribution (Project Product Design) ..... 128
Figure 45: Phase Distribution (Project Integration & Test) ..... 129
Figure 46: Project Maintenance (Part 1) ..... 129
Figure 47: Project Maintenance (Part 2) ..... 130
Figure 48: Project Maintenance (Part 3) ..... 130
Figure 49: Project Maintenance (Part 4) ..... 131
Figure 50: Module Overall (Database Management) ..... 131
Figure 51: Module Plans & Requirements (Database Management) ..... 132
Figure 52: Module Programming (Database Management) ..... 132
Figure 53: Module Product Design (Database Management) ..... 133
Figure 54: Module Integration & Test (Database Management) ..... 133
Figure 55: Module Maintenance (Database Management Part 1) ..... 134
Figure 56: Module Maintenance (Database Management Part 2) ..... 134
Figure 57: Module Maintenance (Database Management Part 3) ..... 135
Figure 58: Module Maintenance (Database Management Part 4) ..... 135
Figure 59: Resource Loading Chart 1 ..... 136
Figure 60: Resource Loading Chart 2 ..... 136
Figure 61: Resource Loading Chart 3 ..... 137
Figure 62: Resource Loading Chart 4 ..... 137
Figure 63: Resource Loading Chart 5 ..... 138
Figure 64: Resource Loading Chart 6 ..... 138
Figure 65: Resource Loading Chart 7 ..... 139
Figure 66: Resource Loading Chart 8 ..... 139
Figure 67: Resource Loading Chart 9 ..... 140
Figure 68: Resource Loading Chart 10 ..... 140
Figure 69: Resource Loading Chart 11 ..... 141
Figure 70: Resource Loading Chart 12 ..... 141
Figure 71: Resource Loading Chart 13 ..... 142
Figure 72: Resource Loading Chart 14 ..... 142
Figure 73: Resource Loading Chart 15 ..... 143
Figure 74: Resource Work Summary Report Chart ..... 143
Figure 75: Resource Work Availability Report Chart ..... 144
Figure 76: Software Productivity (SLOC/SM) by Application Domains ..... 146
Figure 77: COCOMO Summary Screen ..... 149
Figure 78: COCOMO Product Parameters ..... 150
Figure 79: COCOMO Platform Parameters ..... 150
Figure 80: COCOMO Personnel Parameters ..... 151
Figure 81: COCOMO Project Parameters ..... 151
Figure 82: COCOMO Scale Parameters ..... 152
Figure 83: COCOMO Equation Parameters ..... 152
Figure 84: COCOMO Phase Distribution-Project Overall ..... 153
Figure 85: COCOMO Phase Distribution-Project Plans & Requirements ..... 153
Figure 86: COCOMO Phase Distribution-Project Programming ..... 154
Figure 87: COCOMO Phase Distribution-Project Product Design ..... 154
Figure 88: COCOMO Phase Distribution-Project Integration & Test ..... 155
Figure 89: Resource Estimate Spreadsheet Part 1 ..... 157
Figure 90: Resource Estimate Spreadsheet Part 2 ..... 158
Figure 91: CDCAAS System Master Schedule ..... 162
Figure 92: Debugger & Test Schedule ..... 163
Figure 93: CDCAAS Electronic Inventory & Tracking/Custom Development Detailed Schedule ..... 164
Figure 94: CDCAAS COTS Development Detailed Schedule ..... 165
Index of Tables
Table 1: Project Sponsor Information ..... v
Table 2: Major Deliverables ..... vi
Table 3: Key Staffing Requirements ..... vi
Table 4: Other Constraints and Assumptions ..... vi
Table 5: Initial Cost Estimate for CDCAAS ..... viii
Table 6: Profile of Typical Product Users ..... 13
Table 7: Software Applications, Computer Software Configuration Item (CSCI) ..... 16
Table 8: Delivery Locations and Quantities ..... 17
Table 9: Delivered Documentation ..... 17
Table 10: Spreadsheet Package Responsibility Matrix ..... 47
Table 11: Requirements & Configurations Package Responsibility Matrix ..... 48
Table 12: Secure Communications Package Responsibility Matrix ..... 49
Table 13: Graphics Extension Package Responsibility Matrix ..... 50
Table 14: Word Processor Package Responsibility Matrix ..... 51
Table 15: Project Management Package Responsibility Matrix ..... 52
Table 16: GPS Navigation Package Responsibility Matrix ..... 53
Table 17: Compiler Package Responsibility Matrix ..... 54
Table 18: Debugger and Test Package Responsibility Matrix ..... 55
Table 19: Electronic Inventory and Tracking Package Responsibility Matrix ..... 56
Table 20: Risk Management Description and Mitigation Strategies ..... 67
Table 21: Compiler Package Staffing ..... 82
Table 22: GPS Navigation Package Staffing ..... 84
Table 23: Secure Communication Package Staffing ..... 86
Table 24: Graphics Package Staffing ..... 88
Table 25: Total Project Staffing ..... 90
Table 26: Total Project Staffing by Package ..... 92
Table 27: Document Table ..... 104
Table 28: Work Breakdown Structure ..... 115
Table 29: Resource Estimate-Method 1 ..... 147
Table 30: Resource Estimate-Method 2 ..... 148
Table 31: Resource Estimate-Method 3 ..... 148
Table 32: Budget and Resource Allocation ..... 161
Table 33: Current Price-Breakeven Table ..... 168
Table 34: Decreased Price-Breakeven Table ..... 169
Table 35: Increased Price-Breakeven Table ..... 170
Table 36: Optimum Price-Breakeven Table ..... 171
Table 37: Training Plans ..... 176
1. Introduction
1.1 Project Overview

1.1.1 Project Description
Government law enforcement agencies face new challenges every day, and their dependence on technology is growing in proportion. The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) is a tool designed with enhanced functionality that will give these agencies an upper hand against criminals. Our objective is to deliver a unified tool that can work on different platforms, understand all secure formats, and communicate over a broad range of media without compromising the stringent level of security specific to government agencies and other corporations. CDCAAS will provide the corporate user with a pool of utilities that enables them to take control of various technical and managerial aspects of functionality, information control, data abstraction, easy access to information, and scheduling. The system will also be customizable as corporate needs change. The outcome of the project will be a unified system providing government agencies with all the information ever collected on the entity under investigation. Delivering the project on time with the committed capabilities will write a new chapter of success for us. Capabilities delivered will include access control, identification, and data mining. Delivering the system capabilities for the 700 advance orders and subsequent orders will give the corporation a foothold with financial stability. In addition, for each of the sites that have preordered systems we must install at least 50% of the systems and provide training on our application packages for 50 users. The enhancement for this application will involve producing additions to the following packages:
1. CDCAAS Database Management System - Custom Developed
2. CDCAAS Spreadsheet - COTS
3. CDCAAS Requirements and Configuration Management - Reuse
4. CDCAAS Secure Communication - Custom Developed
5. CDCAAS Graphics Presentation - COTS
6. CDCAAS Word Processor - Reuse
7. CDCAAS Project Management - COTS
8. CDCAAS GPS Navigation - Outsourced to Ivan Industries
9. CDCAAS Compiler - Reuse
10. CDCAAS Debugger & Test - Custom Developed
11. CDCAAS Electronic Inventory and Tracking - Custom Developed
This system is designed to be marketable to a wide variety of clients. It can be used at all levels of government, from federal organizations such as DHS, FBI, CIA, and DOD all the way down to local law enforcement. It can also be customized for implementation in any corporation, large or small, to provide a one-stop resource for managing information access and control. The wide customization capability of the CDCAAS system, in addition to its control, abstraction, access, and management of information, makes it a global performer and should make us a preferred choice the world over in the years to come. The Department of Defense publishes a set of guidelines known as Security Technical Implementation Guides (STIGs), which provide the security configuration baseline that systems must meet to be used within the Department of Defense. By following these guidelines, less time and money will be required for integration by our team and by our federal customers, increasing the competitiveness and marketability of our product.
Each of these functions will be built using the following packages:
General Purpose Database Package
Spreadsheet Package
Requirements and Configuration Management Package
Secure Communication Package
Graphics Presentation Package
Word Processing Package
Project Management Package
GPS Navigation Package
Compile/Link/Runtime Packages for BASIC, C, and C++
Language Independent Debugging and Testing Package
Electronic Inventory and Tracking Package
Database Package: This package will provide the user not only with the bare-bones functionality of designing tables but will also allow advanced users to import and export data in different files and formats. Supported by a high-end graphical interface, the package will allow users to manage the database in a fraction of the time taken by present-day systems. The package will be able to handle concurrent user connections, whether local or remote, and will be OLE, ODBC, and InnoDB compliant.

Spreadsheet Package: Provides a custom 'spreadsheet wizard' that will allow data analysts and other users to customize reports for information updates, identification, and alerts in generated reports. This package must be able to handle a minimum of one hundred concurrent user connections and must be OLE and ODBC compliant.

Requirements and Configuration Management Package: Its ability to manage information scheduling and provide contact information for users such as security personnel and points of contact eliminates the need for any other application to be used in conjunction. This must be OLE, ODBC, and InnoDB compliant.

Secure Communications Package: This package is designed to allow secure transfer of data packets from input devices to the system and between systems. Because security vulnerabilities increase every day, the package is fitted with a self-updating mechanism that monitors new threats and updates itself to counter them.

Information Exchange Management Package: This package gives the user a single point of access to all available information. It acts as a bridge between all the databases and allows conversion from one data format to another. This must be OLE, ODBC, and InnoDB compliant.

Graphics Presentation Package: A high-end graphical interface enables the user to spend less time on configuration and handling. The package is also capable of presenting generated reports in different graphical formats.

Word Processing Package: Provides generic macros and a custom 'word processor wizard' that will allow the user to create custom and preformatted reports based on data and logs in the system.
Project Management Package: Provides project managers with the ability to generate real-time information in the form of resource estimation and usage reports. It also generates the Work Breakdown Structure (WBS), Critical Path Method (CPM), Program Evaluation and Review Technique (PERT), and Gantt charts used to develop CDCAAS. This must be OLE, ODBC, and InnoDB compliant.

GPS Navigation Package: Enables the dependent modules to study the patterns generated from fetched information and present them in various graphical formats. This package must be OLE, ODBC, and InnoDB compliant.

Compile, Link, and Runtime Package: Provides the additional software libraries needed to build executables from the C, C++, and BASIC programming languages as well as Java and Flash. Also provides additional libraries for using the secure communications package to communicate over the network.

Language Independent Debugging and Testing Package: Provides wizards to perform usability and functionality testing through automation scripts. It also gives system administrators the ability to debug problems on the fly in production systems deployed in the field.

Electronic Inventory and Tracking Package: The user will be able to track hardware and any other movable equipment, log issues and comments, and generate any number of tracking statuses.
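As a concrete illustration of the PERT technique supported by the Project Management Package, the classic three-point estimate combines optimistic, most-likely, and pessimistic durations into an expected duration and a variance. The sketch below is illustrative only; the function name and the task figures are hypothetical, not part of the CDCAAS package.

```python
# Minimal sketch of the classic PERT three-point estimate.
# The function name and example figures are illustrative assumptions.

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
    """Return (expected duration, variance) per the standard PERT formulas."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    variance = ((pessimistic - optimistic) / 6) ** 2
    return expected, variance

# Example: a coding task estimated at 4 / 6 / 14 staff-days.
expected, variance = pert_estimate(4, 6, 14)
print(f"expected={expected:.1f} days, variance={variance:.2f}")
```

A scheduling tool would compute this per task and sum expected durations along the critical path.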
Data Analyst: An individual who has restricted ability to change existing data. An analyst enters and modifies biometrics and DNA data; this role is usually performed by a forensic biologist. Data analysts perform real-time searches to solve criminal cases.

Table 6: Profile of Typical Product Users
CDCAAS Notebook Bundled Devices:
Speakers
Headphones
Laser printer
9-cell lithium ion battery
MP3 30 GB audio/video player
Bar code scanner
Personal assistant device (416 MHz, 1 GB RAM)
48-bit color flatbed scanner (2400 dpi optical resolution)
System ABC enhanced keyboard
CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21-inch color display)
Port replicator
Stand for the CRT monitor
Power connector
Digital camcorder
Internal speakers
Wireless digital phone with voice mail messaging and Internet service
The following risk factors are only those assumed at the time of the current version; the list will be continually updated:
Inadequate requirements (e.g., unstable, conflicting, poorly defined)
Scope creep (e.g., continuing stream of additional requirements)
Unrealistic or dynamic schedules and budgets
Shortfalls in qualified personnel (e.g., unavailable when needed, novice when expert needed)
Delay in the procurement of software resources critical to project success
Budget overrun
Shortfalls in externally performed tasks (e.g., subcontractor failure to deliver, late delivery)
Shortfalls in externally furnished components (e.g., COTS software, hardware components)
Shortfalls in performance capability (e.g., inability of databases to scale up)
Risks for each task are documented in the monthly Program Management Reviews (PMRs). The risk tables include the identified risk; a qualitative assessment of whether the risk is considered low, medium, or high based on its likelihood of occurrence and impact on the project; mitigation activities; and whether the risk has changed status since the previous PMR. All participants in the PMR are responsible for reviewing the risks in their functional areas and updating them as necessary.
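The likelihood-and-impact rating described above can be sketched as a simple numeric lookup. The scale values and thresholds below are illustrative assumptions for the sketch, not the values used in the PMR risk tables.

```python
# Sketch of a qualitative risk rating: likelihood x impact mapped to
# low/medium/high. Scale values and thresholds are assumed for illustration.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(likelihood: str, impact: str) -> str:
    """Combine likelihood and impact into an overall low/medium/high rating."""
    score = LEVELS[likelihood] * LEVELS[impact]  # ranges from 1 to 9
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_rating("medium", "high"))  # score 2 * 3 = 6
print(risk_rating("low", "medium"))   # score 1 * 2 = 2
```

A real risk register would also carry the mitigation activity and status-change flag alongside each rating.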
1.2 Project Deliverables

1.2.1 Software Applications, Computer Software Configuration Item (CSCI)
Code | CDCAAS Software Product Name | Acquisition Type | Documentation Delivery Date | Function
CSCI-08-N01 | Database Management System | Custom | 28 January 2010 | Stores data on individuals and hardware
CSCI-08-N02 | Spreadsheet | COTS | 28 January 2010 | View and create spreadsheets
CSCI-08-N03 | Requirements and Configuration Management | Reuse | 28 January 2010 | Email and appointment management
CSCI-08-N04 | Secure Communication | Custom | 28 January 2010 | Facilitate secure connection to remote networks
CSCI-08-N05 | Graphics Presentation | COTS | 28 January 2010 | Generate and view graphical reports
CSCI-08-N06 | Word Processor | Reuse | 28 January 2010 | Generate and view written reports
CSCI-08-N07 | Project Management | COTS | 28 January 2010 | Generate and control program plans
CSCI-08-N08 | GPS Navigation | Outsourced to Ivan Industries | 28 January 2010 | Monitor and track hardware and individuals
CSCI-08-N09 | Compiler | Reuse | 28 January 2010 | Develop CDCAAS add-ons and plug-ins
CSCI-08-N10 | Debugger & Test | Custom | 28 January 2010 | Debug and test CDCAAS add-ons and plug-ins
CSCI-08-N11 | Electronic Inventory and Tracking | Custom | 28 January 2010 | Inventory and track hardware and individuals

Table 7: Software Applications, Computer Software Configuration Item (CSCI)
1.2.3 Documentation
Document | COTS | Custom | Re-Use | Out-Source
Requirements Specification
Detailed Design Documents
Documented Source Code
Test Plan and Test Cases
Test Results (including performance benchmarks)
Traceability Matrices
User Manuals
Training Manuals (including a Getting Started User's Guide)
Installation Instructions
Maintenance Guide
Version Description Document
available in final version for alpha testing on August 29, 2008. COTS documentation will be provided in the form it is available from the vendor(s). The Requirements Specification will detail how the COTS applications are expected to work, based on openly documented vendor claims, and how they will be integrated into the extension to CDCAAS. Change pages will be made for reuse documentation only for required changes made in the following applications: CSCI-08-N03, CSCI-08-N06, and CSCI-08-N09. Outsourced application and documentation for CSCI-08-N08 will be provided as available during the series of internal product releases. During the two-month beta test, all applications and documentation will be subject to signed-off customer acceptance based on the set of test performance benchmarks corresponding to the system requirements. The procedures for unmet performance benchmarks will be detailed in the Software Project Management Plan so that customer acceptance is achieved by the subsequent two-month customer roll-out at the remaining customer sites. Any critical performance benchmarks still unmet after customer roll-out will be handled on a customer-assigned priority basis with the appropriate expertise (our company working on custom and reuse software and integrating COTS and outsourced software) during the six-month maintenance period.
Deliverable
Stage 1: Project Definition, Project Charter, Product & Project Summaries
Stage 2: Management Objectives and Priorities; Assumptions, Dependencies and Constraints; Resource Estimate; Schedule Estimate
Stage 3: Process Model, Organizational Structure, Organizational Interfaces, Project Responsibilities
Stage 4: Technical Methods, Tools, and Techniques; Software Documentation; Project Support Functions; Staffing Plan
Stage 5: Monitoring and Controlling Mechanisms
Stage 6: Work Packages, Dependencies, Resource Requirements, Budget and Resource Allocation, Schedule
Stage 7: Risk Management, Reference Materials, Definitions and Acronyms
Stage 8: Additional Components, Index, Appendices, CDCAAS End
Baseline - A specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.

Beta Test - In software development, a beta test is the second phase of software testing, in which a sampling of the intended audience tries the product out. Beta testing is considered "pre-release testing."

Capability Maturity Model (CMM) - In software engineering, a model of the maturity of the capability of certain business processes. A maturity model can be described as a structured collection of elements that describe certain aspects of maturity in an organization, and it aids in the definition and understanding of an organization's processes.

Capability Maturity Model Integration (CMMI) - In software engineering and organizational development, a process improvement approach that provides organizations with the essential elements of effective process improvement. It can be used to guide process improvement across a project, a division, or an entire organization.

Constructive Cost Model (COCOMO) - An algorithmic software cost estimation model developed by Barry Boehm. The model uses a basic regression formula, with parameters derived from historical project data and current project characteristics.

Configuration Management - The process of identifying and defining the deliverable product set in a system, controlling the release and change of these items throughout the system life cycle, and recording and reporting the status of product items and change requests. Such information typically includes the versions and updates that have been applied to installed software packages and the locations and network addresses of hardware devices.

Cost - Calculated from the time variable in developing an internal project by multiplying time by the cost of the team members involved. When hiring an independent consultant for a project, cost will typically be determined by the consultant's or firm's hourly rate multiplied by an estimated time to complete.

Commercial Off-The-Shelf (COTS) - Software or hardware products that are ready-made and available for sale to the general public.

Detailed Design - The process of refining and expanding the preliminary design of a system or component to the extent that the design is sufficiently complete to be implemented. See also: software development process.

Global Positioning System (GPS) - A Global Navigation Satellite System (GNSS) developed by the United States Department of Defense. It is the only fully functional GNSS in the world. It uses a constellation of between 24 and 32 Medium Earth Orbit satellites that transmit precise microwave signals, which enable GPS receivers to determine their current location, the time, and their velocity.
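The COCOMO entry above refers to a regression formula; the following is a minimal sketch of Basic COCOMO in organic mode, using Boehm's published coefficients. The 32 KSLOC input is a made-up example, not an estimate taken from this plan.

```python
# Hedged sketch of the Basic COCOMO regression: effort = a * KSLOC^b
# person-months, schedule = c * effort^d months, with Boehm's organic-mode
# coefficients (a=2.4, b=1.05, c=2.5, d=0.38). The input size is illustrative.

def basic_cocomo_organic(ksloc: float):
    """Effort (person-months) and schedule (months) for an organic-mode project."""
    effort = 2.4 * ksloc ** 1.05      # person-months
    schedule = 2.5 * effort ** 0.38   # calendar months
    return effort, schedule

effort, months = basic_cocomo_organic(32)  # e.g., a hypothetical 32 KSLOC package
print(f"{effort:.0f} person-months over {months:.1f} months")
```

Intermediate and COCOMO II models refine this with cost drivers (ACAP, CPLX, RELY, and so on, as listed in the acronym table).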
Integration Testing - Testing performed to expose faults in the interfaces and in the interaction between integrated components.

Kernel - The core of any operating system (OS). The system loads the kernel into main memory, where it stays while other pieces of the OS move in and out of memory. The kernel controls all requests for disks, processors, and other resources.

Lifecycle - The process used to build the deliverables produced by the project. There are many models for a project lifecycle.

Milestone - A scheduling event that signifies the completion of a major deliverable or a set of related deliverables. A milestone, by definition, has a duration of zero and no effort; there is no work associated with a milestone. It is a flag in the work plan to signify that some other work has completed. Usually, a milestone is used as a project checkpoint to validate how the project is progressing. In many cases a decision, such as validating that the project is ready to proceed further, needs to be made at a milestone.

Outsource - Refers to a company that contracts with another company to provide services that might otherwise be performed by in-house employees.

Peer Review - A review of a software work product, following defined procedures, by peers of the producers of the product for the purpose of identifying defects and improvements.

Preliminary Design - The process of analyzing design alternatives and defining the architecture, components, interfaces, and timing and sizing estimates for a system or component.

Quality Assurance - A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.

Requirements - A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed document.

Scope - The way you describe the boundaries of the project. It defines what the project will deliver and what it will not deliver. High-level scope is set in your project definition (charter) and includes all of your deliverables and the boundaries of your project. The detailed scope is identified through your business requirements.

Security Technical Implementation Guides (STIG) - Department of Defense security guidelines for the configuration of COTS software and hardware.

Software Life Cycle Process - A method and standards for improving and mastering development processes, supporting processes, and management processes throughout the software lifecycle.

Spiral Model - The spiral model is a software development process combining elements of both design and prototyping-in-stages, in an effort to combine advantages
of top-down and bottom-up concepts. The spiral model is intended for large, expensive, and complicated projects.

Stakeholder - Specific people or groups who have a stake in the outcome of the project. Normally stakeholders are from within the company and may include internal clients, management, employees, administrators, etc. A project can also have external stakeholders, including suppliers, investors, community groups, and government organizations.

System Testing - Testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements.

Unit Testing - A method of testing that verifies that the individual units of source code are working properly. A unit is the smallest testable part of an application.

Virtual Private Network (VPN) - A computer network in which some of the links between nodes are carried by open connections or virtual circuits in some larger network (e.g., the Internet) instead of by physical wires. The link-layer protocols of the virtual network are said to be tunneled through the larger network. One common application is secure communications through the public Internet, but a VPN need not have explicit security features such as authentication or content encryption; VPNs can, for example, be used to separate the traffic of different user communities over an underlying network with strong security features.

Waterfall Model - A sequential development process in which development is seen as flowing steadily downwards (like a waterfall) through the phases of requirements analysis, design, implementation, testing (validation), integration, and maintenance.

Work Breakdown Structure - A work breakdown structure (WBS) is a tree structure that permits summing of subordinate costs for tasks, materials, etc., into their successively higher-level parent tasks, materials, etc. It is a fundamental tool commonly used in project management and systems engineering.

Work Package - Like a project plan in miniature, a work package is a subset of a project that can be assigned to a specific party for execution.
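The cost roll-up that a WBS permits can be sketched as a small tree in which each parent's cost is the sum of its children's. The node class and the figures below are hypothetical illustrations, not drawn from the CDCAAS WBS (Table 28).

```python
# Sketch of WBS cost roll-up: a parent's total cost is its own direct cost
# plus the rolled-up totals of all subordinate tasks. Names and costs are
# made-up examples.

from typing import Optional

class WbsNode:
    def __init__(self, name: str, cost: float = 0.0,
                 children: Optional[list] = None):
        self.name = name
        self.cost = cost              # direct cost, usually on leaf tasks
        self.children = children or []

    def total_cost(self) -> float:
        """Leaf cost plus the rolled-up cost of all subordinate tasks."""
        return self.cost + sum(child.total_cost() for child in self.children)

project = WbsNode("CDCAAS", children=[
    WbsNode("Design", children=[
        WbsNode("Preliminary Design", 40.0),
        WbsNode("Detailed Design", 60.0),
    ]),
    WbsNode("Coding", 120.0),
])
print(project.total_cost())  # 40 + 60 + 120
```

The same tree shape supports rolling up effort or schedule instead of cost.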
1.5.2 Acronyms
Acronym - Term
IM - Percentage of integration redone during reuse
3GL - Third-Generation Programming Language
AAF - Adaptation Adjustment Factor
AAM - Adaptation Adjustment Multiplier
ACAP - Analyst Capability (COCOMO)
ACM - Association for Computing Machinery
ACT - Annual Change Traffic
AEXP - Applications Experience (COCOMO)
ASLOC - Adapted Source Lines of Code
AT - Automated Translation
BPS - Bits per Second
BRAK - Breakage; the amount of controlled change allowed in a software development before requirements are unfrozen
C4I - Command, Control, Communications, Computers and Intelligence
CASE - Computer Aided Systems Engineering
CDCAAS - Crime Data Collection, Aggregation, and Assimilation System
CDRL - Contract Data Requirements List
CD-RW - Compact Disc Rewritable
CEO - Chief Executive Officer
CIO - Chief Information Officer
CM - Percentage of code modified during reuse
CM - Configuration Management
CMM - Capability Maturity Model
CMMI - Capability Maturity Model Integration
COCOMO - COnstructive COst MOdel
ConOps - Concept of Operations Document
CORBA - Common Object Request Broker Architecture
COTS - Commercial Off-the-Shelf
CPI - Cost Performance Index
CPLX - Product Complexity (COCOMO)
CSCI - Computer Software Configuration Item
CSTB - Computer Science and Telecommunications Board
DATA - Database Size (COCOMO)
DBMS - Database Management System
DI - Degree of Influence
DM - Percentage of design modified during reuse
DOCU - Documentation to match lifecycle needs
DRC - Direct Resource Code
DSI - Delivered Source Instructions
DVD - Digital Versatile Disc
DVD-R - DVD non-rewriteable format (read only)
DVD-RW - DVD rewriteable format (read and write)
EAF - Effort Adjustment Factor (COCOMO)
ECP - Engineering Change Proposal
ECR - Exploration Commitment Review
EDS - Electronic Data Systems
EIA - Electronic Industries Alliance
EM - Engineering Manager
ESLOC - Equivalent Source Lines of Code
EV - Earned Value
FCIL - Facilities
FCR - Foundation Commitment Review
FLEX - Development Flexibility (COCOMO)
FOUL - Follow-Ons Unlimited
FP - Function Points
GB - Gigabyte
GFS - Government Furnished Software
GHz - Gigahertz
GNU - GNU's Not Unix
GOTS - Government Off-the-Shelf
GPS - Global Positioning System
GUI - Graphical User Interface
H/W - Hardware
HWCI - Hardware Configuration Item
ICASE - Integrated Computer Aided Software Environment
IDE - Integrated Development Environment
IEEE - Institute of Electrical and Electronics Engineers, Inc.
INCOSE - International Council on Systems Engineering
ISMS - Information Security Management System
ISO - International Organization for Standardization
IT - Information Technology
ITIL - Information Technology Infrastructure Library
KASLOC - Thousands of Adapted Source Lines of Code
KEDSI - Thousands of Estimated Delivered Source Instructions (COCOMO)
KESLOC - Thousands of Equivalent Source Lines of Code
KSLOC - Thousands of Source Lines of Code
LAN - Local Area Network
LEXP - Programming Language Experience (COCOMO)
LTEX - Language and Tool Experience (COCOMO)
MHz - Megahertz
MIS - Management Information System
MLS - Multi-Level Secure
MM - Effort in Programmer-Months
MODP - Modern Programming Practices
MS - Master of Science Degree
NIST - National Institute of Standards and Technology
NDI - Non-Development Item
NOP - New Object Points
OCR - Operations Commitment Review
ODBC - Open Database Connectivity
OLE - Object Linking and Embedding
OPF - Open Process Framework
OS - Operating System
PCAP - Programmer Capability (COCOMO)
PCON - Personnel Continuity (COCOMO)
PDA - Personal Digital Assistant
PDIF - Platform Difficulty
PERS - Personnel Capability
PERT - Program Evaluation and Review Technique
PEXP - Platform Experience
PL - Product Line
PM - Person-Months, the unit of effort COCOMO II uses to express effort
PMAT - Process Maturity (COCOMO)
PMBOK - Project Management Body of Knowledge
PMI - Project Management Institute
PMP - Project Management Plan
PREC - Precedentedness (COCOMO)
PREX - Personnel Experience
PROD - Productivity Rate
PSM - Practical Software and Systems Measurement
PVOL - Platform Volatility
QA - Quality Assurance
RAM - Random Access Memory
RCPX - Product Reliability and Complexity
RELY - Required Software Reliability
RESL - Architecture / Risk Resolution (COCOMO)
RFID - Radio Frequency Identification
RFP - Request for Proposal
RUSE - Required Reusability (COCOMO)
RVOL - Requirements Volatility
SADT - Structured Analysis and Design Technique
SCAMPI - Standard CMMI Appraisal Method for Process Improvement
SCED - Required Development Schedule (COCOMO)
SDLC - Software Development Life Cycle
SECU - Classified Security Application
SEI - Software Engineering Institute at Carnegie Mellon University
SITE - Multi-Site Development (COCOMO)
SLA - Service Level Agreement
SLIM - Software Lifecycle Management (Larry Putnam)
SLOC - Source Lines of Code
SME - Subject Matter Expert
SNAFU - Situation Normal: All Fouled Up
SOA - Service Oriented Architecture
SOO - Statement of Operational Objectives
SOW - Statement of Work
SPI - Software Performance Index
SPICE - Software Process Improvement and Capability Determination, the ISO 15504 software capability assessment standard
SPMP - Software Project Management Plan
SQA - Software Quality Assurance
STOR - Main Storage Constraint
SU - Percentage of reuse effort due to software understanding
SwSE - Software Systems Engineering
T&E - Test and Evaluation
TEAM - Team Cohesion (COCOMO)
TIME - Execution Time Constraint
TOOL - Use of Software Tools (COCOMO)
TR - Technical Report
TURN - Computer Turnaround Time
UML - Unified Modeling Language
UNFM - Programmer Unfamiliarity
USAF/ESD - U.S. Air Force Electronic Systems Division
USB - Universal Serial Bus
USC - University of Southern California
V&V - Verification and Validation
VCR - Valuation Commitment Review
VEXP - Virtual Machine Experience
VIRT - Virtual Machine Volatility
VMVH - Virtual Machine Volatility: Host
VMVT - Virtual Machine Volatility: Target
WAN - Wide Area Network
WBS - Work Breakdown Structure
WITS - Worldwide Identity Tracking System
WP - Work Package
WWW - World Wide Web
2. Project Organization
2.1 Process Model
Phase-completion gates:
- Completion of Design Phase
- Completion of Sub-System Test Phase
- Completion of System Test Phase
- Conclusion of Alpha Deployment (gate includes customer sign-off)
- Conclusion of Beta Deployment (gate includes customer sign-off)
- Completion of Deployment Phase

Activity: Project Kickoff Meeting
Procedures: Administer and allocate resources among project members; draft the Software Project Management Plan (SPMP).
Exit conditions: Staff assigned, with office space, computers, and communications; management team agrees on the draft SPMP.

Activity: Systems Requirements Review (Requirement Specification)
Procedures: Define and analyze requirements for the custom, COTS, reuse/NDI, and outsourced packages; estimate resources; create and analyze use cases; review requirements for feasibility and testability; establish and document the allocated baseline; produce the final SPMP; start developing the test plan.
Exit conditions: Review closed and requirements baselined; customer agrees to and signs off on the baselined requirements; management signs off on the estimate and resource baseline and approves the final SPMP; management provides the necessary resources.

Activity: Systems Preliminary Design Review
Procedures: Complete the preliminary design for the COTS, custom, and reuse packages; hold a series of internal preliminary design reviews; get user agreement on the functional baseline.
Exit conditions: Preliminary Design Document (PDD) baselined and its review closed; functional specification baselined; team leaders agree on design templates, documentation templates, and the common functionality features to be documented for version 1.0; user agreement formally signed and archived.
Activity: Critical Design
Procedures: Design detailed system features, including the user interface, data structures, and the communication method between modules; conduct a review of the critical design artifact; establish the baseline for the Critical Design Document; get user agreement on the critical design baseline.
Exit conditions: Critical Design Document created; detailed design review closed; user agreement formally signed and archived.

Activity: Development
Procedures: Code and document the custom, COTS, and reuse packages.
Exit conditions: Software developed and under source control.

Activity: Sub-system test
Procedures: Test the COTS, custom, and reuse code, including the mature reuse prototype; test the outsourced package and all sub-system functionality; validate components and sub-system functionality.
Exit conditions: Test results documented and signed off by project management; user agreement formally signed and archived.

Activity: System test
Procedures: Test full functionality, based on user workflows, for all 11 packages; fix bugs found during integration testing and finalize baselines.
Exit conditions: Test results documented and signed off by project management; user manual and maintenance manual documented; product baseline established; user agreement formally signed and archived.

Activity: Deployment
Procedures: Complete alpha and beta testing; deploy the system at user sites; start help desk operation; begin user training; begin software maintenance activities.
Exit conditions: Customers sign off on completed deployment; system deployed at user sites; user training scheduled; help desk and maintenance operational.
Activity: Phase out
Procedures: Document best practices, lessons learned, and future marketing opportunities; discontinue automatic license upgrades; close out supporting contracts.
Exit conditions: Documentation submitted to management and to the corporate knowledge database; close-out licensing and contract information sent to the Finance Office, Contracting Office, and management personnel, and acknowledged.
2.1.2 Baselines
The essential idea of baselines is that, in order to reach a destination, it is necessary to know your starting point. In the CDCAAS project, the following three baselines will be used for each component during its waterfall development life cycle.
Functional Baseline: Describes a system's or item's functional characteristics and the verifications required to demonstrate the achievement of those specified functional characteristics.
Allocated Baseline: Describes the functional and interface characteristics of a Configuration Item (CI) that are allocated from those of a higher-level CI, and the verification required to demonstrate achievement of those specified characteristics.
Product Baseline: Defines the releasable contents of the project, including the application, test cases, test results, and system and user documentation, during the production, fielding/deployment, and operational support phases of the life cycle.
2.1.3 Reviews
Various internal and customer reviews will be conducted as scheduled in Section 5.5, and review comments and sign-offs will be documented for each review. Additional reviews may be conducted as necessary, or at the customer's request, to identify possible problems at an early stage. These reviews will be the Software Requirements Specification Review, Preliminary Design Review, Critical Design Review, the various Test Readiness Reviews, and the Deployment Readiness Review. Other activities that require customer review and sign-off will obtain them before the activity is executed.
level requirements that are developed between the User Requirements Document and the Functional Baseline. A Requirements Review will be conducted as a formal review; at the end of the review, the customer will sign off on and approve the baselined requirements. A re-review will be conducted if necessary.
2.1.3.11 Disposal
Disposal will be carried out according to company regulations, and all sensitive data will be destroyed securely upon approval from upper-level management.
The CDCAAS project is conducted by the Application Software Development branch under the Vice President of the Software Division. The organization's structure is shown below:
CDCAAS falls under the Application Software Development branch and has its own structure, shown below:
The basic CDCAAS project structure consists of a Project Manager plus an Administrative Support team that serves the entire project; these are considered overhead (i.e., not counted in the development-resource calculations). There are separate organizations for Software Quality and Control, Technical Support, and the Analysis, Design and Development Teams. Programmers are distributed across the development of 10 of the 11 application packages; development of the GPS Navigation Package is outsourced to Ivan Industries. The structure of the Analysis, Design and Development Team is given below. Team structures vary with the required effort (some teams require more developers than others) and with the kind of development: a Team Lead and a Senior Programmer are assigned to each of the three kinds of development (custom, COTS, and component), and the COTS and component teams each also get an expert.

Team structure of the Analysis, Design and Development Team:
- 3 Team Leaders
- 1 Requirements Analyst
- 1 Architect/Designer
- 1 System Engineer
- 3 Senior Programmers
- 15 Programmers
- 1 COTS Expert
- 1 Component Expert
Team structure of the Software Quality and Control Team:
- 1 CM Specialist
- 1 QA Engineer
- 4 Testers
Organization chart of the Analysis, Design and Development Team: Team Leaders, Senior Programmers, Requirements Analyst, Programmers, Architecture Designer, COTS Experts, System Engineer, Component Experts.
The diagram below depicts the organizational interfaces related to the CDCAAS Project Manager.
2.4.9 Testers
The testers will be responsible throughout the development cycle for developing and executing test procedures to ensure that the customer's requirements are met by the project packages.
Quality Assurance Specialist: Responsible for assuring that the software satisfies the test cases. Provides test matrices that map each requirement to a test case. Ensures the software is ready to transition through the test cycles and is ready for production.
Total Project Responsibility Matrix: staff months for each project function (Project Manager through QA Specialist) across the Requirements, Design, Coding and Unit Testing, and Integration Testing phases, with the percentage of COCOMO allocation per phase.
Figure 19: Database Management System Package Responsibility Matrix (staff months per function across Requirements at 12%, Design at 50%, Coding and Unit Testing at 20%, Integration Testing at 30%, and System Testing at 20% of the 70.67 staff-month COCOMO subtotal).
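The percentage-of-COCOMO allocation used in these responsibility matrices can be sketched as a small illustrative calculation. Design (50%), coding and unit testing (20%), and integration testing (30%) sum to 100% of a package's COCOMO subtotal, while requirements (12%) and system testing (20%) are allocated on top of it. The function and dictionary names below are our own, not part of the plan:

```python
# Phase allocation shares from the responsibility matrices. Design,
# coding/unit testing, and integration testing together make up 100% of
# the COCOMO effort subtotal; requirements and system testing are
# additional allocations computed against that same subtotal.
PHASE_SHARES = {
    "requirements": 0.12,
    "design": 0.50,
    "coding_and_unit_testing": 0.20,
    "integration_testing": 0.30,
    "system_testing": 0.20,
}

def allocate_phases(cocomo_staff_months):
    """Return staff months per lifecycle phase for one package."""
    return {phase: share * cocomo_staff_months
            for phase, share in PHASE_SHARES.items()}

# Database Management System package: 70.67 staff months (COCOMO subtotal).
dbms = allocate_phases(70.67)
```

For the Database Management System package this yields roughly 35.3 staff months of design and 21.2 of integration testing, matching the matrix figures to within rounding.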
2. Spreadsheet Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, and Integration Testing phases, with the percentage of COCOMO allocation per phase.
3. Requirements & Configurations Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
4. Secure Communications Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
5. Graphics Extension Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, and Integration Testing phases, with the percentage of COCOMO allocation per phase.
6. Word Processor Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, and Integration Testing phases, with the percentage of COCOMO allocation per phase.
7. Project Management Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
8. GPS Navigation Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
9. Compiler Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
10. Debugger and Test Package Responsibility Matrix: staff months per project function across the Requirements, Design, Coding and Unit Testing, Integration Testing, and System Testing phases, with the percentage of COCOMO allocation per phase.
Table 19: Electronic Inventory and Tracking Package Responsibility Matrix (staff months per function across Requirements at 12%, Design at 50%, Coding and Unit Testing at 20%, Integration Testing at 30%, and System Testing at 20% of the 57.99 staff-month COCOMO subtotal).
3. Managerial Process
3.1 Management Objectives & Priorities
Management's objectives and priorities for this project are to deliver a high-performance, reliable product that provides a unified tool for accessing crime data, and to manage the full process of extending the eleven software packages. The key management processes are schedule tracking, resource control, risk management, and budget control, which together ensure that we deliver a quality product on schedule and within budget. The primary priority is to deliver the project on schedule. As crime rates continue to increase, new technology such as CDCAAS, with its advanced data aggregation and assimilation tools, will become increasingly critical for city, county, and state organizations seeking to close crime-related cases faster than before. To achieve this, management will organize, staff, lead, and control the human, financial, and technical resources required to accomplish the project within the specified time, technical specifications, and budget. Management will also be responsible for ensuring that all risks are identified and that mitigation plans are developed and implemented.
Spend time and resources on research and development to identify new markets and users to penetrate with innovative technologies.
Keep resources up to date and maintain a vision of creating products with new, distinct features to stay ahead of competitors. Leverage the web in support of geographically distributed development.
advance order of 700 systems. The XYZ notebook computer with the CDCAAS software application suite will set a new benchmark for how a software application can enhance hardware capabilities and functionality.
3.2.1 Assumptions
The following assumptions are accepted as true for the development of the CDCAAS project:
- The Corporation will provide all necessary resources, including staff, software development tools, hardware, budget, and staff training.
- The Universal 2009-B microprocessor will be developed by Universal's microprocessor division and delivered by February 2009, as scheduled.
- The alpha test facility in Fairfax, Virginia will be ready for alpha testing by July 25, 2010.
- The kernel developed by our Universal Microprocessor Division will be backward compatible.
- Each software package will grow by approximately 20% in new features and 20% in size.
- The outsourced package will be delivered on schedule with acceptable quality.
- The COTS products will be delivered on time with the expected functionality and quality.
- Senior management will support the CDCAAS project fully and in a timely manner.
- Capability Maturity Model Integration (CMMI-DEV) Level 3 training will be provided by the Corporation to any staff member, if necessary.
3.2.2. Dependencies
The following dependencies exist for the CDCAAS project and need to be monitored in a timely manner to avoid schedule delays:
- The Universal 2009-B microprocessor will be developed and delivered by February 2009, as scheduled.
- The alpha test facility in Fairfax, Virginia will be ready for alpha testing by July 25, 2010.
- The COTS products will be delivered on time with the expected functionality and quality.
- The kernel software developed by our systems engineering department will be ready in time for scheduled integration testing.
- Ivan Industries will deliver the GPS Navigation System in time for the scheduled integration system test, with the expected functionality.
3.2.3 Constraints
The following constraints have been identified in order to produce an effective CDCAAS system:
- The CDCAAS system must operate on the new System XYZ computer system with acceptable performance, security, reliability, and functionality.
- The CDCAAS system must be fully compliant with federal regulations and standards for securing data and protecting personal information.
- The CDCAAS system must offer a specified level of auditability to protect data and ensure its accuracy.
- The CDCAAS system must provide a high level of concurrency and load balancing among all its deployments.
Maximize positive outcomes and minimize negative outcomes by closely monitoring all accepted risks. CDCAAS risks fall into the following categories:
- Personnel Risks
- Application Development Risks
- Technical Risks
- Contractual Risks
The table below lists the significant CDCAAS Project risks, the category each falls into, and the corresponding mitigation strategy.

R001 (Personnel): Shortage of technical personnel (programmers). Mitigation: programmers have been assigned to tasks with gaps between subtasks, so they can be reassigned to applications where a shortage of personnel arises.

R002 (Personnel): Shortage of management personnel (team leads). Mitigation: senior programmers with matching skill sets are expected to take on the duties of the team leads; programmers will be shuffled to cover the senior programmers' tasks.

R003 (Application development): Team members have limited experience with COTS and reusable software. Mitigation: team leaders and experts are sent for special training with the vendors or with experienced COTS and reusable-software developers; other programmers will be trained by the team leaders and experts; manuals will be created for programmers to consult for help and research.

R004 (Application development): The kernel is built on the new Universal 2006-G processor, and both the kernel and the processor are new to CDCAAS. Mitigation: request that the kernel be fully tested on the Universal 2006-G before it is furnished to CDCAAS; request full training on the kernel and on operating the Universal 2006-G; create a kernel and Universal 2006-G support service for team members.

R005 (Technical): Many CDCAAS packages will use a visualization tool, and it is possible that not all business logic components can be readily identified. Mitigation: applications that use visualization tools will be evaluated at regular intervals to determine whether any new or existing functionality can be categorized as a business logic component and whether the identified component can be offered as a service.
R006: There is a risk that not all presentation components of the CDCAAS packages are identified before integration. Mitigation: any presentation components identified after the start of integration will be evaluated to determine their criticality; critical components will be added during integration, and non-critical components will be omitted or deferred to later versions of CDCAAS.

R007 (Contractual): Product scope of work. Mitigation: the better defined the requirements, the lower the risks and the greater the opportunity to establish fixed time and cost parameters.

R008 (Contractual): The higher the concern about risk of failure, the higher the level of integrity required; this will be reflected in cost and schedule increases.

R009 (Contractual): Fixed-price contracts place the risk of software development on the shoulders of the supplier. Mitigation: either ensure that the requirements are well defined or insist upon a "time-and-materials" basis for compensation.
3.4.1 Schedule
The strategy for monitoring and controlling the CDCAAS schedule will be as follows:
- Create a planned schedule for use as a baseline, using COCOMO and bottom-up estimates from the WBS packages.
- Conduct a biweekly audit session to assess the health of the project.
- Identify inch stones to help indicate progress toward the milestones created by the Project Manager.
- Conduct weekly internal development team meetings to assess progress and have teams take self-correcting actions to address any variances.
- Conduct semi-monthly official team meetings with the CDCAAS Project Manager, Programming and Analysis Chief, and SQA Chief, during which actual work accomplished is determined and compared against planned work.
- Based on the comparisons made during the semi-monthly meetings, the CDCAAS PM takes appropriate action to address variances.
- Monitor outside vendor activities more closely than internal activities, and make corrections more swiftly.
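The baseline derivation mentioned above can be sketched with the basic COCOMO equations. The coefficients below are the published basic-COCOMO organic-mode values; the WBS package names and sizes are hypothetical.

```python
# Sketch of the schedule baseline: basic COCOMO (organic mode,
# published coefficients a=2.4, b=1.05, c=2.5, d=0.38) applied to a
# bottom-up sum of WBS package sizes. Packages and sizes are hypothetical.

WBS_PACKAGES_KSLOC = {
    "database_mgmt": 12.0,   # hypothetical size in KSLOC
    "secure_comm": 8.5,
    "word_processor": 20.0,
}

A, B = 2.4, 1.05   # effort (person-months) = A * KSLOC**B
C, D = 2.5, 0.38   # schedule (months) = C * effort**D

def cocomo_basic(ksloc: float) -> tuple[float, float]:
    """Return (effort in person-months, schedule in calendar months)."""
    effort = A * ksloc ** B
    return effort, C * effort ** D

total_ksloc = sum(WBS_PACKAGES_KSLOC.values())
effort_pm, schedule_months = cocomo_basic(total_ksloc)
print(f"{total_ksloc:.1f} KSLOC -> {effort_pm:.0f} person-months, "
      f"{schedule_months:.1f} months")
```

The resulting effort and schedule figures would become the baseline against which the earned value variances described below are measured.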
A master schedule for the CDCAAS project is presented in section 5.5. Work will be tracked at two levels: progress made in completing WBS work packages (micro level), and accomplishment of major milestones (macro level).
- Computing the earned value schedule variance (SV = BCWP - BCWS) by taking the difference between the budgeted cost of work performed (BCWP) and the budgeted cost of work scheduled (BCWS).
- Recording which major milestones have been accomplished or missed to date by the internal development teams.
- Recording which major milestones have been accomplished or missed to date by Ivan Industries, as compared with the milestone schedule specified in the subcontract.
- On schedule: earned value schedule variance within 5%. No corrective actions will be taken.
- Behind schedule: negative variance in excess of 5%. This will be recorded and discussed in the PM's semi-monthly meetings.
- Ahead of schedule: positive variance in excess of 5%. This will be recorded and discussed in the semi-monthly meetings.
- Treating a missed milestone as a negative indicator: the team will pursue achievement of the milestone within the next 30 days while keeping the remaining tasks on schedule; in case of failure, an additional 30 days will be provided.
- Shifting resources or taking other corrective actions as necessary.
Decisions on which course of action to pursue will be made by the Project Manager in consultation with the SQA Chief, the Admin Chief, and the Procurement Specialist (Ivan Industries).
3.4.2 Budget
The Project Manager is responsible for establishing the budget for the project, and the budget must be approved by the Program Manager of New Systems Development, the VP of the SW Division, and the Director of Finance. The Project Manager can modify the budget as needed to accomplish the project, and budget revisions must be approved by the same three parties. The CDCAAS budget will be monitored and controlled as follows:
- Personnel effort, financial performance, and equipment resources are the measures of the budget.
- Staff skills and salaries are reviewed against project requirements and the planned budget.
- Estimated project spending, facilities, equipment, and materials are subject to the planned budget.
- Microsoft Project and Microsoft Office will be used to plan the budget, capture actual effort, and estimate project completion.
- Staff cost is reported by the Payroll department; equipment and facilities costs are computed using actual costs or forecasts of future needs.
- An Earned Value variance of 5% is the baseline for comparing actual progress to budget: a variance within 5% is considered within budget, a positive variance in excess of 5% is under budget, and a negative variance in excess of 5% is over budget.
- COCOMO and bottom-up estimates from the WBS work packages are used in Gantt and milestone charts to establish a planned budget baseline and the inch-stones and milestones of the project.
- Budget is allocated to each project function and task to establish cost accounting for the project.
- Weekly internal development team meetings and semi-monthly official team meetings review costs, expenditures, inch-stones, and milestone progress against the budget.
- All costs relating to the project's activities, such as time cards, overhead reports, and purchase orders (equipment, software, materials), will be collected, compared with the budget, and discussed with development team members (the details of the discussions and reports are the same as in the schedule section 3.4.1 above).

In detail, the weekly internal development team meeting will review and assess progress on work packages, inch-stones, and milestones to determine whether the project is on, over, or under budget, and take corrective action if needed. In the semi-monthly meeting, with the CDCAAS Project Manager present, each team will present its budget status in detail, indicating the percentage of each work package completed against the budget plan. The earned value variance is used to evaluate whether actual progress is on budget. If the project is behind, the Project Manager will make the development team aware of the status; review staff resources, schedule, and equipment costs; and take appropriate corrective action, such as reducing purchases, increasing working time, or rescheduling the completion date. If the variance is more than 15% behind, the PM must report to the Program Director to seek other options.
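The 5% and 15% decision rules above can be sketched as a small classifier. The function name and dollar figures are illustrative, not part of the plan.

```python
def budget_status(bcwp: float, acwp: float) -> str:
    """Classify budget health from the earned value cost variance.

    CV = BCWP - ACWP. Per section 3.4.2: a variance within 5% of BCWP
    is within budget, a positive variance over 5% is under budget, a
    negative variance over 5% is over budget, and a variance more than
    15% behind is escalated to the Program Director.
    """
    cv = bcwp - acwp
    ratio = cv / bcwp          # variance as a fraction of BCWP
    if ratio < -0.15:
        return "over budget: escalate to Program Director"
    if ratio < -0.05:
        return "over budget: corrective action by PM"
    if ratio > 0.05:
        return "under budget"
    return "within budget"

print(budget_status(bcwp=100_000, acwp=103_000))   # variance within 5%
print(budget_status(bcwp=100_000, acwp=120_000))   # 20% behind
```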
In addition to the meetings, time frames and budget diagrams will be visible to the core team for reviewing project status. The daily kick-off team meeting (15 to 30 minutes) is an opportunity to identify issues, risks, and any other relevant information; to communicate progress; and to adjust and balance workloads among team members to meet the budget plan.
- The Software Quality Assurance Team is responsible for ensuring that peer and management reviews of the software design are conducted.
- The Project Manager is responsible for ensuring that a verifiable process is used to identify all action items generated during the review process.
- Audits conducted by the Software Quality Assurance Team ensure that all action items have been addressed.
1. The code has been tested and meets module specifications, except as noted.
2. Any changes to applicable software module design documents have been identified.
3. Appropriate validation tests have been run.
4. The functionality of the baseline is documented.
5. All software design documentation complies with the Software Quality Assurance plan.
3.4.3.5 Inspections
Inspections will be conducted to review requirements analysis results, requirements traceability matrices, preliminary designs, code, and test plans/cases/procedures.
- Participants in each inspection will be the team lead, a QA engineer, and testers.
- It is mandatory for the teams to familiarize themselves with the item to be inspected prior to the inspection.
- A defect log with a full description of the errors/problems found will be produced and inserted in the appropriate software development folder.
- The Software Quality Assurance Team will audit the defect log and the suggested solutions.
- Documented inspection records will be kept in a secure software development folder.
3.4.4 Productivity
- Tasks will be broken down into subtasks and work packages (when appropriate) using the work breakdown structure for each team's defined piece of work.
- Data will be collected on the amount of effort and time expended on each component by the in-house programming teams.
- Data from the outsourcing company's billing documentation will be reviewed against the work breakdown structure and the Earned Value metrics reported.
- Data presented by each team during periodic status reviews will be discussed and evaluated in terms of its effectiveness in helping gauge productivity.
Periodic status reviews will take place weekly or monthly depending on the requirements understanding, coding, debugging, and prototyping progress of the
software unit. The status of the software units will be monitored by periodic (bimonthly or monthly) audits and inspections.
The scheduled review dates for each CDCAAS component are:

| Component | Requirements Review | Preliminary Design Review | Critical Design Review | Subsystem Test Review | Deployment Review |
|---|---|---|---|---|---|
| Database Management System | 10/28/08 | | | | |
| Spreadsheet | 07/04/09 | | | | |
| Requirements & Configuration Management | 12/27/08 | | | | |
| Secure Communication | 10/25/08 | | | | |
| Graphics Presentation | 04/02/09 | | | | |
| Word Processor | 09/05/09 | | | | |
| Project Management | 01/10/09 | | | | |
| GPS Navigation | 06/04/09 | | | | |
| Compiler | 10/08/09 | | | | |
| Debugger & Test | 02/03/10 | 04/07/10 | 06/09/10 | 10/13/10 | 12/15/10 |
| Electronic Inventory & Tracking | 02/03/10 | 04/07/10 | 06/09/10 | 10/13/10 | 12/15/10 |
| Audience | Appropriate Metrics |
|---|---|
| Project Teams | Work effort distribution; estimated vs. actual task duration and effort; code covered by unit testing; number of defects found by unit testing; code and design complexity; product size |
| Development Organization | Work effort distribution; requirements status (number approved, implemented, and verified); percentage of test cases passed; estimated vs. actual duration between major milestones; estimated vs. actual staffing levels; number of defects found by integration and system testing; number of defects found by inspections; defect status; requirements stability; number of tasks planned and completed; released defect levels; product development cycle time; schedule and effort estimating accuracy; reuse effectiveness; planned and actual cost |

Table 29: Appropriate Metrics
3.4.5 Progress
The schedule, milestones, and reviews are good reference points for measuring progress. The customer will pay attention to the planned milestones and gates to be completed by their scheduled dates, and the schedule reviews during status meetings will
indicate whether these dates are being met. Management wants to know whether CDCAAS keeps budgeted expenditures within the expected variance range of actual expenditures. The team will conduct weekly progress status meetings to assess progress and problems. Once CDCAAS is in the system test phase, daily status meetings will be conducted to provide up-to-date project status among team members and to confirm that the target dates can be met at the current rate of progress. The Project Manager uses the metrics discussed in the subsections above to measure the project's progress. Schedules and budgets both play important roles, as does QA: the team must deliver not only on time and according to budget, but also the right system built the right way. By keeping an eye on productivity as well, and taking corrective actions when necessary, the team will ensure that CDCAAS delivers the correct system on time and within budget.
The project schedule will be updated after each status meeting to reflect the most current development progress.
- Budgeted Cost of Work Performed (BCWP) will be calculated automatically by Microsoft Project as percent-complete activity status is collected and entered into the appropriate fields.
- Budgeted Cost of Work Scheduled (BCWS) is maintained automatically by Microsoft Project as time passes, once a plan baseline has been saved within the software.
- Schedule Variance (SV) is calculated automatically by Microsoft Project as the BCWS and BCWP values change.
- Each measurement will be retrieved from Microsoft Project and published for the weekly/monthly status meetings.

During the project status meeting, the Project Manager will monitor current progress on the following items and afterwards update the schedule with the latest status:
- Which major milestones have been accomplished or missed to date by the internal development teams.
- Which milestones have been accomplished or missed to date by Ivan Industries, as compared with the milestone schedule specified in the subcontract.
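The BCWP calculation described above (each task's percent complete applied to its baseline cost) can be sketched as follows; the task names and dollar figures are hypothetical.

```python
# Sketch of the BCWP calculation: each task's earned value is its
# percent complete times its baseline (budgeted) cost, summed over
# all tasks. Task names and figures are hypothetical.

tasks = [
    # (task, baseline cost in $, percent complete)
    ("Design GPS module", 40_000, 100),
    ("Code GPS module", 60_000, 50),
    ("Unit-test GPS module", 20_000, 0),
]

bcwp = sum(cost * pct / 100 for _, cost, pct in tasks)
print(f"BCWP to date: ${bcwp:,.0f}")
```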
3.4.6 Measures
The following Earned Value measurements will be used to monitor project schedule progress:
- SV: Schedule Variance
- CV: Cost Variance
- BCWP: Budgeted Cost of Work Performed
- BCWS: Budgeted Cost of Work Scheduled
- ACWP: Actual Cost of Work Performed

Compute the Earned Value schedule variance using the formula SV = BCWP - BCWS, and the Earned Value cost variance using the formula CV = BCWP - ACWP.
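The two formulas can be applied directly; the dollar figures below are hypothetical.

```python
def earned_value_variances(bcwp: float, bcws: float, acwp: float) -> tuple[float, float]:
    """Return (SV, CV) per section 3.4.6: SV = BCWP - BCWS, CV = BCWP - ACWP."""
    return bcwp - bcws, bcwp - acwp

# Hypothetical month-end figures: negative SV means behind schedule,
# negative CV means over budget.
sv, cv = earned_value_variances(bcwp=70_000, bcws=80_000, acwp=75_000)
print(f"SV = {sv:,.0f}  CV = {cv:,.0f}")
```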
The attributes collected, the measures derived from them, and the decision indicators for each life-cycle phase are:

| Attribute | Phase | Collected | Derived | Decision Indicator |
|---|---|---|---|---|
| Schedule | Requirements, Design, Int. Test, System Test | Work completed (WP done to date) | BCWP - BCWS | SV over 5% of BCWP |
| Schedule | User Operations | Help desk use (hours of support used to date) | Actual / expected use | Over 5% more use than expected |
| Budget/Cost | All phases | Expended dollars (actual cost to date) | BCWP - ACWP | CV over 5% of BCWP |
| Quality | Requirements | Requirement volatility (requirements changed to date) | New reqmts / orig. reqmts | Over 5% new requirements |
| Quality | Design | Complexity of design (design complete to date) | Design complete / design expected | Over 5% less than expected |
| Quality | Int. Test, System Test | Defects detected (defects reported to date) | Defects / SLOC | Over 5 defects per 1000 SLOC |
| Quality | User Operations | Defects reported (defects reported to date) | Defects / SLOC | Over 5 defects per 1000 SLOC |
| Productivity | Requirements | Reqmts completed (reqmts completed to date) | Reqmts completed / reqmts expected | 5% less complete than expected |
| Productivity | Design | Design completed (design completed to date) | Design completed / design expected | 5% less complete than expected |
| Productivity | Code | SLOC completed (SLOC complete to date) | SLOC complete / SLOC expected | 5% less complete than expected |
| Productivity | Int. Test, System Test | Rework required (recoding required to date) | Amt. new code / original code | Over 10% redone |
| Productivity | User Operations | Trouble tickets received (tickets received to date) | Tickets received / time | Over 10 tickets received per day |
| Progress | Requirements, Design, Code | Schedule projection met | | |
| Progress | User Operations | Sites installed; customer complaints (reported to date) | Customer complaints reported / time | Over 10% less installed in the expected time; customer complaints increasing over time; customer satisfaction |
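The "over 5 defects per 1000 SLOC" decision indicator used in the test-phase rows above can be sketched as a simple check; the defect counts and SLOC figures are hypothetical.

```python
def defect_density_exceeded(defects_reported: int, sloc: int,
                            threshold_per_ksloc: float = 5.0) -> bool:
    """True when defects per 1000 SLOC exceed the decision-indicator
    threshold (5 per KSLOC in the measurement table)."""
    return defects_reported / (sloc / 1000) > threshold_per_ksloc

print(defect_density_exceeded(defects_reported=120, sloc=30_000))  # 4.0 per KSLOC
print(defect_density_exceeded(defects_reported=200, sloc=30_000))  # ~6.7 per KSLOC
```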
Collection and verification of each measured attribute:

| Attribute Characteristic | Schedule | Budget/Cost | Quality | Productivity | Progress |
|---|---|---|---|---|---|
| Attribute collected | Work completed (work packages completed to date) | Expended dollars (expended dollars to date) | Defects detected (defects detected to date) | Rework required (rework required to date) | Customer satisfaction |
| Source of data | Development teams: team status reports and timesheets | Development teams: timesheets and purchase requests | Development and testing teams: defect reporting tool, peer and quality reviews | Development and testing teams: defect reporting tool, source control system change reports | Customer surveys and help desk: customer satisfaction survey reports, complaint and resolution turnaround time reports, customer satisfaction management reports |
| Collection frequency | Semi-monthly | Semi-monthly | Semi-monthly | Weekly | Monthly |
| Who verifies the attribute | QA Manager | Assistant PM; Payroll and Accounting Department Manager | Software Quality Assurance Team; Testing Team Manager | Software Quality Assurance Team; Testing Team Manager; Development Team Lead | Project Manager; Assistant PM; Support Team Manager |
3.5.2 Training
Training includes all activities designed to enhance the competencies of the project team members. Training is both formal and informal; methods include classroom, online, and computer-based training, on-the-job training from another project team member, mentoring, and coaching. If project team members lack necessary management or technical skills, such skills can be developed as part of the project work. Scheduled training takes place as stated in the staffing management plan. Unplanned training takes place as a result of observation, conversation, and project performance appraisals conducted while managing the project team.
3.5.3 Retaining
Retention is as important as recruiting, perhaps even more important given the amount of time and training invested in our team. We invested the time to build the
team; now we must create the strategy to keep them and to improve their performance. Loyal teams make the difference during tough times. There are, undoubtedly, hundreds of ideas and techniques for retaining key employees and team members; the list below is a selection of good ones, chosen with our company's strategic goals and the people attempting to meet those goals in mind.
- Communicate: keep team members informed.
- Stay visible: team members feel more confident when they know the leader is available for support.
- Feedback: provide it often and ask for it; keep an open mind and consider their suggestions.
- Show appreciation for good work: reward our key players as often as possible. People generally won't work for people who don't care about them, so we make clear that we truly care about our staff, their careers, and their goals.
- Provide dual-career ladders: offer promotions along with salary increases and better working conditions.
CDCAAS CORPORATION
Senior Programmers

CDCAAS Corp. is seeking Senior Programmers in C, C++, and Java with full-life-cycle experience for new product development in the Fairfax area.

Job Description: Responsible for performing requirements analysis, design, coding, unit testing, integration testing of software, and personal configuration management in a CMMI environment.

Required Qualifications:
- Minimum Education: BS in CS or SW Engineering.
- Minimum Experience: Candidates must have 5 years' experience developing in C, C++, Visual Basic, and Java, and experience with GUI design; low-level Windows systems programming/device driver development and intimate knowledge of Windows internals are required, as are knowledge of relational database concepts (e.g., ERDs) and significant experience programming in .NET technology; experience using COTS tools and reusable software is a plus.

Required Skills:
- GUI design and programming experience.
- Proficiency in C/C++, Java, UML, and HTML.
- Experience in database programming, MS Project, MS Office, and Visio.
- Experience with COTS products and reusable software is a plus.
- Understanding of network management and correlation products is highly desirable.
- Excellent written and oral communication skills and the ability to work with people at every level.

Join CDCAAS Corp. to get your career on the fast track. We'll work together to determine a suitable benefits package. We offer options to our technical professionals that can include a health plan, 401k, vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, Virginia. Out-of-area candidates will be considered for this opportunity.
CDCAAS CORPORATION
Business Analyst

CDCAAS Corp. is looking for a Business Analyst to support our professional services and product management initiatives.

Responsibilities:
- Interact with business and technical teams to support a growing customer base.
- Apply in-depth knowledge of the application software to respond to inquiries from customers and internal staff.
- Interact with customers who are users of the application software.
- Track and monitor the status and progress of customer issues.
- Collect and analyze customer feedback.
- Assist with reviewing and providing input to business requirements, and analyze product features/functions.
- Work with engineering to understand technical issues relating to product and business requirements.
- Assist with the development and execution of application testing.
- Facilitate the management of client needs, tasks, and expectations.
- Provide input to and assist in the maintenance of user product documentation, including functional specifications, process flows, and business rules.

Qualifications:
- Excellent verbal and written communication skills.
- Strong organizational skills.
- Ability to learn quickly and be effective in a fast-paced environment.
- Strong analytical and problem-solving skills.
- A great deal of initiative and team spirit.
- Knowledge of rule engines and the mortgage or financial industry preferred.
- 1-3 years of relevant job experience in IT product development; BA or BS degree (Accounting, Information Systems, Economics, or Business preferred).
CDCAAS CORPORATION
Reusable Components Expert

CDCAAS is seeking an expert in reusable components implementation with full-life-cycle experience for new product development in the Fairfax area.

Job Description: Works with the software development team and Project Manager to define design requirements and additional system requirements, design or prototype the application software interface, and design common modules and reusable components.

Required Qualifications:
- Minimum Education: BS in CS or SW Engineering.
- Minimum Experience: Must demonstrate 4 years' experience in the analysis and validation of reusable software and hardware components, experience with software design and implementation, expertise in .NET, C/C++, Java, and XML, and the ability to work comfortably and effectively with operational users, program managers, and software engineers.

Required Skills:
- GUI design and programming experience.
- Experience in database programming, MS Office, MS Project, and WBS Chart Pro.
- Experience with COTS products and reusable software.
- Significant experience programming in .NET Programmable Interop Assemblies (NPI) is a plus.

Join CDCAAS to get your career on the fast track. We'll work together to determine a suitable benefits package. We offer options to our technical professionals that can include a health plan, 401k, vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, Virginia. Out-of-area candidates will be considered for this opportunity.
CDCAAS CORPORATION
Junior Programmers

CDCAAS is seeking Junior Programmers for a new development effort in the Fairfax area.

Job Description: Works with the software development team and Project Manager to develop code and support the full life cycle of a new product development.

Required Qualifications: Associate degree in CS with 2 years' experience, or BS in CS or SW Engineering; project- and detail-oriented; self-reliant self-starter; team player; creative problem-solving skills.

Required Skills:
- Strong knowledge of MS Office, Visio, MS Outlook, HTML, and UML.
- Knowledge of .NET technology, Rational Rose, or Visual Basic is a plus.

Join CDCAAS Corp. to get your career on the fast track. We'll work together to determine a suitable benefits package. We offer options to our technical professionals that can include a health plan, 401k, vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, VA. Out-of-area candidates will be considered for this opportunity.
| Job Title | Level (I-VII) | Degree/Major | Experience (years) | Salary ($) | Bid Price ($) | Labor Category | Source |
|---|---|---|---|---|---|---|---|
| Project Manager | IV | MS Computer Science | 10 | 110,000 | 330,000 | 50 | Transfer |
| Asst Project Manager | III | MS Software Engineering | 7 | 87,000 | 260,000 | 44 | Transfer |
| Business Analyst | II | MS MIS | 5 | 68,000 | 204,000 | 178 | Advertise |
| Architecture Designer | III | BS Computer Engineering | 5 | 68,000 | 204,000 | 172 | Transfer |
| Job Title | Level (I-VII) | Degree/Major | Experience (years) | Salary ($) | Bid Price ($) | Labor Category | Source |
|---|---|---|---|---|---|---|---|
| Software Engineering Team Lead (3) | III | MS Software System Engineering | 6 | 84,000 | 252,000 | 167 | Transfer/Advertise |
| Senior Programmer Analyst (3) | III | BS Computer Science | 5 | | 228,000 | 190 | Advertise/Transfer |
| COTS Expert | II | BS Computer Science | 4 | | 150,000 | 120 | Advertise |
| Computer Expert | II | BS Computer Science | 4 | | 150,000 | 121 | Advertise |
| Junior Programmers (15) | I | BS Computer Science | 2 | | 135,000 | 191 | Advertise/Transfer |
| QA Engineers (2) | II | MS Information Systems | 8 | | 330,000 | 100 | Transfer |
| Tester (4) | II | BS Computer Science | 2 | | 180,000 | 152 | Transfer |
| CM Specialist | II | BS Information Systems | 6 | | 252,000 | 155 | Transfer |
| Training Specialists | I | BS Information Systems | 2 | | 150,000 | 43 | Transfer |
| Installation Specialist | II | BS Electrical Engineering | 5 | | 141,000 | 49 | Transfer |
| Document Specialist | II | BS Management Information Systems | 5 | | 120,000 | 54 | Transfer |
| System Engineers | IV | BS Computer Science | 7 | | 204,000 | 123 | Transfer |
| Network Support Specialist | | BS Electrical Engineering | | | 174,000 | 175 | Transfer |
| Quality Assurance Specialist | | Management Information Systems | | | | 175 | Advertise |
4. Technical Process
4.1 Methods, Tools & Techniques
The following outline lists the applicable tools, methods, and standards, with the understanding that local interpretations of and deviations from those standards may apply during each phase and for each type of software application (custom, reuse, COTS, and outsourced). The actual methods and standards will vary as deliverables proceed through each phase of the development life cycle for each component over the entire software life cycle. The outline serves as a governing guide to currently understood best practices and standards.

1. Requirements Processes
   1.1. Applicable Tools
        1.1.1. Visual Paradigm for UML - Requirements Management Package
        1.1.2. Visual Paradigm for UML - UML Support Package
   1.2. Methods
        1.2.1. UML
        1.2.2. Use Cases
        1.2.3. Throw-away prototypes
        1.2.4. Interactive Storyboards
   1.3. Standards
        1.3.1. IEEE 830, Recommended Practice for Software Requirements Specifications
        1.3.2. IEEE 1062, Recommended Practice for Software Acquisition
        1.3.3. IEEE 1420, Standard for Information Technology - Software Reuse
2. Design
   2.1. Tools
        2.1.1. Visual Paradigm for UML - UML Support Package
        2.1.2. Visual Paradigm for UML - Database Modeling Package
        2.1.3. Visual Paradigm for UML - Object-Relational Mapping Package
        2.1.4. CASE Tools
   2.2. Methods
        2.2.1. Object-Oriented Design
        2.2.2. Information Hiding
        2.2.3. Relational Database Design
        2.2.4. UML
        2.2.5. XML
   2.3. Standards
        2.3.1. IEEE 1016, Recommended Practice for Software Design Descriptions
        2.3.2. European Computer Manufacturers Association (ECMA) TR/55, Reference Model for Frameworks of Software Engineering Environments
        2.3.3. IEEE 1348, Recommended Practice for the Adoption of CASE Tools
        2.3.4. IEEE 1420, Guide for Information Technology - Software Reuse - Concept of Operations for Interoperating Reuse Libraries
3. Code
   3.1. Tools
        3.1.1. JAVA
        3.1.2. Eclipse IDE for JAVA EE Developers
        3.1.3. Visual Paradigm Teamwork Server
   3.2. Methods
        3.2.1. Pair Programming
        3.2.2. Evolutionary Prototyping
   3.3. Standards
        3.3.1. Code Conventions for the Java Programming Language (Sun Microsystems)
        3.3.2. IEEE 1028, Standard for Software Reviews and Audits
        3.3.3. IEEE 730, Standard for Software Quality Assurance Plans
        3.3.4. IEEE 1298, Standard for Software Quality Management Systems
4. Unit Test
   4.1. Tools
        4.1.1. Cactus
        4.1.2. Visual Paradigm Teamwork Server
        4.1.3. Bugzilla
   4.2. Methods
        4.2.1. Bottom-up testing
        4.2.2. Testing with sample data
        4.2.3. Testing with expected and high-volume data sets
   4.3. Standards
        4.3.1. IEEE 1008, Standard for Software Unit Testing
        4.3.2. IEEE 1012, Standard for Software Verification and Validation Plans
        4.3.3. IEEE 829, Standard for Software Test Documentation
5. Integration Test
   5.1. Tools
        5.1.1. Visual Paradigm Teamwork Server
        5.1.2. Bugzilla
   5.2. Methods
        5.2.1. Top-down testing
        5.2.2. White-box testing
   5.3. Standards
        5.3.1. Open Process Framework (OPF) Testing
        5.3.2. IEEE 829, Standard for Software Test Documentation
6. System Test
   6.1. Tools
        6.1.1. Visual Paradigm Teamwork Server
        6.1.2. Bugzilla
   6.2. Methods
        6.2.1. Black-box testing
        6.2.2. Thread testing
        6.2.3. White-box testing
        6.2.4. Limited-scale integration testing
        6.2.5. Full-scale integration testing
        6.2.6. Test plan
        6.2.7. Test execution and issue resolution
        6.2.8. Test documentation
   6.3. Standards
        6.3.1. IEEE 829, Standard for Software Test Documentation
7. Deployment
   7.1. Tools
        7.1.1. Symantec Endpoint Antivirus
        7.1.2. Kaspersky Antivirus 2009
   7.2. Methods
        7.2.1. Beta testing
        7.2.2. Software integration
        7.2.3. Full deployment
        7.2.4. Baseline
   7.3. Standards
        7.3.1. IEEE 828, Standard for Software Configuration Management Plans
        7.3.2. IEEE 1063, Standard for Software User Documentation
8. Maintenance
   8.1. Tools
        8.1.1. Liberum
        8.1.2. CDCAAS
        8.1.3. Bugzilla
   8.2. Methods
        8.2.1. Remote monitoring
        8.2.2. Help desk
        8.2.3. Service Level Agreement (SLA)
        8.2.4. Automatic updates
        8.2.5. Error tracking
        8.2.6. Baselines
        8.2.7. Software integration
        8.2.8. Security practices
        8.2.9. Information Technology Infrastructure Library (ITIL)
   8.3. Standards
        8.3.1. IEEE 1219, Standard for Software Maintenance
        8.3.2. IEEE 1042, Standard for Software Configuration Management
        8.3.3. ISO 17799, Security Standard
        8.3.4. BS 7799, Guidelines for Information Security Risk Management
        8.3.5. ISO 27001, Information Security Management - Specification with Guidance for Use
9. Disposal
   9.1. Tools
        9.1.1. Life Span Technology Recycling
        9.1.2. Retire-IT Computer Disposal
        9.1.3. Intel Students Recycling Used Technology Program
        9.1.4. Computers for Schools Affiliate
   9.2. Methods
        9.2.1. Certified data destruction
        9.2.2. Hard drive erasure
        9.2.3. Equipment inventory, inspection, and testing
        9.2.4. Comprehensive reporting
        9.2.5. Responsible recycling
        9.2.6. Remarketing
   9.3. Standards
        9.3.1. Applicable state and local regulations (e.g., the State of California's Electronic Waste Recycling Act of 2003, any CRT landfill bans)
        9.3.2. International Association of Electronics Recyclers (IAER)
        9.3.3. Electronic Industries Alliance (EIA) Reuse and Recycle
If the document is approved in the review, the documentation specialist assembles the final document by incorporating the review comments. If the document is not approved, the author revises it and submits it for re-review. If the document is to be delivered to the customer, customer reviews will be conducted. If the customer rejects the document, it is returned to the author for revision. If the customer approves the document with comments, the author incorporates those comments. The document is then stored, labeled, and placed under version control, and is officially released for distribution. The following table describes the software documentation necessary to effectively communicate requirements and responsibilities to the entire project team. The entries include the list of activities and tasks, the organization of tasks by execution time, the skills required to produce each document, the personnel assigned to the task, and the document process flow.
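The approval flow just described can be sketched as a small state machine. The state and event names below are invented for illustration; the plan does not prescribe an implementation.

```java
// Illustrative sketch of the document review/release workflow described above.
// States and transitions are our own naming, not taken from the plan.
public class DocumentWorkflow {
    public enum State { DRAFT, PEER_REVIEW, CUSTOMER_REVIEW, RELEASED }

    // Advance one review step: 'approved' is the review outcome,
    // 'customerDeliverable' selects whether a customer review is required.
    public static State next(State s, boolean approved, boolean customerDeliverable) {
        switch (s) {
            case DRAFT:
                return State.PEER_REVIEW;                        // author submits for review
            case PEER_REVIEW:
                if (!approved) return State.DRAFT;               // rejected: author revises, re-review
                return customerDeliverable ? State.CUSTOMER_REVIEW : State.RELEASED;
            case CUSTOMER_REVIEW:
                return approved ? State.RELEASED : State.DRAFT;  // customer reject returns to author
            default:
                return State.RELEASED;                           // released docs stay under version control
        }
    }

    public static void main(String[] args) {
        State s = next(State.DRAFT, true, true);
        s = next(s, true, true);   // approved peer review, customer deliverable
        s = next(s, true, true);   // customer approves
        System.out.println(s);     // RELEASED
    }
}
```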
[Documentation plan table. For each deliverable the plan records the document name, Data Item Description (DID) number, author, page count, time to initiate, approval authority, distribution, publication type, price, and milestone date. Deliverables include the System Design Document (DI-MCCR-80534), Software Requirements Specification (DI-MCCR-80025), Software Project Management Plan, and further design, test, and product-assurance documents identified as DI-MCCR-80027, DI-MCCR-80012, DI-MCCR-80014, DI-MCCR-80013, DI-MCCR-80017, DI-MCCR-80018, DI-MCCR-80019, DI-MCCR-80021, and DI-E-3128. Authors include Marketing, the Software Team, the Project Manager, and Product Assurance; distribution ranges from Senior Corporate Management through Team Leaders and the Customer; publication types are Essential to Project, Essential to Customer, and Company Important; page counts run from 12 to 350 pages, prices from $2,400 to $174,000, and milestone dates from 09/05/2008 to 1/25/2011. Totals: 2,555 pages; $681,150. (Row-level cell alignment was lost in extraction.)]
- Configuration Change Control
The change control process is applied to specified configuration items, components, and baselines under CM control. CM will not accept changes to controlled documents without appropriate documentation identifying the changes to be made and the authorization for those changes. Proposed changes will be documented, analyzed for cost and schedule impact, and, if approved, prioritized and scheduled for implementation. The CCB, chaired by the Chief Scientist, reviews the completed DTI and approves it for closure or rework. Minutes from the CCB meetings are forwarded to the customer for review.
- Configuration Status Accounting
CM will use VSS, along with a Microsoft Excel spreadsheet database, to track and report information about product deliverables and CM activities. Standard reports are issued according to the product schedule and may also be requested on an as-needed basis by management. The generated reports are stored as controlled items but are not under formal change control.
- Configuration Audits
In addition to QA audits of the CM process, the CMA periodically reviews the structure and facilities of the CM repositories and the CM Information System to verify that the contents are correct and complete. CM audits software baselines to verify that they conform to the documentation that defines them. The results of the audits are documented, and any identified action items are recorded and tracked to closure.
- Build Management
Working closely with the Product Technical Lead, the build release engineer develops and maintains the scripts necessary to create automated nightly builds. Records of the builds are maintained in sufficient detail that the status of the system is known at all times and recovery of previous versions is possible.
- Release Management
CM is responsible for the preparation and external release of controlled items in accordance with the approved CM procedures.
Items cannot be externally released without the consent of CM. CM is also responsible for internal releases of items under its control. An internal release makes the item (e.g., documents, software) available to the appropriate product team member for update.
- Environment Maintenance
CM, working with the Network support staff, is responsible for creating and maintaining the product's baselined environments (e.g., development, testing). Change control policies will be applied to all baselined environments.
- Disposition of Data at Close-Out
As permitted by contractual agreements, CM will maintain product materials for purposes of reference and reuse after product closeout.
emailed comments. QA is invited to all formal Customer Reviews and ensures client comments are addressed appropriately and in a timely manner. In the event that further discussion is required, additional meetings or electronic exchanges will be initiated. The CDCAAS team will conduct at least one peer review of the product prior to the required Customer Review.
- Action Item Management
Action items can be generated during any formal or informal meeting (e.g., Peer Review, Customer Review, Status, Product Management Review (PMR), In-Process Review (IPR), Technical Exchange, Staff, All-Hands), and are documented in a designated section of the meeting minutes form.
- Corrective Action Escalation Process
When problems persist and/or corrective actions continue to be inadequate, QA will escalate the issue to the appropriate level of management (e.g., Program Manager, Group QA Manager) via the independent reporting structure. QA has the option of presenting the escalated concerns verbally or in writing. QA will maintain a Corrective Action Escalation Log to track escalated items to closure.
- Final Delivery Inspection
QA, in conjunction with Configuration Management, verifies that the product has successfully completed all development processes and requirements specified; that packaging and delivery preparations have been completed according to procedure; and that the product is complete and accurate, including identification of any open items or product shortages.
WBS ID       Task Name
1            CDCAAS System
1.1          Project Management
1.2          Technical Management
1.3          Quality Assurance
1.4          Requirements Management
1.5          Configuration Management
1.6          Software System
1.6.1        CDCAAS Custom Development
1.6.1.1      Requirements and Configuration Management
1.6.1.1.1    Identify Candidates (Req)
1.6.1.1.2    Preliminary Design
1.6.1.1.3    Detail Design
1.6.1.1.4    Coding and Unit Testing
1.6.1.1.5    Integration Testing
1.6.1.2      Secure Communication
1.6.1.2.1    Identify Candidates (Req)
1.6.1.2.2    Preliminary Design
1.6.1.2.3    Detail Design
1.6.1.2.4    Coding and Unit Testing
1.6.1.2.5    Integration Testing
1.6.1.3      Word Processor
1.6.1.3.1    Identify Candidates (Req)
1.6.1.3.2    Preliminary Design
1.6.1.3.3    Detail Design
1.6.1.3.4    Coding and Unit Testing
1.6.1.3.5    Integration Testing
1.6.1.4      Debugger & Test
1.6.1.4.1    Identify Candidates (Req)
1.6.1.4.2    Preliminary Design
1.6.1.4.3    Detail Design
1.6.1.4.4    Coding and Unit Testing
1.6.1.4.5    Integration Testing
1.6.2        Outsourced Custom Development
1.6.2.1      GPS Navigation
1.6.2.1.1    Identify Candidates (Req)
1.6.2.1.2    Coding and Unit Testing
1.6.2.1.3    Integration Testing
1.6.3        Software Reuse Development
1.6.3.1      Database Management System
1.6.3.1.1    Identify Candidates (Req)
1.6.3.1.2    Design Customization
1.6.3.1.3    Coding and Unit Testing
1.6.3.1.4    Integration Testing
1.6.3.2      Compiler
WBS ID       Task Name
1.6.3.2.1    Identify Candidates (Req)
1.6.3.2.2    Design Customization
1.6.3.2.3    Coding and Unit Testing
1.6.3.2.4    Integration Testing
1.6.3.3      Electronic Inventory and Tracking
1.6.3.3.1    Identify Candidates (Req)
1.6.3.3.2    Design Customization
1.6.3.3.3    Coding and Unit Testing
1.6.3.3.4    Integration Testing
1.6.4        COTS Development
1.6.4.1      Spreadsheet Application
1.6.4.1.1    Identify Candidates (Req)
1.6.4.1.2    Evaluate & Select Candidate COTS
1.6.4.1.3    Design Customization
1.6.4.1.4    Coding and Unit Testing
1.6.4.1.5    Integration Testing
1.6.4.2      Graphics Presentation
1.6.4.2.1    Identify Candidates (Req)
1.6.4.2.2    Evaluate & Select Candidate COTS
1.6.4.2.3    Design Customization
1.6.4.2.4    Coding and Unit Testing
1.6.4.2.5    Integration Testing
1.6.4.3      Project Management
1.6.4.3.1    Identify Candidates (Req)
1.6.4.3.2    Evaluate & Select Candidate COTS
1.6.4.3.3    Design Customization
1.6.4.3.4    Coding and Unit Testing
1.6.4.3.5    Integration Testing
1.7          Hardware
1.8          Kernel
1.9          Documentation
1.10         Software Support Environment
1.11         Alpha Test
1.12         Beta Test
1.13         System Deployment
1.14         Support
Table 28: Work Breakdown Structure
Feature description: The improved Spreadsheet Package will add new features to the existing MS Excel software, allowing better management of reading and writing text files and improved security features.
Activity description: Create and review the detailed design of the integration of the new Spreadsheet into the existing Spreadsheet Application.
Estimated duration: 15 days.
Resources needed:
  Personnel: 1 Team Lead, 1 Requirements Analyst, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer, 1 Component Expert.
  Skills: Visual Basic, MS Access 2003.
  Tools: Spreadsheet App, Visual Basic, Access 2003.
  Travel: None.
Work product: Detailed Design of Spreadsheet Customization.
Risks: Lack of human factors skills.
Predecessors: Requirements and High-Level Design of Spreadsheet Customization.
Completion criteria: Inspection of functionality and approval of the selected product decision document.
IMPLEMENTATION
Personnel assigned: Kelly Romeo, Wanda Briggs, Teddy Brown, Charles Rosenburg, Jim Rosenberg.
Starting date: 12/07/09    Completion date: 01/29/09
Cost (budgeted/actual): $473,887.44
Legacy comments: None
5.2 Dependencies
Package                                            Type                                     KEDSI/KSLOC
CDCAAS Database Management System                  Data Processing                          400
CDCAAS Spreadsheet                                 Data Processing                          70
CDCAAS Requirements and Configuration Management   Data Processing                          350
CDCAAS Secure Communication                        Information Center                       358
CDCAAS Graphics Presentation                       Web                                      198
CDCAAS Word Processor                              Data Processing                          135
CDCAAS Project Management                          Data Processing                          43
CDCAAS GPS Navigation                              Guidance, Navigation & Control System    77
CDCAAS Compiler                                    Tools                                    205
CDCAAS Debugger & Test                             Environment                              125
CDCAAS Electronic Inventory and Tracking           Command & Control                        600

Table 29: Resource Estimate - Method 1
Method 2: Divide the estimated size of application storage by four to obtain the size estimate.

Package                                            Size (MB)    KEDSI/KSLOC
Database Management System                         300          75
Spreadsheet                                        176          44
Requirements and Configuration Management          156          39
Secure Communication                               260          65
Graphics Presentation                              188          47
Word Processor                                     248          62
Project Management                                 120          30
GPS Navigation                                     408          102
CDCAAS Compiler                                    -            -
CDCAAS Debugger & Test                             -            -
CDCAAS Electronic Inventory and Tracking           -            -

Table 30: Resource Estimate - Method 2
Method 3 - Modified Delphi Technique: The Delphi Technique was designed to gather input from participants without requiring them to work face-to-face. Often, the process is used to find consensus among experts who have differing views and perspectives. The Delphi Technique enables group problem-solving using an iterative process of problem definition and discussion, feedback, and revisions. The Modified Delphi Technique described here uses mail or email to gather information, provide feedback, and report conclusions. Here, a panel of three experts with eight years of experience each was asked to give an estimate based on their experience.

Package                                            Expert 1 (KEDSI)   Expert 2 (KEDSI)   Expert 3 (KEDSI)   Average (KEDSI)
CDCAAS Database Management System                  210                350                310                290
CDCAAS Spreadsheet                                 40                 90                 140                90
CDCAAS Requirements and Configuration Management   82                 130                190                134
CDCAAS Secure Communication                        170                150                100                140
CDCAAS Graphics Presentation                       150                110                97                 119
CDCAAS Word Processor                              80                 50                 110                80
CDCAAS Project Management                          110                180                112                134
CDCAAS GPS Navigation                              290                200                200                230
CDCAAS Compiler                                    60                 100                80                 80
CDCAAS Debugger & Test                             70                 80                 140                90
CDCAAS Electronic Inventory and Tracking           140                150                190                160
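The averaging step of the Modified Delphi Technique is simple to check mechanically. The sketch below is illustrative; the sample values are the three expert estimates for the Database Management System package.

```java
// Sketch of the Modified Delphi averaging step: the combined size estimate for
// a package is the mean of the independent expert estimates (in KEDSI).
public class DelphiAverage {
    public static double average(double... estimates) {
        double sum = 0;
        for (double e : estimates) sum += e;   // accumulate expert inputs
        return sum / estimates.length;         // arithmetic mean
    }

    public static void main(String[] args) {
        // Database Management System: experts gave 210, 350, and 310 KEDSI
        System.out.println(average(210, 350, 310)); // 290.0
    }
}
```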
COCOMO: Using the average of the above three estimates (the ISEs) as input to the COCOMO II model, approximations for cost, effort, and schedule have been generated:
Computed Final Size (CFS) in thousands of estimated delivered source instructions (KEDSI) is derived from the ISE through a development-class-based adjustment, resulting in an estimate that more accurately represents the amount of work required by the project team to engineer the package. For example, a custom software package is expected to require more lines of new code than a COTS product. To generate the CFS for a custom package, the ISE is multiplied by 20%. For a reuse/non-developmental item (NDI) package, the ISE is multiplied by 6%. For a COTS package, the ISE is multiplied by 4%, since very little new coding is required. A weighted average is then generated, with the modal figure given four times the weight of the high and low estimates. The formulas are as follows:

COTS: CFS = ISE * 4%
Reuse/non-developmental item: CFS = ISE * 6%
Custom: CFS = ISE * 20%
CFS Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

Computed Effort (CE) in staff months is generated using three different methods: (1) the Constructive Cost Model (COCOMO), (2) the Walston-Felix model, and (3) the Boyston model. Again, the modal result is weighted four times heavier than the high and low values. The formulas are shown below:
COCOMO: Effort = 3.6 * (CFS(expected value) ^ 1.2)
Walston-Felix: Effort = 5.2 * (CFS(expected value) ^ 0.91)
Boyston: Effort = 5.78 + 3.11 * (CFS(expected value) ^ 1.0)
CE Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

Computed Duration (CD) in months is generated using the COCOMO, Walston-Felix, and Boyston models:

COCOMO: Duration = 2.5 * (CE(expected value) ^ 0.32)
Walston-Felix: Duration = 4.1 * (CE(expected value) ^ 0.36)
Boyston: Duration = 2.15 * (CE(expected value) ^ 0.33)
CD Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

To meet scheduling targets, some packages will require extra effort. A schedule compression value is applied to the Computed Duration, which in turn generates the Effort Adjustment Factor (EAF), the Adjusted Effort (AE) in staff months, the productivity in EDSI per staff month, and the average staff for each software package. The required calculations are listed below:

Schedule Compression Value: %SCV = a number less than or equal to 14
Compressed Duration (months): CD = (1 - %SCV / 100) * Duration(expected value)
Effort Adjustment Factor: EAF = %SCV / 100 + 1
Adjusted Effort (staff months): AE = CE(expected value) * EAF
Productivity (EDSI/staff month): P = CFS(expected value) / AE
Average Staff: AS = AE / CD
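The size-to-staffing chain described above can be sketched in code. The model coefficients below are those quoted by the plan (including the "Boyston" model as named there); the numeric inputs in main are illustrative only.

```java
// Sketch of the estimation chain: three-point expected size -> effort models ->
// duration model -> schedule compression. Coefficients are quoted from the plan.
public class EstimatePipeline {
    // Weighted expected value: the modal figure gets four times the weight
    // of the optimistic and pessimistic figures.
    public static double expected(double opt, double mode, double pess) {
        return (opt + 4 * mode + pess) / 6.0;
    }

    // Effort models (staff months), applied to expected CFS in KEDSI
    public static double cocomoEffort(double cfs)       { return 3.6 * Math.pow(cfs, 1.2); }
    public static double walstonFelixEffort(double cfs) { return 5.2 * Math.pow(cfs, 0.91); }
    public static double boystonEffort(double cfs)      { return 5.78 + 3.11 * cfs; }

    // Duration model (months), applied to expected effort
    public static double cocomoDuration(double ce)      { return 2.5 * Math.pow(ce, 0.32); }

    // Schedule compression: %SCV <= 14 shortens duration and inflates effort
    public static double compressedDuration(double scv, double duration) {
        return (1 - scv / 100.0) * duration;
    }
    public static double adjustedEffort(double scv, double effort) {
        return effort * (scv / 100.0 + 1);
    }

    public static void main(String[] args) {
        double cfs = expected(40, 58, 80);   // illustrative KEDSI figures
        double ce  = cocomoEffort(cfs);
        double cd  = cocomoDuration(ce);
        System.out.printf("CFS=%.1f CE=%.1f CD=%.1f%n", cfs, ce, cd);
    }
}
```

In a full spreadsheet the effort and duration steps would each be run through all three models and recombined with the same 1-4-1 weighting.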
A spreadsheet detailing the entire series of calculations is attached as Appendix I. A spreadsheet which allows the spreadsheet formulas to be examined is attached as Appendix II.
WBS ID       Task Name                                  Fixed Cost
1.6.3.1      Database Management System                 $2,985,839.13
1.6.3.1.1    Identify Candidates (Req)                  $358,300.63
1.6.3.1.2    Design Customization                       $716,601.36
1.6.3.1.3    Coding and Unit Testing                    $1,167,685.49
1.6.3.1.4    Integration Testing                        $743,251.65
1.6.3.2      Compiler                                   $1,315,009.16
1.6.3.2.1    Identify Candidates (Req)                  $157,801.10
1.6.3.2.2    Design Customization                       $315,602.20
1.6.3.2.3    Coding and Unit Testing                    $512,853.57
1.6.3.2.4    Integration Testing                        $328,752.29
1.6.3.3      Electronic Inventory and Tracking          $2,449,435.62
1.6.3.3.1    Identify Candidates (Req)                  $293,932.27
1.6.3.3.2    Design Customization                       $587,864.55
1.6.3.3.3    Coding and Unit Testing                    $955,279.89
1.6.3.3.4    Integration Testing                        $612,358.91
1.6.4        COTS Development                           $3,947,800.30
1.6.4.1      Spreadsheet Application                    $1,316,356.25
1.6.4.1.1    Identify Candidates (Req)                  $210,619.81
1.6.4.1.2    Evaluate & Select Candidate COTS           $157,962.00
1.6.4.1.3    Design Customization                       $315,925.00
1.6.4.1.4    Coding and Unit Testing                    $473,887.44
1.6.4.1.5    Integration Testing                        $157,962.00
1.6.4.2      Graphics Presentation                      $1,315,393.59
1.6.4.2.1    Identify Candidates (Req)                  $78,923.62
1.6.4.2.2    Evaluate & Select Candidate COTS           $236,770.85
1.6.4.2.3    Design Customization                       $315,694.46
1.6.4.2.4    Coding and Unit Testing                    $552,465.31
1.6.4.2.5    Integration Testing                        $131,539.36
1.6.4.3      Project Management                         $1,316,050.46
1.6.4.3.1    Identify Candidates (Req)                  $78,963.03
1.6.4.3.2    Evaluate & Select Candidate COTS           $236,889.08
1.6.4.3.3    Design Customization                       $315,852.11
1.6.4.3.4    Coding and Unit Testing                    $552,741.19
1.6.4.3.5    Integration Testing                        $131,605.05
1.7          Hardware                                   $1,050,000.00
1.8          Kernel                                     $8,610,540.45
1.9          Documentation                              $681,150.00
1.10         Software Support Environment               $128,517.84
1.11         Alpha Test                                 $208,063.00
1.12         Beta Test                                  $160,178.00
1.13         System Deployment                          -
1.14         Support                                    -
5.5 Schedule
Figure 92: CDCAAS Electronic Inventory & Tracking/Custom Development Detailed Schedule
Additional Components
1.1. Subcontracting Process
Our approach in the subcontracting process is based upon:
- Creating a Management Integrated Product Team (IPT) that includes subcontract personnel to provide a streamlined, timely response to all Task Orders
- Assigning a distinct Task Order IPT for each task order assigned to a subcontractor and implemented under this program
- Identifying risk areas, with subcontractors, at the beginning or in the early stages of individual task orders; implementing risk mitigation procedures to minimize the risk; and inspecting work in process to assure mitigation efforts are adequate
- Providing timely corrective actions for problem areas and monitoring those actions to prevent recurrence
- Using lessons learned from other programs to maximize the potential for success of each task order
Core Subcontractors: These are top-notch providers of training and simulation products and services to Government or Industry, with in-place quality and performance standards, having
Vendors: Quality vendors of COTS training and simulation products and/or services that are directly applicable to the CDCAAS team. Once all proposed subcontractor candidates had been identified and evaluated, the core candidates were rated based upon their core technical capability, past performance, and strengths as compared to the supplemental capability needed by the CDCAAS Program. Systems Engineering and Applied Management, Hardware/Software Development and Integration, Instruction, and Courseware experience were major elements of the technical capabilities evaluation decision.
Nondisclosure Agreements, which protect the exchange of sensitive information and expedite the technical interaction that is required at this phase, were issued. Upon selection of the core candidates, a two-tier Integrated Product Team (IPT) organization was established to respond to the CDCAAS requirements and manage the CDCAAS Program after contract award.
- Have the required technical capability.
- Have strong management in place, with the necessary experience and tools to effectively manage their specialty area.
- Have existing quality processes and procedures that are ISO compliant.
- Have excellent past performance in the relevant specialty area.
- Be responsive.
Once the Task Order IPT has completed its evaluation, it recommends to the TSAT Management IPT that the newly approved subcontractor/vendor be added to the team for the new Task Order.
Traffic encryption
To prevent sensitive data from being intercepted as it travels over the wire, traffic must be encrypted. Secure Sockets Layer (SSL) has become the most common method for creating an encrypted connection between client and server, and for authenticating both the server and client machines. There are three communication protocols between the Collaborative Services and back-end servers that need to be configured for SSL: HTTP, DIIOP and LDAP.
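Before SSL is configured on the HTTP, DIIOP, and LDAP connections, it can be useful to confirm which TLS protocol versions the runtime itself supports. This JVM-side check is only an illustration; it is not the server configuration the plan describes.

```java
import javax.net.ssl.SSLContext;

// List the SSL/TLS protocol versions supported by the JVM's default
// key and trust managers. A deployment would still need to enable SSL
// on each server-side protocol (HTTP, DIIOP, LDAP) separately.
public class TlsSupport {
    public static String[] supportedProtocols() throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, null, null); // null = default key/trust managers and RNG
        return ctx.getSupportedSSLParameters().getProtocols();
    }

    public static void main(String[] args) throws Exception {
        for (String p : supportedProtocols()) System.out.println(p);
    }
}
```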
User authentication
Obviously, we want to make sure that only authorized people are accessing our systems. Standard authentication methods include user ID and password, SSL certificates exchanged between client and server, and a user's listing in a corporate directory using LDAP, the industry standard for Internet and intranet-based directories.
SSL can be implemented via signed certificates from a certificate authority such as VeriSign. To enable SSL in CDCAAS, we have to configure SSL for the HTTP server, as well as for the Application Server plug-in for the Web server. SSL must also be enabled for the LDAP connections in Application Server. Application Server has additional authentication tools, including the Credential Vault for use by portlets that need to access back-end systems. The Credential Vault provides a place for portlets to access user credentials, such as passwords and SSL certificates, after the user has logged on, providing single sign-on for the user.
Function authorization
Not every user should have access to every resource or function. Good security requires that different user groups be granted different levels of access to corporate systems. Application Server accomplishes this via the J2EE security mechanisms, which include security roles. Developers can create generic security roles for various departments or types of employees, and grant those roles access to the specific resources and functions they require. For instance, accounting and human resources employees may be given the ability to run a payroll query, whereas the marketing group is not. Generic roles can then be mapped to actual users later.
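The role-to-function idea can be sketched in plain Java. This is a conceptual illustration only, not the actual Application Server or J2EE API; the role and function names are hypothetical examples:

```java
import java.util.Map;
import java.util.Set;

// Conceptual sketch of role-based function authorization: roles are granted
// functions, and users are mapped to roles separately (hypothetical names).
public class RoleAuthorization {
    // Role -> functions that role may invoke.
    private static final Map<String, Set<String>> ROLE_FUNCTIONS = Map.of(
            "accounting",     Set.of("runPayrollQuery", "viewInvoices"),
            "humanResources", Set.of("runPayrollQuery"),
            "marketing",      Set.of("viewCampaigns"));

    public static boolean isAllowed(String role, String function) {
        // Unknown roles get an empty permission set, so access is denied.
        return ROLE_FUNCTIONS.getOrDefault(role, Set.of()).contains(function);
    }
}
```

In a real J2EE deployment the same mapping is declared in deployment descriptors rather than code, which is what lets roles be bound to actual users at deployment time.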
An MIS Goal: 60 Days
Test Plan
To know what to test, we need a Test Plan. Usually this follows the functions defined in the Specifications. We will use a spreadsheet to build the Test Plan. The columns of the Test Plan are:
ID: The feature number assigned in the Specifications.
Description: A description of what the feature is supposed to do.
Build #: The build number that the feature was implemented in.
Developer: The name of the developer of the feature.
Tested: A checkbox that the tester checks to show that the feature has been tested.
Pass/Fail: Whether the feature passed testing and works as described.
Notes: A brief summary by the tester of why a feature did not pass.
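The Test Plan columns could be modeled as a simple Java class; the field names and the sample feature used in practice below are our own illustration, not part of the plan:

```java
// Illustrative sketch of one Test Plan row, mirroring the columns above.
public class TestPlanRow {
    String id;           // feature number assigned in the Specifications
    String description;  // what the feature is supposed to do
    int buildNumber;     // build the feature was implemented in
    String developer;    // developer of the feature
    boolean tested;      // set by the tester once the feature has been tested
    Boolean passFail;    // null until tested; then true (pass) or false (fail)
    String notes;        // brief reason a feature did not pass

    TestPlanRow(String id, String description, int buildNumber, String developer) {
        this.id = id;
        this.description = description;
        this.buildNumber = buildNumber;
        this.developer = developer;
    }

    void markTested(boolean passed, String notes) {
        this.tested = true;
        this.passFail = passed;
        this.notes = notes;
    }
}
```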
Using this spreadsheet, we will then be able to sit down with the tester(s) and developer(s) and review the status of the development phase. This provides a good indicator for determining whether the project is on schedule. Regarding the Build #, each build of the CDCAAS application that is released for testing should increment the Build #. In conjunction with the Test Plan, we use an Issue Tracking System. This provides a shared environment where the developers, testers, managers, and even the clients can enter and track the progress of Issues (bugs, requests, etc.) related to the project. When reporting Issues, we use a Severity Category to indicate how critical an Issue is. The general categories are:
Category 1: The issue causes a severe crash of the application, a crash to desktop, an application freeze, or corruption of data. Nothing should be delivered to the client with Category 1 Issues.
Category 2: The issue causes undesired functionality or application errors. This can also include data miscalculation (not corrupted, just not correct). A decision will be made case by case whether to deliver the application with a Category 2 Issue; perhaps it is an error that requires a sequence of steps that will rarely, if ever, occur in the production environment.
Category 3: The issue is a suggestion, a minor UI change, or an issue that does not affect the functionality of the application.
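The three categories and the delivery rule above can be captured directly in code; this is a sketch in Java with names of our choosing, not a mandated implementation:

```java
// Sketch: the three Severity Categories as an enum, with the delivery rule
// from the text (nothing ships to the client with Category 1 Issues open).
public class IssueSeverity {
    enum Severity {
        CATEGORY_1, // severe crash, freeze, or data corruption - blocks delivery
        CATEGORY_2, // undesired functionality or miscalculation - case-by-case
        CATEGORY_3; // suggestion or minor UI change - never blocks delivery

        boolean blocksDelivery() {
            return this == CATEGORY_1;
        }
    }
}
```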
Alpha Complete
This is a major milestone during the development process. It signals that all of the targeted features and functionality have been implemented. They may not yet work correctly, but the code is in place.
Beta Complete
When all parties are comfortable with the status of all the features of the application, and have agreed on any minor anomalies that might still be in the application, we will have reached the Beta Complete phase of the project. At this point we can put the finishing touches on the project - Documentation, Implementation, Conversion, Training, and Post Implementation Support.
Index
A Acronyms, 18, 19, 22 advance orders, iii Alpha Test, 19, 32, 116, 161 analysis, v, 10, 22, 71, 94, 96, 104 Application Software Cost, vii Assumptions, v, vi, 18, 62, 63, 64 B baseline, 2, 28, 29, 31, 32, 71, 76, 106, 110 Baseline, 19, 28, 30, 31, 71, 101 Baselines, 30, 31, 101, 106 beta tests, ix budget, 57, 58, 60, 61, 63, 64, 67, 69, 70, 75 Budget, 15, 18, 69, 70, 78, 111, 160 budgets, 15 C Capability Maturity Model, 20, 23, 59, 61, 63, 64 COCOMO, vii, 18, 19, 20, 22, 23, 24, 25, 26, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 67, 70, 149, 156, 157 Code Review, 71 Commercial-Off-The-Shelf, ix Compile Link and Runtime Package, 12 Compiler, vii, x, 1, 16, 54, 73, 82, 115, 148, 149, 161 Configuration Management, 1, 11, 16, 20, 23, 41, 43, 73, 101, 106, 109, 115, 160, 1 Constraints, v, vi, 14, 18, 62, 63, 64, 65 Corrective Action, 72, 73, 109 cost, iv, v, ix, 20, 57, 58, 67, 68, 69, 70, 74, 77, 107, 108, 149 COST, vii COTS, vi, ix, 1, 15, 16, 17, 18, 20, 21, 23, 28, 29, 37, 59, 61, 63, 64, 65, 66, 73, 94, 96, 99, 116, 156, 161, 1 Critical Design Review, 30, 31, 73 Critical Path Method, 12 D database, 10, 11, 30, 94, 96, 2 Database Management System, vii, x, 1, 15, 23, 46, 73, 115, 148, 149, 161 databases, v, 10, 11, 14, 15 Debugger & Test, vii, x, 2, 74, 115, 148, 149, 160 Definitions, 18, 19 Deliverables, v, vi Dependencies, ii, 18, 62, 63, 64, 119 Deployment and Support, 29 Design Walk-through, 70 detailed design, ix, 31, 117, 118, 1, 3, 4, 5, 6, 7, 8, 9 DEVELOPMENT ENVIRONMENT, 13 Documentation, 15, 17, 23, 30, 41, 43, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 72, 100, 101, 116, 161, 179 E Earned Value, 23, 70, 72, 75, 76 Electronic Inventory & Tracking, vii, 74, 9 Electronic Inventory and Tracking, x, 2, 11, 12, 16, 56, 116, 148, 149, 161 Electronic Inventory and Tracking Package, 11, 12, 56 encryption, 22, 176 Estimated duration, 117, 118, 1, 2, 3, 4, 5, 6, 7, 8, 9 Evolution of the Software Project Management Plan, 18 External Reviews, 71 F
Federal Bureau of Investigation, ix
G Gantt charts, 12 GLOBAL POSITIONING SYSTEM GPS, 6 goals, iii, v, 57, 58, 59, 61, 80, 108 Goals, 57, 58, 59, 61 GPS Navigation, vii, x, 1, 11, 12, 16, 53, 63, 65, 73, 115, 148, 149, 160 GRAPHIC PRESENTATION, 3 Graphics Presentation, vii, x, 1, 11, 16, 73, 116, 148, 149, 161 Graphics Presentation Package, 11 H hardware, ix, 12, 14, 15, 16, 20, 21, 32, 43, 60, 61, 63, 64, 73, 96 Hardware, vii, viii, ix, 7, 13, 24, 116, 161 I Initial Cost Estimate, viii INITIAL COST ESTIMATE, vii inspection, 71, 72 Inspection, 102, 109, 117, 118, 1, 2, 3, 4, 5, 6, 7, 8, 9 inspections, 73, 74 Inspections, 71 integration testing, 29, 63, 65, 94, 110 Integration Testing, 20, 101, 115, 116, 160, 161 Interpol, ix Ivan Industries, vi, ix, 1, 16, 37, 69, 76 J Java, 12, 94, 96, 100, 1, 2, 4, 7, 8 JAVA, 94, 100 K Ken Nidiffer, i Kernel, vii, 21, 63, 64, 66, 116, 161 L life cycle, 20, 30, 59, 61, 70, 94, 96, 97, 99 Life Cycle, 21, 25, 76
M Management Objectives, 18, 57, 58 Management Priorities, 60, 61 management process, 42 management processes, 21 measures, 31 Measures, 76, 79 methods, ix, 79, 99, 146, 156 Methods, 18, 99, 100, 101, 102 Milestone, 21 milestones, 27, 68, 69, 74, 76 Milestones, 27 monitoring, 14, 30, 65, 67, 71, 101 Monitoring, 18, 67 N Navigation Package, 11, 12, 53 O Obtaining, 79 Organization Boundaries, 40 Organizational Structure, 18, 33 Outsource, 18, 21 outsourced, vi, ix, 18, 28, 29, 37 Outsourced, 1, 16, 63, 64, 73, 99, 115, 160 P peer review, 27, 108, 109 Peer Review, 21, 109 Peer Reviews, 108 Phase out, 30 Phase Out, 33 preliminary design, 20, 28 Preliminary Design, 21, 28, 30, 31, 73, 109, 115, 160 PRIORITIES AND CONSTRAINTS, 14 Process Model, ii, 18, 27 Product Evaluations, 72 PRODUCT OVERVIEW, 10 PRODUCT SUMMARY, 10 productivity, 67, 68, 72, 73, 75 Productivity, ii, 25, 72, 73, 74, 77, 78, 147, 157
Program Evaluation and Review Technique, 12 progress, 14, 31, 67, 68, 70, 72, 74, 75, 76, 95 Progress, 68, 72, 74, 75, 77, 78 PROJECT DELIVERABLES, 15 Project Description, ii, 1 Project Management, i, iii, iv, vii, x, 1, 11, 16, 18, 19, 25, 26, 28, 52, 73, 104, 115, 116, 148, 149, 160, 161 PROJECT MANAGEMENT, 5 Project Organization, 27, 36 Project Overview, ii, 1 Project Progress, 75 Project Responsibilities, 18, 42 Project Sponsor, v Project Support Functions, 18, 106 Q quality, iii, v, 42, 57, 58, 63, 64, 65, 67, 68, 78, 108 Quality, 21, 25, 26, 37, 41, 44, 70, 71, 72, 77, 78, 79, 100, 106, 108, 115, 160 Quality Assurance plan, 71 R Reqm'ts & Config. Mgmt., vii requirements, iii, ix, 15, 18, 21, 22, 23, 27, 28, 30, 32, 42, 43, 60, 62, 67, 68, 72, 77, 94, 95, 96, 97, 103, 109, 110, 146 Requirements, vi, x, 1, 11, 16, 17, 18, 21, 23, 25, 27, 28, 30, 31, 42, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 73, 74, 76, 99, 103, 115, 117, 118, 133, 148, 149, 160, 1, 3, 4, 5 Responsibility Matrix, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56 Retaining, 79 reusable, vi, ix, 96 reuse, 18, 22, 23, 26, 28, 29, 59, 61, 108, 156 Reuse, 1, 16, 74, 99, 100, 102, 115, 156, 160
risk, 10, 14, 15, 57, 58, 65, 66, 67, 108 Risk, 14, 18, 19, 25, 65, 66, 67, 102 S scope, iv, 21, 42, 68 Scope, 15, 21, 67 SCOPE, ix Secure Communication, vii, x, 1, 11, 16, 73, 115, 148, 149, 160 Secure Communication Package, 11 software documentation, 103 Software Documentation, 18, 102 Software Requirement Specification Review, 30 Spreadsheet, vii, x, 1, 11, 15, 47, 73, 78, 116, 117, 118, 148, 149, 1 Spreadsheet Package, 11, 47, 118 Staffing, vi, 18, 79, 81, 82, 83, 84, 86, 87, 88, 89, 91, 93 Staffing Plan, 18, 79 support environment, 95 Susan Smith, v system testing, 43, 74, 110 System Testing, 22, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56 System XYZ, iii, v, vi, viii, ix, 19, 63, 65 T Technical Process, 99 techniques, 67, 80 Techniques, 18, 72, 99 time, iv, ix, 1, 2, 10, 11, 12, 13, 15, 20, 57, 58, 59, 61, 63, 64, 65, 67, 68, 70, 72, 74, 75, 76, 77, 78, 79, 103, 110 tools, 57, 58, 59, 61, 63, 64, 66, 94, 99, 4 Total Effort, vii training, vi, ix, 1, 16, 29, 32, 43, 60, 62, 63, 64, 66, 79, 94, 96, 97 Training, 17, 41, 43, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 79, 98, 177, 179
U unit testing, ix, 31, 42, 74, 94 Unit testing, 43 Unit Testing, 22, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 100, 115, 116, 160, 161 User authentication, 176 V Validation Plans, 100 Verification, 26, 72, 100, 106, 109
W WORD PROCESSING, 4 Word Processing Package, 11 Word Processor, vii, x, 1, 16, 51, 73, 115, 148, 149, 160 work breakdown structure, 22, 72 Work Breakdown Structure, 22, 26, 111 Work breakdown Structures, 12 Work Package Specifications, 117 work packages, 68, 70, 72 Work packages, 78 Work Packages, 18, 111
Appendix I
Detailed Resource Estimate Spreadsheet (23 columns plus intermediate calculations) attached.
CDCAAS Detailed Resource Estimate, 9/22/08. The 23-column spreadsheet (three pages in the source) is consolidated below. Class codes: 1 = custom (initial size x 0.2), 2 = reuse (x 0.06), 3 = COTS (x 0.04). Sizes were estimated by three methods and combined with the PERT weighting (4 x middle + max + min) / 6. Effort models: COCOMO E = 3.6 x S^1.2, Walston-Felix E = 5.2 x S^0.91, Boyston E = 5.78 + 3.11 x S (S in KEDSI, E in staff months), again combined by PERT. Duration models: D = 2.5 x E^0.32 (COCOMO), D = 4.1 x E^0.36 (Walston-Felix), D = 2.15 x E^0.33 (Boyston). Schedule compression against the 18.2-month maximum-duration target is capped at 14%; EAF = 1 + compression/100; adjusted effort = EAF x expected effort; productivity = (expected KEDSI x 1000) / adjusted effort; average staff = adjusted effort / compressed duration.

#  | Package Name                    | Initial KEDSI | Class | Expected KEDSI | Expected Effort (SM) | Expected Duration (mo) | Compression % | Compressed Duration (mo) | Adjusted Effort (SM) | Productivity (EDSI/SM) | Avg Staff
1  | Database Management System      | 400   | 2 | 16.35  | 70.67  | 15.75 | 0  | 15.75 | 70.67  | 231.36 | 4.49
2  | Spreadsheet                     | 70    | 3 | 2.76   | 13.16  | 8.70  | 0  | 8.70  | 13.16  | 209.80 | 1.51
3  | Reqm'ts & Config. Mgmt.         | 350   | 1 | 30.83  | 132.18 | 19.64 | 12 | 17.29 | 148.04 | 208.27 | 8.56
4  | Secure Communication            | 358   | 1 | 32.77  | 140.43 | 20.07 | 14 | 17.26 | 160.09 | 204.68 | 9.28
5  | Graphics Presentation           | 198   | 3 | 4.81   | 21.87  | 10.41 | 0  | 10.41 | 21.87  | 219.78 | 2.10
6  | Word Processor                  | 135   | 1 | 17.23  | 74.41  | 16.04 | 0  | 16.04 | 74.41  | 231.61 | 4.64
7  | Project Management              | 43    | 3 | 2.24   | 10.93  | 8.15  | 0  | 8.15  | 10.93  | 205.03 | 1.34
8  | GPS Navigation                  | 77    | 1 | 23.83  | 102.39 | 17.95 | 4  | 17.23 | 106.48 | 223.82 | 6.18
9  | Compiler                        | 205   | 2 | 5.48   | 24.72  | 10.87 | 0  | 10.87 | 24.72  | 221.64 | 2.27
10 | Debugger & Test                 | 125   | 1 | 18.00  | 77.65  | 16.28 | 0  | 16.28 | 77.65  | 231.80 | 4.77
11 | Electronic Inventory & Tracking | 600   | 2 | 13.35  | 57.99  | 14.69 | 0  | 14.69 | 57.99  | 230.23 | 3.95
   | Total                           | 2,561 |   | 167.65 | 726.39 |       |    |       |        |        |

For comparison, the WITS report sizes total 233.54 KEDSI against the computed 167.65 KEDSI.
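The spreadsheet arithmetic for the Database Management System row can be recomputed directly from the model constants given in Appendix II; a minimal Java check (our code, using only values from the spreadsheet):

```java
// Worked check of the resource estimate for the Database Management System row.
public class EstimateCheck {
    // PERT expected value of a (middle, max, min) triple.
    static double pert(double mid, double max, double min) {
        return (4 * mid + max + min) / 6;
    }

    public static void main(String[] args) {
        // Adjusted sizes (KEDSI) from the three estimation methods.
        double size = pert(17.40, 24.00, 4.50);          // ~16.35 KEDSI

        double cocomo  = 3.6 * Math.pow(size, 1.2);      // ~102.93 staff months
        double walston = 5.2 * Math.pow(size, 0.91);     // ~66.12 staff months
        double boyston = 5.78 + 3.11 * size;             // ~56.63 staff months

        // Walston-Felix is the middle estimate for this row.
        double effort = pert(walston, cocomo, boyston);  // ~70.67 staff months
        System.out.printf("size=%.2f effort=%.2f%n", size, effort);
    }
}
```

The results match the spreadsheet's expected size (16.35 KEDSI) and expected effort (70.67 staff months) for row 1.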
Appendix II
Detailed Resource Estimate Spreadsheet with formulas revealed, attached.
CDCAAS, 9/22/08. Formula summary, consolidated from the eight spreadsheet pages in the source. Rows 4-14 correspond to the eleven packages; every row of a column uses the same formula pattern, shown here for row 4:

Cols 3-5 (initial sizes in KEDSI, estimation methods 1-3): entered values; totals =SUM(C4:C14), =SUM(D4:D14), =SUM(E4:E14).
Col 5a (development class code): 1 = custom, 2 = reuse, 3 = COTS; allocation counts via =COUNTIF(F$4:$F14, code).
Cols 6-8 (adjusted KEDSI for the class of development): =IF($F4=1, C4*0.2, IF($F4=2, C4*0.06, C4*0.04)), applied to columns C, D and E respectively.
Cols 9aCalc/9bCalc/9cCalc (Max/Middle/Min adjusted size): =MAX(H4:J4); =SUM($H4:$J4)-$K4-$M4; =MIN(H4:J4).
Col 9 (computed expected size): PERT value =(L4*4+K4+M4)/6, selectable against the WITS value via =CHOOSE(N$23, (L4*4+K4+M4)/6, O4).
Col 9wits (WITS report size for comparison); difference column =N4-O4.
Cols 10-12 (effort in staff months): =a + b*Size^c, with (a, b, c) = (0, 3.6, 1.2) for COCOMO, (0, 5.2, 0.91) for Walston-Felix, and (5.78, 3.11, 1) for Boyston.
Col 13 (computed expected effort): PERT value =(U4*4+T4+V4)/6 over the max/middle/min of the three model estimates.
Cols 14-16 (duration in months): =a + b*Effort^c, with (a, b, c) = (0, 2.5, 0.32) for COCOMO, (0, 4.1, 0.36) for Walston-Felix, and (0, 2.15, 0.33) for Boyston.
Col 17 (computed expected duration): PERT value =(Y4*4+X4+Z4)/6.
Col 18A (reference compression %): =ROUNDUP(MAX(0, (1-($AB$18/$AA4))*100), 0), with Maximum Duration = 18.2 months; for reference only.
Col 18 (applied schedule compression %): cannot be greater than 14%.
Col 19 (compressed duration): =(1-AC4/100)*AA4, i.e. Duration = (1 - %compression/100) * (expected duration).
Col 20 (effort adjustment factor, EAF): =AC4/100+1.
Col 21 (adjusted effort, AE, staff months): =AE4*W4, i.e. EAF times expected effort.
Col 22 (productivity, EDSI per staff month): =(N4*10^3)/AF4, i.e. expected size over adjusted effort.
Col 23 (average staff): =AF4/AD4, i.e. adjusted effort over compressed duration.
Appendix III
Work Package Specifications

CONFIGURATION MANAGEMENT
WORK PACKAGE SPECIFICATION: Configuration Management App - Detailed Design - Wizards
Activity number: 1.6.1.1.3
Activity name: Configuration Management - Detailed Design - Wizards
Feature description: New features will be added to the existing MS Visual SourceSafe software to enhance reliability, scalability, performance, and remote-access capabilities using .NET Primary Interop Assemblies (PIA).
Activity Description: Create and review the detailed design of the Configuration Management wizards software.
Estimated duration: 12 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 3 Programmers, 1 QA Engineer, 1 COTS Expert
Skills: Visual Basic, Java
Tools: CM App, Visual Basic
Travel: None
Work Product: Detailed design of the Configuration Management wizards
Risks: Lack of human factors skills
Predecessors: Requirements and high-level design of the Configuration Management wizards
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Vicky Donofrio, Noel Kidman, Kate Rosenberg, Rizzo Rat, Hanuman Singh
Starting date: 12/24/08    Completion date: 04/14/09
Cost (budgeted/actual): $1,339,346.60
Legacy comments:
COMMUNICATION
WORK PACKAGE SPECIFICATION: Communication - Detailed Design
Activity number: 1.6.1.2.3
Activity name: Customer Database Link - Application Function
Feature description: Display customer-related information from the centralized database when a user calls in.
Activity Description: Create functions to store and display customer information in/from the database.
Estimated duration: 4 days
Resources needed:
Personnel: 1 Team Leader, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 Tester
Skills: Visual Basic, GUI development, Java, C/C++, Access
Tools: GUI builder, MS Office, MS Visual Basic
Travel: None
Work Product: Detailed functions and classes for the Communication application
Risks: Lack of human factors skills
Predecessors: Detailed design of application complete, user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Scooter, Greenburg, Jae Motor, Mary King, Jerry
Starting date: 12/24/08    Completion date: 04/14/09
Cost (budgeted/actual): $1,423,980.00
Legacy comments:
GRAPHIC PRESENTATION
WORK PACKAGE SPECIFICATION: Graphic Representation App - Detailed Design - Presentations
Activity number: 1.6.4.2.3
Activity name: Graphic Representation - Detailed Design - Presentations
Feature description: Extensions to standard Microsoft PowerPoint will include new features such as extracting embedded PowerPoint slides from Word documents, improvements to the animation engine, references to ActiveX controls, countdown timers, and quizzes.
Activity Description: Create and review the detailed design of the PowerPoint presentation enhancements.
Estimated duration: 15 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer, 1 Component Expert
Skills: Visual Basic, multimedia tools
Tools: Graphic Representation App, WBS Chart Pro, Visual Basic
Travel: None
Work Product: Detailed design of Graphic Representation - PowerPoint presentation
Risks: Lack of human factors skills
Predecessors: Requirements and high-level design of the PowerPoint presentation features
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Pumpkin Chattaroy, Road Romeo, Street Caesar, Jerry, Tom
Starting date: 12/15/09    Completion date: 03/08/10
Cost (budgeted/actual): $315,694.46
Legacy comments:
WORD PROCESSING
WORK PACKAGE SPECIFICATION: Word Processing Application - Detailed Design - HTML
Activity number: 1.6.1.3.3
Activity name: Word Processing - Detailed Design - HTML
Feature description: Standard Microsoft Word will be integrated with Semantic Word to offer features such as an automatic information extraction system with tools for refining and augmenting its output, customizable tools for simultaneous generation of content and semantic annotations, and an annotation scheme that allows annotations to be reused when content is reused.
Activity Description: Create and review the detailed design of Word Processing with HTML.
Estimated duration: 9 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 3 Programmers, 1 QA Engineer, 1 Component Expert
Skills: Visual Basic, HTML, Java
Tools: HTML, Visual Basic
Travel: None
Work Product: Detailed design of Word Processing with HTML
Risks: Lack of human factors skills
Predecessors: Requirements and high-level design of Word Processing with HTML
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Sam Eagle, Stacey Vault, Carol Monkey, Sonica Eagle, Charles Rosenburg
Starting date: 12/15/08    Completion date: 03/06/09
Cost (budgeted/actual): $754,742.35
Legacy comments:
PROJECT MANAGEMENT
WORK PACKAGE SPECIFICATION: PM App - Detailed Design - WBS Integration
Activity number: 1.6.4.3.3
Activity name: PM - Detailed Design - WBS Integration
Feature description: Existing MS Project software will be enhanced by integrating WBS Chart Pro to add capabilities such as creating work breakdown structures.
Activity Description: Create and review the detailed design of the WBS Chart Pro integration into the existing Project Management App.
Estimated duration: 20 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer
Skills: Visual Basic, WBS Chart Pro
Tools: PM App, WBS Chart Pro, Visual Basic
Travel: None
Work Product: Detailed design of the WBS Chart Pro integration
Risks: Lack of human factors skills
Predecessors: Requirements and high-level design of the WBS integration
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Miss Piggy, Waldorf Bird, Miss Queen, Kate Nelson, Wanda Briggs
Starting date: 12/15/09    Completion date: 02/08/10
Cost (budgeted/actual): $315,852.11
Legacy comments:
COMPILE/LINK/RUNTIME
WORK PACKAGE SPECIFICATION: COMP Detailed Design
Activity number: 1.6.3.2.2
Activity name: COMP Detailed Design
Feature description: Extend the user interface to allow the user to compile Java code.
Activity Description: Create the detailed design of the user-interface function to integrate and compile Java code.
Estimated duration: 10 days
Resources needed:
Personnel: 1 Team Leader, 1 Requirements Analyst, 1 System Engineer, 1 Senior Software Engineer, 2 Programmers
Skills: GUI development, Java
Tools: GUI builder, Java
Travel: None
Work Product: Detailed design of the Java compilation user interface
Risks: Lack of human factors skills
Predecessors: Detailed design of the compilation user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Neil Armstrong, Jeff Marcus, Mickey Mouse, Teddy Brown
Starting date: 05/18/09    Completion date: 08/07/09
Cost (budgeted/actual): $315,602.20
Legacy comments:
Team Crime Busters consists of, from left to right, Brian H. Park, John Kraus, Sonal Verma and Akhil Pathania.