Eight Steps To Measure ADM Vendor Deliverables
Contents

I. Introduction
II. Transform Structural Quality Review from an Art to a Science
III. Leveraging SAM in Outsourcing - 8 Steps
IV. Identify the ideal operating scenario
V. Select a SAM solution that meets business needs
VI. Conclusion

I. Introduction
To meet high demands from the business, systems are becoming increasingly complex and the frequency of change is growing exponentially. As a result, tradeoffs are made when it comes to application structural quality, and the inherent risk built into these systems accumulates. And since software is at the core of virtually every business, any breakdown in mission-critical applications can potentially result in hundreds of millions of dollars in losses, not to mention the hit to the company's reputation, goodwill, and credibility with customers and investors. A review of recent high-profile software failures indicates that the root cause of a majority of these failures was poor quality of code.

These pressures are further exacerbated by the growing complexity in outsourcing, which is not just about cost savings anymore. Outsourcing partners can bring increased flexibility and expertise on demand, and by building strategic relationships you can respond to the business faster. However, outsourcing also means less technical expertise in-house, and a loss of control over the quality of code being developed and the resources developing the code. This is especially acute in an offshore outsourcing scenario, where lower experience levels combined with high attrition rates can have a compounding effect on the inherent risk that accumulates in systems. If unchecked, applications can become ticking time bombs.

Most vendor management organizations are becoming more mature and sophisticated in managing outsourcing engagements, and they are looking for guidance on measuring vendors in an objective way. Despite many resources detailing how to structure outsourcing SLAs and the related metrics, there is a dearth of information on how to assess and measure the deliverables agreed upon in the SLAs.
This paper offers eight ways that Software Analysis & Measurement (SAM) can help to mitigate and manage risk in outsourced applications by measuring the structural quality of vendor deliverables.
II. Transform Structural Quality Review from an Art to a Science
Source code review comes in two forms: manual and automated analysis. Manual source code review is labor-intensive and subjective, and requires highly skilled software experts. Moreover, it is not possible for a single individual to have the kind of expertise needed to review an application across multiple technologies. Measuring the structural quality of software applications is evolving from an art to a science with the availability of solutions that automate the process of code analysis. Automated analysis provides an objective, in-depth review of the entire codebase, including source code, scripting, and interface languages across all layers of an application, against hundreds of best practices, in a fraction of the time it would take for manual analysis.
www.castsoftware.com
SAM focuses on the structural quality of the entire application, rather than the individual components typically evaluated by unit tests and code analyzers, assessing how its architecture adheres to sound principles of software engineering. The Consortium for IT Software Quality (CISQ) has defined four major structural quality characteristics and a size measure [1] needed to evaluate the overall health of an application, and consequently the business value it provides: Reliability, Efficiency, Security, Maintainability, and (adequate) Size. These characteristics are the primary pillars of evaluation in SAM and can be computed through a qualitative or quantitative scoring scheme, or a mix of both, combined with a weighting system reflecting the priorities of the business.
Table 1 - Software quality characteristics defined by CISQ
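As an illustration of the scoring-and-weighting approach described above, the sketch below rolls per-characteristic scores up into a single weighted health index. The scores, weights, and the score scale are purely hypothetical, not CISQ-defined values.

```python
# Hypothetical sketch: combine per-characteristic structural quality scores
# into one weighted health index. All numbers below are illustrative.

def weighted_health_index(scores, weights):
    """scores: {characteristic: score}; weights reflect business priorities."""
    total_weight = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Example: a business that weights Reliability and Security most heavily.
scores = {"Reliability": 3.1, "Efficiency": 2.8,
          "Security": 3.4, "Maintainability": 2.5}
weights = {"Reliability": 0.35, "Efficiency": 0.15,
           "Security": 0.35, "Maintainability": 0.15}

index = weighted_health_index(scores, weights)
print(round(index, 2))  # a single health index reflecting those priorities
```

The weights encode the business priorities the text mentions; changing them shifts the index without re-measuring anything.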
III. Leveraging SAM in Outsourcing - 8 Steps
SAM is becoming increasingly prevalent in the industry since it not only sheds light on the risks in software deliverables, but can be used to greatly improve the maturity of an organization throughout the lifecycle of an outsourcing relationship. In this section we will discuss in detail how a SAM solution can add value in each of the eight important steps in an outsourcing engagement, as noted in Figure 1.
- Technical size (lines of code, number of files, number of tables, etc.)
- Functional size (Function Points)
- Technology distribution (% of code that is Java, JSP, XML, SQL, .NET, COBOL, etc.)
- Complexity (cyclomatic complexity, fan-in, fan-out, etc.)
- Structural quality metrics (reliability, efficiency, security, maintainability)
- Architectural blueprint with dependencies between various modules

With this level of application intelligence, vendors will be able not only to provide more accurate bids, but also to evaluate the project critically within the context of their capabilities and resource availability. Whenever there is reluctance to share too much dirty laundry with the vendor during pursuit, it's worth pointing out that the problems handed over are not going away. The only impact a lack of transparency has on the process is to force the vendor to put more risk padding into the proposal.
4. Reference SAM metrics in initial contract setup (SLAs and acceptance criteria)
While it might seem obvious to hold outsourced teams accountable for the intrinsic quality of the product itself, acceptance criteria tied to structural quality have only recently started to show up in SLAs. Only in the last few years has there been an effective way to measure structural quality in a comprehensive way. One of the most important benefits of using SAM in an outsourcing context is to leverage it in contract language and make it part of a Service Level Agreement or Acceptance Criteria. There are primarily three categories of outputs, representing a combination of higher-level and lower-level structural quality metrics of software that can be incorporated into SLAs to achieve a specific business need or objective: Quality Indices, Application-Specific Rules, and Productivity.
Quality Indices

The quality indices described in Section II (Reliability, Efficiency, Security, and Maintainability) can be used to set high-level goals for overall application health. Ideally, applications should be analyzed for a minimum of two to three releases, and the average scores used as a baseline for each of these health factors. You can then set targets to monitor the overall health of the application over time.

Application-Specific Rules

The quality indices provide a macro picture of the structural quality of an application. However, there are often also specific code patterns (rules) you want to avoid. For example, if the application is already suffering from performance issues, you want to make sure to avoid any code that would further degrade performance. These specific rules should be incorporated into SLAs as Critical Rules with zero tolerance.

Productivity

SAM solutions provide the size of the code base added in a given release. Along with KLOC (kilo lines of code), some advanced solutions like the CAST Application Intelligence Platform (AIP) provide data on the number of Function Points that have been modified, added, and deleted in a release. This information can be combined with the development hours spent on a given release to generate productivity measures like KLOC/man-hour or Function Points/man-hour. This is a very relevant metric to track, especially in a multi-vendor scenario, so you can compare charges from different service providers and monitor productivity for each vendor. However, care should be taken to put productivity metrics into context, since the amount of time spent on a given release can't be fully derived from the actual source code delivered alone; for example, time may be spent on configuration tasks related to a software package. It can also take time to understand the existing code, which can differ considerably from one technology, architecture, or team to another.
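The productivity measures just described are simple ratios once size and effort data are in hand. The sketch below computes KLOC/man-hour and Function Points/man-hour for one release; all input figures are hypothetical, not CAST AIP output.

```python
# Hypothetical sketch of the release productivity measures described above.
# All input figures (KLOC, Function Points, man-hours) are illustrative.

def productivity(kloc_added, fp_added, fp_modified, fp_deleted, man_hours):
    """Return (KLOC per man-hour, Function Points per man-hour) for a release."""
    fp_total = fp_added + fp_modified + fp_deleted  # total FP touched in the release
    return kloc_added / man_hours, fp_total / man_hours

# Example: a release adding 12 KLOC and touching 180 Function Points
# (110 added, 50 modified, 20 deleted) over 1,500 development hours.
kloc_rate, fp_rate = productivity(kloc_added=12.0, fp_added=110,
                                  fp_modified=50, fp_deleted=20,
                                  man_hours=1500.0)
print(f"{kloc_rate:.4f} KLOC/man-hour, {fp_rate:.3f} FP/man-hour")
```

In a multi-vendor scenario the same calculation, run per vendor per release, gives the comparable numbers the text recommends tracking, subject to the context caveats that follow.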
In addition, user requirements are rarely finalized and often keep changing throughout the project, making timelines less productive. Moreover, service providers quite often have their own proprietary packages or components that software analysis solutions are not able to access, and these are therefore not reflected in quantity-related outputs. Automated Function Points is an objective, critical input that forms part of the overall productivity story. This type of productivity information can be very useful when monitoring an outsourcer; combined with quality outputs and other indicators, such as the number of hours spent, it can provide insight into why a specific release took more man-hours per Function Point than other releases, and help identify sources of productivity improvement.

Highlight: Transitioning code to a vendor team is one of the most difficult parts of an outsourcing engagement. Software analysis, especially system-level analysis solutions, can efficiently assist in the transition process.

SLAs vs. Acceptance Criteria: It is important to determine when to use this type of data in SLAs and when to use it as acceptance criteria. The recommended best practice is to use SAM data as part of acceptance criteria (shown in Table 2) before accepting vendor deliverables for system testing or user acceptance testing (UAT). If a deliverable does not meet the predefined criteria, it should not be accepted, to avoid wasting the time of the testing teams or users. On the other hand, data gathered from analyzing the application before it is put into production can be used to check against SLA performance.

Setting Targets for SLAs: When setting targets in SLAs for the structural quality of software, it is recommended to collect baseline data first. As previously mentioned, this should ideally be based on the average over two to three prior releases. In the case of greenfield projects, since there will be no source code to analyze and create a baseline, it is recommended to use industry benchmark data to set targets matching the top quartile of scores for that specific technology. For example, the CAST benchmarking database has data from more than two thousand applications collected from different industries across all technologies. For a more thorough discussion of this topic, please see the white paper CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs [2].
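The acceptance-criteria practice described above amounts to a simple gate: compare a deliverable's measured indices against baseline-derived targets and reject on any critical-rule violation. A minimal sketch, with hypothetical thresholds and field names:

```python
# Hypothetical acceptance gate: reject a deliverable before system testing/UAT
# if any quality index misses its target or any Critical Rule is violated.

def accept_deliverable(measured, targets, critical_violations):
    """measured/targets: {index_name: score}; Critical Rules have zero tolerance."""
    if critical_violations > 0:
        return False, "critical-rule violations present"
    failing = [name for name, target in targets.items()
               if measured.get(name, 0.0) < target]
    if failing:
        return False, "below target: " + ", ".join(sorted(failing))
    return True, "accepted"

# Example targets derived from a two-to-three-release baseline (illustrative).
targets = {"Reliability": 3.0, "Security": 3.2}
ok, reason = accept_deliverable({"Reliability": 3.1, "Security": 3.0},
                                targets, critical_violations=0)
print(ok, reason)  # Security misses its 3.2 target, so the build is rejected
```

The zero-tolerance branch mirrors the Critical Rules discussed earlier: a single violation rejects the deliverable regardless of how good the aggregate indices look.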
Benchmarking allows you to have a meaningful, fact-based dialogue with vendors on opportunities for improvement and to measure their progress.

External Benchmarking: In addition to internal benchmarking, companies can benchmark the structural quality of their applications against their industry peers. For example, CAST publishes benchmarking information for applications across a wide range of industries and technologies through its Appmarq database. An example of this benchmarking data is shown in Figure 3.

Leader Boards: In addition to formal benchmarking, companies can use SAM data to publish leader boards highlighting the applications with the highest structural quality or the teams that are most productive. Leader boards can be very effective at motivating teams to improve their performance.
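A leader board like the one described above is, at its core, just a sort over per-team scores. The sketch below ranks teams by a structural-quality score; the team names and scores are made up for illustration.

```python
# Hypothetical leader board: rank vendor teams by structural-quality score,
# best first, as a motivation and visibility tool in a multi-sourcing setup.

def leader_board(team_scores):
    """team_scores: {team: score}. Returns team names sorted best-first."""
    return sorted(team_scores, key=lambda team: team_scores[team], reverse=True)

scores = {"Vendor A / Team 1": 3.2,
          "Vendor B / Team 3": 2.9,
          "Vendor A / Team 2": 3.4}
for rank, team in enumerate(leader_board(scores), start=1):
    print(f"{rank}. {team} ({scores[team]})")
```

Replacing the single score with the weighted health index discussed in Section II, or with a productivity rate, gives the different leader-board flavors the text mentions.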
Figure 3 - Sample of benchmarking chart for .NET application in financial services industry
IV. Identify the ideal operating scenario
There are many ways to incorporate a SAM solution into your organization for monitoring vendor deliverables. Table 4 provides various scenarios. Figure 4 illustrates a typical integration of a SAM solution into the SDLC of a software development organization, as an example for scenarios 1 and 2 outlined in Table 4.
Table 4 - Different operating scenarios for SAM solution
Figure 4 - A sample integration of SAM solution into SDLC and IT executive dashboards
V. Select a SAM solution that meets business needs
It is important to understand that there are two broad categories of solutions for measuring software structural quality, and that SAM solutions offer a variety of capabilities, ranging from developer-centric tools to enterprise-wide solutions. The first category measures the code quality of individual components; these tools are language-specific and narrowly focused. The second category measures application quality: in addition to analyzing code at the component level, and importantly, these solutions analyze how components interact with one another across multiple layers (UI, logic, and data) and multiple technologies. The exact same piece of code can be safe and of excellent quality, or highly dangerous, depending on its interaction with other components. Mission-critical applications must be analyzed in the context of the numerous interconnections among code components, databases, middleware, frameworks, and APIs. This results in a holistic analysis of the structural quality of an application. Figure 5 summarizes the different types of solutions and their uses.

In the context of measuring outsourcing vendors, it is recommended to use a comprehensive system-level analysis solution with the following key attributes:

- Proactive indication and scoring of risk
- Historical baselining and trending
- Consistent measures across teams working on diverse technologies
- Standards-based measures that can be benchmarked against industry peers
- Objective, unbiased KPIs
Figure 5 - High level comparison of different types of software analysis solutions
References
1. CISQ Specifications for Automated Quality Characteristic Measures. http://it-cisq.org/, CISQ Technical Work Group. (2012)
2. CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs. http://it-cisq.org/, CISQ. (2012)
3. How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations. http://www.omg.org/marketing/CISQ_compliant_IT_Systemsv.4-3.pdf, Dr. Richard Mark Soley. (2012)
4. The Importance of Application Quality and Its Difference from Code Quality. http://www.castsoftware.com/resources/document/whitepapers/the-importance-of-application-quality-and-its-difference-from-code-quality, CAST Software. (2011)
5. Sample Acceptance Criteria with Structural Quality Metrics. http://www.castsoftware.com/sample-sla, CAST Software.
VI. Conclusion
Increasing demand for complex IT projects and a constantly evolving technology landscape mean that outsourcing is no longer merely an option; it has become a requirement for any large organization. However, many organizations, in spite of implementing best practices, are struggling to achieve a mutually beneficial relationship with their outsourcing partners. Measuring vendors through automated analysis not only minimizes risks in applications, but also greatly increases the maturity of these outsourcing relationships by properly aligning measurement with overall business and IT organization objectives.
About CAST
CAST is a pioneer and world leader in Software Analysis and Measurement, with unique technology resulting from more than $100 million in R&D investment. CAST introduces fact-based transparency into application development and sourcing to transform it into a management discipline. More than 250 companies across all industry sectors and geographies rely on CAST to prevent business disruption while reducing hard IT costs. CAST is an integral part of software delivery and maintenance at the world's leading IT service providers such as IBM and Capgemini. Founded in 1990, CAST is listed on NYSE-Euronext (Euronext: CAS) and serves IT-intensive enterprises worldwide with a network of offices in North America, Europe, and India. For more information, visit www.castsoftware.com.
www.castsoftware.com
Europe: 3 rue Marcel Allégot, 92190 Meudon, France. Phone: +33 1 46 90 21 00
North America: 373 Park Avenue South, New York, NY 10016. Phone: +1 212-871-8330