
White Paper

8 Steps to Measure ADM Vendor Deliverables


Ensure Structural Quality with Software Analysis & Measurement
As enterprise IT departments increasingly move toward multi-sourcing environments, it is more important than ever to measure ADM deliverables, not only to manage risk by ensuring the overall structural quality of systems, but also to objectively evaluate vendors and make smarter sourcing decisions. This paper describes eight steps for integrating Software Analysis & Measurement (SAM) into your outsourcing relationship lifecycle, from RFP preparation through contract development, team transition, and benchmarking, to objectively evaluate the reliability, security, efficiency, maintainability, and size of software deliverables. This measurement can greatly improve the maturity of your outsourcing relationships, enhancing performance and reducing risk.


Contents
I. Introduction
II. Transform Structural Quality Review from an Art to a Science
III. Leveraging SAM in Outsourcing - 8 Steps
IV. Identify the ideal operating scenario
V. Select a SAM solution that meets business needs
VI. Conclusion

I. Introduction

To meet high demands from the business, systems are becoming increasingly complex and the frequency of change is growing exponentially. As a result, tradeoffs are made when it comes to application structural quality, and the inherent risk built into these systems accumulates. Since software is at the core of virtually every business, any breakdown in mission-critical applications can potentially result in hundreds of millions of dollars in losses, not to mention the hit to the company's reputation, goodwill, and credibility with customers and investors. A review of recent high-profile software failures indicates that the root cause of a majority of these failures was poor quality of code.

These pressures are further exacerbated by the growing complexity of outsourcing, which is not just about cost savings anymore. Outsourcing partners can bring increased flexibility and on-demand expertise, and by building strategic relationships you can respond to the business faster. However, outsourcing also means less technical expertise in-house, and a loss of control over the quality of the code being developed and the resources developing it. This is especially acute in an offshore outsourcing scenario, where lower experience levels combined with high attrition rates can have a compounding effect on the inherent risk that accumulates in systems. Left unchecked, applications can become ticking time bombs.

Most vendor management organizations are becoming more mature and sophisticated in managing outsourcing engagements, and they are looking for guidance on measuring vendors in an objective way. Despite the many resources detailing how to structure outsourcing SLAs and the related metrics, there is a dearth of information on how to assess and measure the deliverables agreed upon in those SLAs. This paper offers eight ways that Software Analysis & Measurement (SAM) can help mitigate and manage risk in outsourced applications by measuring the structural quality of vendor deliverables.

II. Transform Structural Quality Review from an Art to a Science

Source code review comes in two forms: manual and automated analysis. Manual source code review is labor-intensive and subjective, and requires highly skilled software experts. Moreover, it is not possible for a single individual to have the expertise needed to review an application across multiple technologies. Measuring the structural quality of software applications is evolving from an art to a science with the availability of solutions that automate the process of code analysis. Automated analysis provides an objective, in-depth review of the entire codebase, including source code, scripting, and interface languages across all layers of an application, against hundreds of best practices in a fraction of the time it would take to do manually.

SAM focuses on the structural quality of the entire application, rather than the individual components typically evaluated by unit tests and code analyzers, assessing how its architecture adheres to sound principles of software engineering. The Consortium for IT Software Quality (CISQ) has defined the four major structural quality characteristics and a size measure [1] needed to evaluate the overall health of an application, and consequently its business value: Reliability, Efficiency, Security, Maintainability, and (adequate) Size. These characteristics are the primary pillars of evaluation in SAM and can be computed through a qualitative or quantitative scoring scheme, or a mix of both, together with a weighting system reflecting the priorities of the business.
Table 1 - Software quality characteristics defined by CISQ
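To make the idea of a weighting system concrete, the following is a minimal sketch: it combines hypothetical health-factor scores on an illustrative 1-4 scale using business-driven weights. The scores, scale, and weights are assumptions for illustration, not the actual CAST or CISQ scoring formula.

```python
# Minimal sketch (not CAST's or CISQ's actual algorithm): combining health-factor
# scores into a single weighted application score. Scores and weights are illustrative.

HEALTH_FACTORS = ["reliability", "efficiency", "security", "maintainability"]

def weighted_health_score(scores: dict, weights: dict) -> float:
    """Return a weighted average of health-factor scores (e.g., on a 1-4 scale)."""
    total_weight = sum(weights[f] for f in HEALTH_FACTORS)
    return sum(scores[f] * weights[f] for f in HEALTH_FACTORS) / total_weight

if __name__ == "__main__":
    # Hypothetical scores for one application, on a 1 (poor) to 4 (excellent) scale.
    scores = {"reliability": 3.1, "efficiency": 2.8, "security": 3.4, "maintainability": 2.5}
    # Weights reflecting a business that prioritizes reliability and security.
    weights = {"reliability": 0.35, "efficiency": 0.15, "security": 0.35, "maintainability": 0.15}
    print(f"Weighted application health score: {weighted_health_score(scores, weights):.2f}")
```

Shifting the weights toward, say, Security changes which applications or releases stand out, which is why the weighting should reflect the business priorities noted above.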

III. Leveraging SAM in Outsourcing - 8 Steps

SAM is becoming increasingly prevalent in the industry since it not only sheds light on the risks in software deliverables, but can be used to greatly improve the maturity of an organization throughout the lifecycle of an outsourcing relationship. In this section we will discuss in detail how a SAM solution can add value in each of the eight important steps in an outsourcing engagement, as noted in Figure 1.


Figure 1 - Leveraging SAM throughout the outsourcing life-cycle


1. Prepare data prior to outsourcing


Before transferring applications to an outsourcing partner, it is good practice to perform SAM in order to:

- Ensure the availability of objective information that offers a realistic picture of the true quality of the applications, by determining baseline quality and size.
- Decide which applications are best suited for outsourcing, or shortlist the best outsourcing partners, based on the risk indicators.

Often in outsourcing relationships, clients indicate that they are unhappy with the quality of the code being delivered, without realizing that the application was of poor quality to begin with. You may also want to avoid outsourcing an application with poor quality and inherently high risk, as doing so might increase the risk further. Conversely, you might want to bring in an outsourcing partner specifically to address known issues in the application.
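As an illustration of establishing such a baseline, the sketch below averages health-factor scores over a few recent releases. The release data, factor names, and scale are hypothetical assumptions, not output from any specific SAM tool.

```python
# Minimal sketch: derive a pre-outsourcing baseline from per-release analysis results.
from statistics import mean

# Hypothetical scores from the last three analyzed releases of one application.
release_scores = [
    {"reliability": 2.9, "efficiency": 3.2, "security": 2.7, "maintainability": 2.4},
    {"reliability": 3.0, "efficiency": 3.1, "security": 2.8, "maintainability": 2.5},
    {"reliability": 2.8, "efficiency": 3.0, "security": 2.6, "maintainability": 2.3},
]

# Average each health factor across the releases to form the baseline profile.
baseline = {factor: round(mean(r[factor] for r in release_scores), 2)
            for factor in release_scores[0]}
print("Baseline quality profile:", baseline)
```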

2. Include application intelligence in RFPs


It is highly recommended that you include SAM outputs in RFP documentation during the bidding process with potential vendors. This information can provide bidders with accurate and objective information about:
- Technical size (lines of code, number of files, number of tables, etc.)
- Functional size (Function Points)
- Technology distribution (% of code that is Java, JSP, XML, SQL, .NET, COBOL, etc.)
- Complexity (cyclomatic complexity, fan-in, fan-out, etc.)
- Structural quality metrics (reliability, efficiency, security, maintainability)
- Architectural blueprint with dependencies between the various modules

With this level of application intelligence, vendors will not only be able to provide more accurate bids, but will also evaluate the project critically within the context of their capabilities and resource availability. Whenever there is reluctance to share too much dirty laundry with a vendor during the pursuit, it is worth pointing out that the problems being handed over are not going away. The only effect a lack of transparency has on the process is to force the vendor to build more risk padding into its proposal.
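As an illustration, the application intelligence listed above could be packaged as a simple machine-readable profile attached to the RFP. The structure, field names, and values below are hypothetical and do not represent an actual CAST AIP export format.

```python
# Illustrative only: one possible shape for an application-intelligence summary
# attached to an RFP. All names and figures are hypothetical.
import json

rfp_application_profile = {
    "application": "OrderManagement",            # hypothetical application name
    "technical_size": {"loc": 412_000, "files": 2_310, "db_tables": 187},
    "functional_size_fp": 3_450,                 # Automated Function Points
    "technology_mix_pct": {"Java": 54, "JSP": 12, "XML": 9, "SQL": 20, "COBOL": 5},
    "complexity": {"avg_cyclomatic": 7.8, "high_fan_in_objects": 42},
    "quality_indices": {"reliability": 2.9, "efficiency": 3.1,
                        "security": 2.6, "maintainability": 2.4},
}

print(json.dumps(rfp_application_profile, indent=2))
```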


3. Get feedback on quality during vendor evaluation


As part of the evaluation process, vendors should be asked to provide an assessment of the applications based on the SAM outputs provided. This ensures that they understand not only the scope of the work and its technical aspects, but also the structural quality of the applications. In addition, part of the technical requirement should be to improve the overall structural quality of the existing code they are adopting. If appropriate, they should provide a detailed plan and roadmap for improving the quality of the applications in future releases.

4. Reference SAM metrics in initial contract setup (SLAs and acceptance criteria)
While it might seem obvious to hold outsourced teams accountable for the intrinsic quality of the product itself, acceptance criteria tied to structural quality have only recently started to appear in SLAs; only in the last few years has there been an effective way to measure structural quality comprehensively. One of the most important benefits of using SAM in an outsourcing context is to leverage it in contract language and make it part of a Service Level Agreement or the Acceptance Criteria. There are primarily three categories of outputs, representing a combination of higher-level and lower-level structural quality metrics, that can be incorporated into SLAs to achieve a specific business need or objective: Quality Indices, Application-Specific Rules, and Productivity.


Quality Indices: The quality indices described in section II (Reliability, Efficiency, Security, and Maintainability) can be used to set high-level goals for overall application health. Ideally, applications should be analyzed for a minimum of two to three releases, and the average scores used as a baseline for each of these health factors. You can then set targets to monitor the overall health of the application over time.

Application-Specific Rules: The quality indices provide a macro picture of the structural quality of an application. However, there are often specific code patterns (rules) that you want to avoid. For example, if the application is already suffering from performance issues, you want to make sure no new code further degrades performance. These specific rules should be incorporated into SLAs as Critical Rules with zero tolerance.

Productivity: SAM solutions provide the size of the code base added in a given release. Along with KLOC (kilo lines of code), some advanced solutions like the CAST Application Intelligence Platform (AIP) provide data on the number of Function Points that have been modified, added, and deleted in a release. This information can be combined with the development hours spent on that release to generate productivity measures such as KLOC per man-hour or Function Points per man-hour. This is a very relevant metric to track, especially in a multi-vendor scenario, because it lets you compare charges from different service providers and monitor productivity for each vendor. However, care should be taken to put productivity metrics into context, since the amount of time spent on a given release cannot be fully derived from the source code delivered; for example, the configuration tasks related to a software package leave no trace in the code. It can also take time to understand the existing code, which can differ considerably from one technology, architecture, or team to another. In addition, user requirements are rarely finalized and often keep changing throughout the project, which erodes productivity. Moreover, service providers frequently have their own proprietary packages or components that software analysis solutions cannot access, so this work is not reflected in quantity-related outputs. Automated Function Points are an objective, critical input that forms part of the overall productivity story. This type of productivity information can be very useful when monitoring an outsourcer; combined with quality outputs and other indicators such as the number of hours spent, it can provide insight into why a specific release took more man-hours per Function Point than other releases, and help identify sources of productivity improvement.

SLAs vs. Acceptance Criteria: It is important to determine when to use this type of data in SLAs and when to use it as acceptance criteria. The recommended best practice is to use SAM data as part of the acceptance criteria (shown in Table 2) before accepting vendor deliverables for system testing or user acceptance testing (UAT). If a deliverable does not meet the predefined criteria, it should not be accepted, to avoid wasting the time of the testing teams or users. On the other hand, data gathered from analyzing the application before it goes into production can be used to check performance against the SLA.

Setting Targets for SLAs: When setting SLA targets for the structural quality of software, it is recommended to collect baseline data first. As previously mentioned, this should ideally be based on the average over two to three prior releases. For greenfield projects, where there is no source code to analyze and baseline, it is recommended to use industry benchmark data and set targets that match the top quartile of scores for that specific technology. For example, the CAST benchmarking database holds data from more than two thousand applications collected across industries and technologies. For a more thorough discussion of this topic, please see the white paper CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs [2].
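To make the productivity arithmetic above concrete, the following is a minimal sketch using hypothetical release data; real figures would come from a SAM solution and a time-tracking system, and the context caveats above still apply.

```python
# Minimal sketch of the per-release productivity measures described above,
# using hypothetical release data.

def productivity(fp_added: int, fp_modified: int, fp_deleted: int,
                 kloc_delivered: float, man_hours: float) -> dict:
    """Compute simple per-release productivity measures."""
    fp_worked = fp_added + fp_modified + fp_deleted   # total Function Points touched
    return {
        "fp_per_man_hour": fp_worked / man_hours,
        "kloc_per_man_hour": kloc_delivered / man_hours,
        "man_hours_per_fp": man_hours / fp_worked,
    }

# Hypothetical release: 120 FP added, 45 modified, 10 deleted, 14.5 KLOC, 2,600 hours.
metrics = productivity(fp_added=120, fp_modified=45, fp_deleted=10,
                       kloc_delivered=14.5, man_hours=2600)
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```

Tracking these ratios release over release, per vendor, is what makes the multi-vendor comparison described above possible.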

5. Use documentation created during analysis to ease transition


Transitioning code to a vendor team is one of the most difficult parts of an outsourcing engagement, since documentation is typically out of date and the original team that wrote the application may not be available for knowledge transfer. Ask your vendor how they facilitate their team's transition to and understanding of the system, how they identify and monitor structural hotspots, and how they perform impact analysis on system changes. Software analysis, especially system-level analysis solutions, can facilitate the transition process. As the code is analyzed, the analyzers reverse engineer it and create a comprehensive blueprint of the entire application, producing documentation that is current and accurate. This information will not only greatly reduce transition time, but also help the vendor teams reduce code duplication and understand system dependencies so they can test thoroughly as new additions are made.


Table 2 - Sample acceptance criteria for structural quality
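As an illustration of how such criteria can be applied automatically before a deliverable enters system testing or UAT, the sketch below checks a deliverable against a set of thresholds. The threshold values and the zero-tolerance critical-rule check are hypothetical examples, not the actual criteria from Table 2.

```python
# A sketch of an automated acceptance gate of the kind Table 2 implies.
# All thresholds and measured values are hypothetical.

ACCEPTANCE_CRITERIA = {
    "reliability": 3.0,              # minimum score required
    "security": 3.2,
    "efficiency": 2.8,
    "maintainability": 2.5,
    "critical_rule_violations": 0,   # zero tolerance on agreed Critical Rules
}

def accept_deliverable(measured: dict) -> tuple[bool, list]:
    """Return (accepted, list of failed criteria) for a delivered release."""
    failures = []
    for factor, threshold in ACCEPTANCE_CRITERIA.items():
        value = measured.get(factor)
        if factor == "critical_rule_violations":
            if value is None or value > threshold:
                failures.append(f"{factor}: {value} > {threshold}")
        elif value is None or value < threshold:
            failures.append(f"{factor}: {value} < {threshold}")
    return (not failures, failures)

# Hypothetical measurement of a vendor deliverable before UAT.
measured = {"reliability": 3.1, "security": 3.0, "efficiency": 2.9,
            "maintainability": 2.7, "critical_rule_violations": 2}
accepted, failures = accept_deliverable(measured)
print("Accepted for UAT" if accepted else f"Rejected: {failures}")
```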

6. Evaluate vendor performance with objective measures


Quality performance evaluation is a sensitive subject, so it is important to have an agreed-upon, independent assessment of quality that is repeatable. SAM outputs should be an important part of vendor performance scorecards. Clients can evaluate the performance of vendors against SLAs and provide specific guidance with actionable lists (shown in Table 3).

7. Incorporate benchmarking into evaluation


Benchmarking of SAM metrics can be very useful for identifying opportunities for improvement, and can be done among a group of applications within the organization or against similar-technology applications from industry peers.

Internal Benchmarking: Automated software analysis provides an objective and consistent measurement that can be used to benchmark different teams within the same vendor, or the performance of different vendors in a multi-sourcing scenario.



Table 3 - Sample performance review process

Figure 2 - Sample improvement quality control chart with new targets




Benchmarking allows you to have a meaningful, fact-based dialogue with vendors about opportunities for improvement and to measure their progress.

External Benchmarking: In addition to internal benchmarking, companies can benchmark the structural quality of their applications against their industry peers. For example, CAST publishes benchmarking information for applications across a wide range of industries and technologies through its Appmarq database. An example of this benchmarking data is shown in Figure 3.

Leader Boards: In addition to formal benchmarking, companies can use SAM data to publish leader boards that highlight the applications with the highest structural quality or the teams that are most productive. Leader boards can be very effective at motivating teams to improve their performance.
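As a simple illustration, a leader board of this kind can be generated directly from SAM outputs. The applications, vendors, and scores below are hypothetical.

```python
# Minimal sketch of a leader board built from SAM outputs, as described above.
# Application names, vendors, and scores are hypothetical.

applications = [
    {"app": "Billing",        "vendor": "Vendor A", "health_score": 3.42},
    {"app": "Claims",         "vendor": "Vendor B", "health_score": 3.05},
    {"app": "CustomerPortal", "vendor": "Vendor A", "health_score": 2.71},
    {"app": "Payments",       "vendor": "Vendor C", "health_score": 3.18},
]

# Rank applications by overall structural quality, best first.
leader_board = sorted(applications, key=lambda a: a["health_score"], reverse=True)

print(f"{'Rank':<5}{'Application':<16}{'Vendor':<10}{'Score':>6}")
for rank, entry in enumerate(leader_board, start=1):
    print(f"{rank:<5}{entry['app']:<16}{entry['vendor']:<10}{entry['health_score']:>6.2f}")
```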

8. Strive for continuous improvement


As well as ensuring the quality of deliverables from vendors, SAM metrics can be used to improve quality on a continuous basis. Most applications have been in existence for several years before being outsourced, so there may be poor-quality code, a lot of copy-pasted code, and several security vulnerabilities. Vendors are usually not responsible for all the issues they inherit, unless they have been hired specifically to remedy them. However, companies can use SAM to ensure that, in addition to keeping new code low-risk and of higher quality, the existing code base is improved by setting improvement targets that are revised on an annual basis.

Figure 3 - Sample of benchmarking chart for .NET application in financial services industry


IV. Identify the ideal operating scenario

There are many ways to incorporate a SAM solution into your organization for monitoring vendor deliverables. Table 4 provides various scenarios. Figure 4 illustrates a typical integration of a SAM solution into the SDLC of a software development organization, as an example for scenarios 1 and 2 outlined in Table 4.
Table 4 - Different operating scenarios for SAM solution

Figure 4 - A sample integration of SAM solution into SDLC and IT executive dashboards


V. Select a SAM solution that meets business needs

It is important to understand that there are two broad categories of solutions for measuring software structural quality, and that SAM solutions offer a variety of capabilities, ranging from developer-centric tools to enterprise-wide solutions. The first category measures the code quality of individual components; these tools are language-specific and narrowly focused. The second category measures application quality: these solutions analyze the code at the component level and, importantly, also analyze how components interact with one another across multiple layers (UI, logic, and data) and multiple technologies. The exact same piece of code can be safe and of excellent quality, or highly dangerous, depending on its interaction with other components. Mission-critical applications must be analyzed in the context of the numerous interconnections among code components, databases, middleware, frameworks, and APIs. The result is a holistic analysis of the structural quality of an application. Figure 5 summarizes the different types of solutions and their uses.

In the context of measuring outsourcing vendors, it is recommended to use a comprehensive system-level analysis solution that has the following key attributes:

- Proactive indication and scoring of risk
- Historical baselining and trending
- Consistent measures across teams working on diverse technologies
- Standards-based measures that can be benchmarked against industry peers
- Objective, unbiased KPIs
Figure 5 - High level comparison of different types of software analysis solutions


References
1. CISQ Specifications for Automated Quality Characteristic Measures. CISQ Technical Work Group, http://it-cisq.org/ (2012)
2. CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs. CISQ, http://it-cisq.org/ (2012)
3. How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations. Dr. Richard Mark Soley, http://www.omg.org/marketing/CISQ_compliant_IT_Systemsv.4-3.pdf (2012)
4. The Importance of Application Quality and Its Difference from Code Quality. CAST Software, http://www.castsoftware.com/resources/document/whitepapers/the-importance-of-application-quality-and-its-difference-from-code-quality (2011)
5. Sample Acceptance Criteria with Structural Quality Metrics. http://www.castsoftware.com/sample-sla

VI. Conclusion
Increasing demand for complex IT projects and the constantly evolving technology landscape mean that outsourcing is no longer merely an option; it has become a requirement for any large organization. However, many organizations, in spite of implementing best practices, are struggling to achieve a mutually beneficial relationship with their outsourcing partners. Measuring vendors through automated analysis not only minimizes risks in applications, but also greatly increases the maturity of these outsourcing relationships by properly aligning measurement with overall business and IT organization objectives.


About CAST
CAST is a pioneer and world leader in Software Analysis and Measurement, with unique technology resulting from more than $100 million in R&D investment. CAST introduces fact-based transparency into application development and sourcing to transform it into a management discipline. More than 250 companies across all industry sectors and geographies rely on CAST to prevent business disruption while reducing hard IT costs. CAST is an integral part of software delivery and maintenance at the world's leading IT service providers, such as IBM and Capgemini. Founded in 1990, CAST is listed on NYSE-Euronext (Euronext: CAS) and serves IT-intensive enterprises worldwide with a network of offices in North America, Europe, and India. For more information, visit www.castsoftware.com.



Questions? Email us at contact@castsoftware.com

www.castsoftware.com
Europe: 3 rue Marcel Allégot, 92190 Meudon, France. Phone: +33 1 46 90 21 00
North America: 373 Park Avenue South, New York, NY 10016. Phone: +1 212-871-8330
