8 Steps to Measure ADM Vendor Deliverables




Ensure Structural Quality with Software Analysis & Measurement

As enterprise IT departments increasingly move toward multi-sourcing environments, it is more important than ever to measure ADM deliverables, not only to manage risk by ensuring the overall structural quality of systems, but also to evaluate vendors objectively and make smarter sourcing decisions. This paper describes eight steps for integrating Software Analysis & Measurement (SAM) into the outsourcing relationship lifecycle, from RFP preparation through contract development, team transition, and benchmarking, in order to objectively evaluate the reliability, security, efficiency, maintainability, and size of software deliverables. This measurement can greatly improve the maturity of your outsourcing relationships, enhancing performance and reducing risk.

Contents

I. Introduction
II. Transform Structural Quality Review from an Art to a Science
III. Leveraging SAM in Outsourcing - 8 Steps
IV. Identify the ideal operating scenario
V. Select a SAM solution that meets business needs
VI. Conclusion

I. Introduction

To meet high demands from the business, systems are becoming increasingly complex and the frequency of change is growing exponentially. As a result, tradeoffs are made in application structural quality, and the inherent risk built into these systems accumulates. Since software is at the core of virtually every business, a breakdown in a mission-critical application can result in hundreds of millions of dollars in losses, not to mention the damage to a company's reputation, goodwill, and credibility with customers and investors. A review of recent high-profile software failures indicates that the root cause of the majority of them was poor code quality.

These pressures are further exacerbated by the growing complexity of outsourcing, which is no longer just about cost savings. Outsourcing partners can bring increased flexibility and on-demand expertise, and building strategic relationships lets you respond to the business faster. However, outsourcing also means less technical expertise in-house and a loss of control over the quality of the code being developed and the resources developing it. This is especially critical in an offshore outsourcing scenario, where lower experience levels combined with high attrition rates compound the inherent risk accumulating in systems. Left unchecked, these applications can become ticking time bombs.

Most vendor management organizations are becoming more mature and sophisticated in managing outsourcing engagements, and they are looking for guidance on measuring vendors objectively. Despite the many resources detailing how to structure outsourcing SLAs and their related metrics, there is a dearth of information on how to assess and measure the deliverables agreed upon in those SLAs. This paper offers eight ways that Software Analysis & Measurement (SAM) can help mitigate and manage risk in outsourced applications by measuring the structural quality of vendor deliverables.

II. Transform Structural Quality Review from an Art to a Science

Source code review comes in two forms: manual and automated analysis. Manual source code review is labor-intensive, subjective, and requires highly skilled software experts. Moreover, no single individual can have the expertise needed to review an application that spans multiple technologies.

Measuring the structural quality of software applications is evolving from an art to a science with the availability of solutions that automate code analysis. Automated analysis provides an objective, in-depth review of the entire codebase, including source code, scripting, and interface languages across all layers of an application, against hundreds of best practices, in a fraction of the time manual analysis would take.

SAM focuses on the structural quality of the entire application (rather than the individual components typically evaluated by unit tests and code analyzers), evaluating how its architecture adheres to sound principles of software engineering. The Consortium for IT Software Quality (CISQ) has defined four major structural quality characteristics, plus a size measure [1], needed to evaluate the overall health of an application and, consequently, its business value: Reliability, Efficiency, Security, Maintainability, and (adequate) Size. These characteristics are the primary pillars of evaluation in SAM. They can be computed through a qualitative or quantitative scoring scheme, or a mix of both, combined with a weighting system that reflects the priorities of the business.

Table 1 - Software quality characteristics defined by CISQ [1]
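As a concrete illustration of such a scoring scheme, the sketch below combines per-characteristic scores into a single weighted health score. It is a minimal sketch under stated assumptions: the 1.0-4.0 score scale, the example values, and the weights are hypothetical choices a business might make, not values mandated by CISQ.

```python
# Minimal sketch of a weighted structural-quality scoring scheme.
# The 1.0-4.0 scale, example scores, and weights are illustrative
# assumptions, not CISQ-prescribed values.

CHARACTERISTICS = ("reliability", "efficiency", "security", "maintainability")

def health_score(scores: dict, weights: dict) -> float:
    """Weighted average of per-characteristic scores (higher is better)."""
    total = sum(weights[c] for c in CHARACTERISTICS)
    return sum(scores[c] * weights[c] for c in CHARACTERISTICS) / total

# Example: a customer-facing payments application that weights
# security and reliability above efficiency and maintainability.
scores = {"reliability": 3.1, "efficiency": 2.8,
          "security": 2.4, "maintainability": 3.3}
weights = {"reliability": 0.35, "efficiency": 0.10,
           "security": 0.35, "maintainability": 0.20}

print(f"health score: {health_score(scores, weights):.2f}")  # -> 2.87
```

The value of the weighting step is that the same measured scores can yield different conclusions per application; a back-office batch system, for instance, might weight efficiency higher than security.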

III. Leveraging SAM in Outsourcing - 8 Steps

SAM is becoming increasingly prevalent in the industry because it not only sheds light on the risks in software deliverables, but can also be used to greatly improve the maturity of an organization throughout the lifecycle of an outsourcing relationship. In this section we discuss in detail how a SAM solution can add value in each of the eight important steps of an outsourcing engagement, as noted in Figure 1.

Figure 1 - Leveraging SAM throughout the outsourcing lifecycle

1. Prepare data prior to outsourcing

Before transferring applications to an outsourcing partner, it is good practice to perform SAM to:

- Ensure the availability of objective information that offers a realistic picture of the true quality of the applications by determining baseline quality and size.
- Make decisions on the best applications to outsource, or shortlist the best outsourcing partners, based on the risk indicators.

Often in outsourcing relationships, clients say they are unhappy with the quality of the code being delivered without realizing that the application was of poor quality to begin with. You may also want to avoid outsourcing an application of poor quality and inherently high risk, as doing so might increase the risk further. Conversely, you may want to bring in an outsourcing partner specifically to address known issues in the application.

2. Include application intelligence in RFPs

It is highly recommended that you include SAM outputs in RFP documentation during the bidding process with potential vendors. This information can give bidders accurate and objective information about:

- Technical size (lines of code, number of files, number of tables, etc.)
- Functional size (Function Points)
- Technology distribution (% of code that is Java, JSP, XML, SQL, .NET, COBOL, etc.)
- Complexity (cyclomatic complexity, fan-in, fan-out, etc.)
- Structural quality metrics (reliability, efficiency, security, maintainability)
- Architectural blueprint with dependencies between the various modules

With this level of application intelligence, vendors will not only be able to provide more accurate bids, but also evaluate the project critically within the context of their capabilities and resource availability. Whenever there is reluctance to share too much dirty laundry with a vendor during pursuit, it is worth pointing out that the problems being handed over are not going away. The only effect a lack of transparency has on the process is to force the vendor to build more risk padding into its proposal.

3. Get feedback on quality during vendor evaluation

As part of the evaluation process, vendors should be asked to provide an assessment of the applications based on the SAM outputs provided. This ensures their understanding not only of the scope and technical aspects of the work, but also of the structural quality of the applications. In addition, improving the overall structural quality of the existing code they are adopting should be part of the technical requirements. If appropriate, vendors should provide a detailed plan and roadmap for improving the quality of the applications in future releases.

4. Reference SAM metrics in initial contract setup (SLAs and acceptance criteria)

While it might seem obvious to hold outsourced teams accountable for the intrinsic quality of the product itself, acceptance criteria tied to structural quality have only recently started to show up in SLAs; only in the last few years has there been an effective way to measure structural quality comprehensively. One of the most important benefits of using SAM in an outsourcing context is to leverage it in contract language and make it part of a Service Level Agreement or acceptance criteria. There are three main categories of outputs, representing a combination of higher-level and lower-level structural quality metrics, that can be incorporated into SLAs to achieve a specific business need or objective: Quality Indices, Application-Specific Rules, and Productivity.

Quality Indices

The quality indices described in Section II (Reliability, Efficiency, Security, and Maintainability) can be used to set high-level goals for overall application health. Ideally, applications should be analyzed for a minimum of two to three releases, and the average scores used as a baseline for each of these health factors. You can then set targets to monitor the overall health of the application over time.

Application-Specific Rules

The quality indices provide a macro picture of the structural quality of an application. However, there are often also specific code patterns (rules) you want to avoid. For example, if an application is already suffering from performance issues, you want to ensure that no code is added that would degrade performance further. These specific rules should be incorporated into SLAs as Critical Rules with zero tolerance.

Productivity

SAM solutions provide the size of the code base added in a given release. Along with KLOC (thousands of lines of code), some advanced solutions, such as the CAST Application Intelligence Platform (AIP), provide the number of Function Points that have been modified, added, and deleted in a release. This information can be combined with the development hours spent on the release to generate productivity measures such as KLOC per man-hour or Function Points per man-hour. This is a very relevant metric to track, especially in a multi-vendor scenario, because it lets you compare charges from different service providers and monitor each vendor's productivity.

However, care should be taken to put productivity metrics into context, because the time spent on a given release cannot be fully derived from the source code delivered. Configuration tasks related to a software package, for example, leave little trace in the code. Understanding the existing code also takes time, and that effort varies from one technology, architecture, or team to another. In addition, user requirements are rarely final and often keep changing throughout a project, stretching timelines. Moreover, service providers often have proprietary packages or components that software analysis solutions cannot access, so that work is not reflected in quantity-related outputs. Automated Function Points are therefore an objective, critical input that forms part of the overall productivity story, not the whole of it. This type of productivity information can be very useful when monitoring an outsourcer and, when combined with quality outputs and other indicators such as the hours spent, can provide insight into why a specific release took more man-hours per Function Point than others and help identify sources of productivity improvement.
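As an illustration, the sketch below computes the productivity measures described above from per-release data. The vendor names, release figures, and field names are hypothetical assumptions for this example, not the output schema of any particular SAM tool.

```python
# Hypothetical per-release data; field names and numbers are illustrative,
# not the output format of any specific SAM solution.
releases = [
    {"vendor": "Vendor A", "release": "R1", "kloc_added": 42.0,
     "fp_added": 180, "fp_modified": 95, "fp_deleted": 12, "man_hours": 3400},
    {"vendor": "Vendor B", "release": "R1", "kloc_added": 51.0,
     "fp_added": 150, "fp_modified": 60, "fp_deleted": 8, "man_hours": 4100},
]

for r in releases:
    # Count all touched Function Points as delivered functional work.
    fp_delivered = r["fp_added"] + r["fp_modified"] + r["fp_deleted"]
    kloc_per_hour = r["kloc_added"] / r["man_hours"]
    fp_per_hour = fp_delivered / r["man_hours"]
    print(f'{r["vendor"]} {r["release"]}: '
          f'{kloc_per_hour:.4f} KLOC/man-hour, {fp_per_hour:.3f} FP/man-hour')
```

As the caveats above suggest, such figures are inputs to a conversation rather than verdicts; a low FP-per-man-hour release may reflect configuration-heavy work, not low productivity.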

SLAs vs. Acceptance Criteria: It is important to determine when to use this type of data in SLAs and when to use it as acceptance criteria. The recommended best practice is to use SAM data as part of acceptance criteria (shown in Table 2) before accepting vendor deliverables for system testing or user acceptance testing (UAT). If a deliverable does not meet the predefined criteria, it should not be accepted; this avoids wasting the time of the testing teams or users. Data gathered from analyzing the application before it goes into production, on the other hand, can be used to check performance against the SLA.

Table 2 - Sample acceptance criteria for structural quality
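A minimal sketch of such an acceptance gate is shown below. The threshold values, metric names, and the zero-tolerance critical-rule check are assumptions for illustration; an actual gate would be driven by the criteria agreed in the contract.

```python
# Sketch of an acceptance gate run before a deliverable enters
# system testing or UAT. Thresholds and metric names are illustrative.
THRESHOLDS = {"reliability": 3.0, "efficiency": 2.8,
              "security": 3.2, "maintainability": 2.9}

def accept_deliverable(scores: dict, critical_rule_violations: int) -> bool:
    """Accept only if every index meets its floor and no
    zero-tolerance critical rules are violated."""
    if critical_rule_violations > 0:
        print(f"REJECT: {critical_rule_violations} critical-rule violation(s)")
        return False
    failed = [m for m, floor in THRESHOLDS.items() if scores[m] < floor]
    if failed:
        print("REJECT: below threshold on " + ", ".join(failed))
        return False
    print("ACCEPT: deliverable may proceed to system testing / UAT")
    return True

accept_deliverable({"reliability": 3.1, "efficiency": 2.9,
                    "security": 3.0, "maintainability": 3.1},
                   critical_rule_violations=0)
# -> REJECT: below threshold on security
```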

Setting Targets for SLAs: When setting structural quality targets in SLAs, it is recommended to collect baseline data first; as previously mentioned, this should ideally be based on the average over two to three prior releases. For greenfield projects there is no source code to analyze for a baseline, so it is recommended to use industry benchmark data and set targets matching the top quartile of scores for the specific technology. For example, the CAST benchmarking database holds data from more than two thousand applications collected from different industries across all technologies. For a more thorough discussion of this topic, see the white paper "CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs" [2].

5. Use documentation created during analysis to ease transition

Transitioning code to a vendor team is one of the most difficult parts of an outsourcing engagement, because documentation is typically out of date and the original team that wrote the application may not be available for knowledge transfer. Ask your vendor how they facilitate their teams' transition to and understanding of the system, identify and monitor structural hotspots, and perform impact analysis on system changes. Software analysis, especially a system-level analysis solution, can make this process less difficult and more efficient. As the code is analyzed, the analyzers reverse-engineer it and create a comprehensive blueprint of the entire application, producing documentation that is current and accurate. This information not only greatly reduces transition time, but also helps the vendor teams reduce code duplication and understand system dependencies so they can test thoroughly as new additions are made.

6. Evaluate vendor performance with objective measures

Quality performance evaluation is a sensitive subject, so it is important to have an agreed-upon, independent assessment of quality that is repeatable. SAM outputs should be an important part of vendor performance scorecards. Clients can evaluate the performance of vendors against SLAs and provide specific guidance with actionable lists (shown in Table 3).

Table 3 - Sample performance review process

7. Incorporate benchmarking into evaluation

Benchmarking of SAM metrics is very useful for identifying opportunities for improvement, and can be done among a group of applications within the organization or against similar-technology applications from industry peers.

Internal Benchmarking: Automated software analysis provides an objective and consistent measurement that can be used to benchmark different teams within the same vendor, or the performance of different vendors in a multi-sourcing scenario.

Benchmarking allows you to have a meaningful, fact-based dialogue with vendors about opportunities for improvement and to measure their progress.

External Benchmarking: In addition to internal benchmarking, companies can benchmark the structural quality of their applications against industry peers. For example, CAST publishes benchmarking information for applications across a wide range of industries and technologies through its Appmarq database. An example of this benchmarking data is shown in Figure 2.

Figure 2 - Sample benchmarking chart for a .NET application in the financial services industry (application health score by quality characteristic)
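To show what a peer comparison might look like in practice, the sketch below places an application's health score within a set of industry peer scores. The peer scores here are invented for illustration; a real comparison would draw on a benchmark repository such as Appmarq.

```python
# Sketch: position an application's health score among industry peers.
# Peer scores are invented; a real comparison would use benchmark data.
from bisect import bisect_left

peer_scores = sorted([2.1, 2.4, 2.6, 2.7, 2.9, 3.0, 3.1, 3.2, 3.4, 3.6])

def percentile(score: float, peers: list) -> float:
    """Percentage of peer applications scoring below the given score."""
    return 100.0 * bisect_left(peers, score) / len(peers)

our_score = 3.05
p = percentile(our_score, peer_scores)
print(f"Health score {our_score} beats {p:.0f}% of peers")  # -> beats 60%
```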

Leader Boards: In addition to formal benchmarking, companies can use SAM data to publish leader boards that highlight the applications with the highest structural quality or the most productive teams, providing visibility into the ongoing performance of different teams within the same vendor, or of different vendors in a multi-sourcing scenario. Leader boards can be very effective at motivating teams to improve their performance.

8. Strive for continuous improvement

In addition to ensuring the quality of vendor deliverables, SAM metrics can be used to improve quality on a continuous basis. Most applications have existed for several years before being outsourced, so they may contain poor-quality code, a lot of copy-pasted code, and several security vulnerabilities. Vendors are usually not responsible for all the issues they inherit, unless they have been hired specifically to remedy them. However, companies can use SAM to ensure not only that new code is low-risk and of higher quality, but also that the existing code base improves, by setting improvement targets that are revised on an annual basis (see Figure 3).

Figure 3 - Sample quality improvement control chart with new targets (a new expected cap and a new minimum, tracked over the months of the year)
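The sketch below illustrates how such annually revised improvement targets, like those in Figure 3, might be checked against measured release scores. The target schedule and scores are hypothetical.

```python
# Sketch: check measured release scores against annually revised
# improvement targets (cf. Figure 3). All numbers are hypothetical.
targets_by_year = {2023: 3.0, 2024: 3.2}   # minimum acceptable health score

measurements = [  # (year, release, measured health score)
    (2023, "R1", 3.05),
    (2023, "R2", 3.15),
    (2024, "R1", 3.10),
]

for year, release, score in measurements:
    floor = targets_by_year[year]
    status = "meets" if score >= floor else "misses"
    print(f"{year} {release}: {score:.2f} {status} target {floor:.2f}")
# 2024 R1 misses the revised 3.20 floor even though it beat the 2023 target.
```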

IV. Identify the ideal operating scenario

There are many ways to incorporate a SAM solution into your organization for monitoring vendor deliverables. Table 4 outlines several scenarios.

Table 4 - Different operating scenarios for a SAM solution

Figure 4 illustrates a typical integration of a SAM solution into the SDLC of a software development organization, as outlined in scenarios 1 and 2 of Table 4.

Figure 4 - A sample integration of a SAM solution into the SDLC and IT executive dashboards

V. Select a SAM solution that meets business needs

It is important to understand that there are two broad categories of solutions for measuring software structural quality, and that SAM solutions offer a variety of capabilities, ranging from developer-centric tools to enterprise-wide solutions. The first category measures the code quality of individual components; these tools are language-specific and narrowly focused. The second category measures application quality: in addition to analyzing the code at the component level, these solutions analyze how components interact with one another across multiple layers (UI, logic, and data) and multiple technologies. The exact same piece of code can be safe and of excellent quality, or highly dangerous, depending on its interaction with other components. Mission-critical applications must be analyzed in the context of the numerous interconnections among code components, databases, middleware, frameworks, and APIs; this yields a holistic analysis of the structural quality of an application. Figure 5 summarizes the different types of solutions and their uses.

In the context of measuring outsourcing vendors, it is recommended to use a comprehensive system-level analysis solution with the following key attributes:

- Proactive indication and scoring of risk
- Historical baselining and trending
- Consistent measures across teams working on diverse technologies
- Standards-based measures that can be benchmarked against industry peers
- Objective, unbiased KPIs

Figure 5 - High-level comparison of different types of software analysis solutions

VI. Conclusion

Increasing demand for complex IT projects and a constantly evolving technology landscape mean that outsourcing is no longer merely an option; it has become a requirement for any large organization. Yet many organizations, despite implementing best practices, struggle to achieve a mutually beneficial relationship with their outsourcing partners. Measuring vendors through automated analysis not only minimizes risk in applications, but also greatly increases the maturity of these outsourcing relationships by properly aligning measurement with overall business and IT objectives.

References

1. CISQ Specifications for Automated Quality Characteristic Measures. CISQ Technical Work Group, 2012. http://it-cisq.org/
2. CISQ Guidelines to Incorporate Software Productivity & Structural Quality Metrics in SLAs. CISQ, 2012. http://it-cisq.org/
3. Dr. Richard Mark Soley. How to Deliver Resilient, Secure, Efficient, and Easily Changed IT Systems in Line with CISQ Recommendations. 2012. http://www.omg.org/marketing/cisq_compliant_it_systemsv.4-3.pdf
4. The Importance of Application Quality and Its Difference from Code Quality. CAST Software, 2011. http:// /resources/document/whitepapers/the-importance-of-application-quality-and-its-difference-from-code-quality
5. Sample Acceptance Criteria with Structural Quality Metrics. http:///sample-sla

About CAST

CAST is a pioneer and world leader in Software Analysis and Measurement, with unique technology resulting from more than $100 million in R&D investment. CAST introduces fact-based transparency into application development and sourcing to transform it into a management discipline. More than 250 companies across all industry sectors and geographies rely on CAST to prevent business disruption while reducing hard IT costs. CAST is an integral part of software delivery and maintenance at the world's leading IT service providers, such as IBM and Capgemini. Founded in 1990, CAST is listed on NYSE-Euronext (Euronext: CAS) and serves IT-intensive enterprises worldwide through a network of offices in North America, Europe, and India. For more information, visit www.castsoftware.com.

Questions? Email us at contact@castsoftware.com

Europe: 3 rue Marcel Allégot, 92190 Meudon, France. Phone: +33 1 46 90 21 00
North America: 373 Park Avenue South, New York, NY 10016. Phone: +1 212-871-8330