New Trends In Application Delivery and Service Virtualization




New Trends in Application Delivery and Service Virtualization
By: Ferhan Kilical, Ph.D., Senior Product Marketing Manager, HP. May 29, 2013, South Africa

A new world is emerging: software as innovation, a world without borders, a new class of user. Build, test, and deliver today's user-centric applications faster than ever before:
- New business and IT delivery models
- Faster innovation cycles
- Increased mobility and accessibility
- Socially connected consumers
- Vast, rich data sets

We are working with a radically different kind of user: users born after 1980. Computing has evolved from mainframe to client/server to web to devices, shifting from system-centric to user-centric.

And driving key trends: modern systems of engagement for this new type of user.
- Composite: by 2016, 50% of integration projects will include on-premises applications and cloud services
- Mobile: by 2015, mobile application development projects will outnumber native PC projects 4 to 1
- Agile: by 2016, 40% of organizations will have joint App Dev and Ops initiatives for continuous delivery and simplified release management
Source: Gartner, 2012

Keys to enable velocity: eliminating the sources of latency in a mobile, agile, composite IT landscape.
- Visibility and collaboration: drive real-time decisions, optimize work streams, respond immediately to change
- Virtualization: build an always-on lab, provide constant access, represent external factors
- Automation: lightning-fast execution, automated regression, configuration as code

But the way we build is changing: deliver what the business wants, when it wants it, across requirements, develop, test, and deploy.

Yesterday (4-month cycles):
- Requirements: exhaustive definition, abstract, contractual
- Develop: manual configurations and stubs, driven top-down, PC-based applications
- Test: test only, code as a black box; lead time for environments; treated as the last mile
- Deploy: manual deployment; wastage of assets (performance scripts, known bugs, etc.)

Today/tomorrow (1-week cycles):
- Requirements: just enough, experiential, story-based and interpretive
- Develop: composite and virtualized, automatic connections, multi-channel apps
- Test: insight into code changes, auto deploys for dev/test, continual testing
- Deploy: automated deployment, asset reuse between Apps and Ops

Building business software has changed, across technology, delivery, reach, infrastructure, and rate of change: composite applications, Web 2.0, mobile, tablets, app stores, social media, and hybrid and cloud infrastructure.

But we're not keeping up: the costs of quality keep increasing.
- 30%: typical cost of testing in a development project
- 56%: share of defects introduced at the requirements phase
- 82%: amount of effort required to fix poor requirements
- #1 cause of IT waste: poor defect management and rework
- 100X: cost to repair a defect in production vs. at the requirements phase

HP's approach to ALM: seamlessly deliver systems of engagement. Complete lifecycle coverage spans Application Portfolio Management, Project and Portfolio Management, Service Management, Systinet application governance, and the Executive Scorecard. Core lifecycle excellence covers Agile Management, requirements, quality and performance (Quality Center, Performance Center), security (Fortify), lab management, IDE/SCCM/build integration, and DevOps (Operations Orchestration and Automation, Application Performance Management).

There is a fundamental tension in modern application delivery.
- Goal: deliver faster. Agile has momentum, dev drives the agenda, and speed is the answer.
- Driven by: customer pull, business demands, competitive pressure, and increasing expectations for new features, performance, and security.
- Result: a quality bottleneck? More changes and more moving parts, with quality included too late; we need to test earlier and align with dev.

Set the stage: what does a modern app look like? A modern application is a hybrid composite app: a user interface over APIs and services, a database, and the network. It integrates end to end with cloud apps and services, third-party systems, and legacy/internal systems, and it engages customers across multiple channels: traditional client, web, and mobile. Quality must cover time-to-market, functionality and requirements, end-user experience, performance, reliability, and security.

The New Style of IT is transforming the future of military applications!

Testing is crucial!

We have to support the troops! Testing and service virtualization are important in applications supporting the war fighter. These applications provide end-to-end capability to manage and monitor personnel and equipment through the mobilization process, with logistics-related information ensuring the correct personnel, equipment, supplies, and support are available from a single network and workstation, with single sign-on (SSO) to all applications and across all functional areas. This family of applications delivers global, near-real-time, accurate, integrated information through a robust and reliable communications infrastructure.

Globally distributed teams, globally distributed projects: apps, ops, and infrastructure teams spread across weather, imagery, and training-and-simulation domains.

Very sophisticated project schedules: dependencies, composite applications, development, test, and architecture across firewalls. The objective is a successful launch that proves value to the soldier. Tasks are defined; a resource-loaded WBS and schedule are in place; a detailed IMS and cost control are established; and deliverables (equipment, software, documentation, ILS) are identified. There are a lot of dependencies among the composite apps, and SLAs must be met!

SLAs and architecture drive performance needs in composite apps with GIS services. The critical path is defined.

A successful launch requires performance testing from the beginning. Challenges:
- The composite application had many services needing integration into development and test, with many interdependencies, all belonging to different contractors. Some came from Forge.mil, some were Ozone widgets, and some still needed to be developed. Schedules were not always in sync.
- A major need for hardware optimization supporting classified and unclassified environments across 5 different enclaves.
- A set of security-related policies and appliances we needed to learn.
- Lack of communication between teams and contractors.
- Keeping track of deliverables, tests, and test results.
- Frequent human errors setting up test and dev environments, leading to schedule slips.

Performance test results also needed to be shared.
- Test data was a major issue: data had to be shared by multiple teams (dev, test, and infrastructure) and by multiple contractors.
- Due to the nature of the services, those that were GIS- and weather-related, it was a major challenge to get the right data into the test environment.
- Some services related to the back end, and we did not have enough information on them.
- The schedule was based on continuous delivery, so testing had to be done continuously and had to be automated.
- SLAs needed to be met due to the nature of the applications, which were composite apps within the Family of Systems the military uses.

In one iteration: smoke and regression tests.
Total number of test cases:
- Smoke: 87 per browser
- Regression: 259 per browser
- Patch: approx. 15 per browser
Total number of releases: 18 so far, with additional releases in some rounds.
Days and resources to test:
- Smoke: 3.5 hours per browser (uninterrupted), usually 2 people
- Regression: 16 hours per browser (uninterrupted), usually 2 people
- Patch: 4 hours per browser (uninterrupted), usually 1 person
Number of cycles before ATRR: Herndon, Government suite A, Government suite B.
Hours of smoke, patch, and regression: Herndon, 4 days per release; Government suite A, 3 days per release.
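As a rough sanity check on those figures, the uninterrupted person-hours for one full pass on a single browser can be tallied; this is a back-of-the-envelope sketch using only the per-test numbers from the slide above:

```python
# Uninterrupted test effort per browser, from the slide's figures:
# (hours per browser, people involved) for each test type.
effort = {
    "smoke": (3.5, 2),
    "regression": (16, 2),
    "patch": (4, 1),
}

# Person-hours for one pass of each test type on one browser.
person_hours = {name: hours * people for name, (hours, people) in effort.items()}

total = sum(person_hours.values())
print(person_hours)  # {'smoke': 7.0, 'regression': 32, 'patch': 4}
print(total)         # 43.0 person-hours per browser per full pass
```

Multiplied by the number of browsers and 18 releases, this makes clear why the slide emphasizes automation.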

Performance test scripts show actual response times and monitor the servers involved.

LoadRunner Controller monitors and SiteScope.
- SiteScope monitors: CPU utilization on the Portal, SIDE, SAFE, and WMS/WFS servers, and memory on the Portal, SIDE, and WMS/WFS servers.
- LoadRunner monitors: hits per second, throughput, transaction response time, etc., plus Oracle, web servers, and back-end data verification.
And we did not have the app ready!
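These counters are collected by the HP tools themselves; purely as an illustration of what the LoadRunner-side metrics mean, the following sketch derives hits per second and average transaction response time from a list of (timestamp, duration) samples. The sample data is invented:

```python
# Each sample: (request start time in seconds, response time in seconds).
# Invented data standing in for what a load tool records during a run.
samples = [
    (0.0, 0.21), (0.4, 0.35), (1.1, 0.18),
    (1.5, 0.42), (2.2, 0.30), (2.9, 0.25),
]

# Hits per second: total requests divided by the span of the run.
span = max(t for t, _ in samples) - min(t for t, _ in samples)
hits_per_second = len(samples) / span

# Average transaction response time across all samples.
avg_response = sum(d for _, d in samples) / len(samples)

print(f"{hits_per_second:.2f} hits/s, {avg_response:.3f} s avg response")
```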

The case of end-to-end performance testing. Challenges:
- Composite applications
- Incomplete automated business processes
- Non-representative performance scenarios or inaccurate load modeling
- Restricted capacity planning or impact analysis
- Cost
- Dependencies, which may result in delays
- Risk-based performance testing with a potential negative impact on quality

Map service interface description. The application has three mapping interfaces: the map server, the reporting detail request from the Sub, and the reporting mapping call. In the first, the Sub application issues a map server request and receives a map server response as an HTTP request/response pair.

Reporting detail request from the Sub. The user may drill down on certain items in the application's layers within the mapping interface to get additional detailed information: the user selects a point in the layer, the Sub sends a GetFeatureInfo request through the ESB to the application, and the FeatureInfo response carries information about the point, which is displayed in the map interface.
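The slides do not show the wire format, but GetFeatureInfo is a standard OGC WMS operation, so a drill-down request like the one above can be sketched as a plain HTTP GET. The server URL and layer name here are hypothetical:

```python
from urllib.parse import urlencode

def get_feature_info_url(base_url, layer, x, y, bbox,
                         width=800, height=600, srs="EPSG:4326"):
    """Build an OGC WMS 1.1.1 GetFeatureInfo URL for a clicked pixel (x, y)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetFeatureInfo",
        "LAYERS": layer,
        "QUERY_LAYERS": layer,      # the layer(s) to drill into
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width,
        "HEIGHT": height,
        "X": x,                     # pixel the user selected in the map
        "Y": y,
        "INFO_FORMAT": "text/plain",
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and layer name, for illustration only.
url = get_feature_info_url(
    "https://maps.example.mil/wms", "units", 400, 300,
    bbox=(-77.5, 38.5, -76.5, 39.5))
print(url)
```

A virtual service standing in for the map server only has to answer requests of this shape with recorded FeatureInfo payloads.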

End-to-end performance testing: the general framework. Load generators drive the application under test while virtual services (Virtual Service 1, 2, and 3) stand in for the real services (web services, mainframe, MQ, Tibco). SV servers host the virtual services and their management, switching each service between Learn, Simulate, and Standby modes, with simulation modeling of each virtual service's performance and data characteristics.
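HP SV's modes are configured in the product itself; purely as a conceptual sketch, a virtual service's Learn/Simulate/Standby lifecycle can be modeled like this (the class and its API are invented for illustration):

```python
class VirtualService:
    """Toy model of a virtual service's Learn/Simulate/Standby modes."""

    def __init__(self, real_service):
        self.real_service = real_service   # callable: request -> response
        self.recorded = {}                 # learned request -> response pairs
        self.mode = "standby"

    def handle(self, request):
        if self.mode == "standby":
            # Pass traffic through to the real service; record nothing.
            return self.real_service(request)
        if self.mode == "learn":
            # Proxy to the real service and record the exchange.
            response = self.real_service(request)
            self.recorded[request] = response
            return response
        if self.mode == "simulate":
            # Answer from recorded data; the real service is not touched.
            return self.recorded.get(request, "<default response>")
        raise ValueError(f"unknown mode: {self.mode}")

# Example: learn one exchange, then simulate without the backend.
svc = VirtualService(real_service=lambda req: f"real answer to {req}")
svc.mode = "learn"
svc.handle("getWeather")
svc.mode = "simulate"
print(svc.handle("getWeather"))   # replayed from recorded data
```

Once a service is in simulate mode, load generators can hammer the application under test without the real mainframe, MQ, or third-party endpoints being available.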

Why the issues?
- Software components not fully developed
- Components/services with limited or gated access
- Dependency on third-party costs
- Data too difficult to source
- Security and compliance restricting access
The application under test sits on existing infrastructure: a composite application reached from web browsers and mobile apps, with application services speaking REST, SOAP, and JMS to web services and legacy applications, LDAP for single sign-on, MQ to the mainframe, JDBC to the existing database, and connections to third-party services.

Service virtualization is key infrastructure! Keep development and testing moving forward with virtualized services:
- Virtualize for always-available services
- Virtualize data scenarios
- Virtualize performance scenarios (simulate response-time variance)
- Make them widely available for dev and test
- Share services to lower infrastructure costs, including pay-per-transaction charges
Simulation services with data, performance, and configuration models stand in for under-construction and hard-to-reach components (third-party REST services, LDAP single sign-on, the MQ mainframe, and SOAP/JMS web services and legacy applications), while the application under test keeps using its existing database and infrastructure.
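One of the bullets above, simulating response-time variance, can be illustrated in a few lines. The distribution and its parameters are invented for the sketch; HP SV derives its performance model from recorded traffic:

```python
import random
import time

def simulated_call(response, mean_s=0.25, stddev_s=0.08):
    """Return a canned response after a latency drawn from a normal
    distribution, clamped at zero, mimicking a virtualized service's
    performance model."""
    delay = max(0.0, random.gauss(mean_s, stddev_s))
    time.sleep(delay)
    return response

random.seed(42)  # deterministic for the example
print(simulated_call("<flightStatus>ON_TIME</flightStatus>"))
```

Because the latency is modeled rather than real, the same stub can replay slow-backend and fast-backend scenarios on demand during a performance test.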

Implicit integration with HP products: select the right performance and data models. Virtual services can run on dedicated servers for specific performance tests.

Embedded virtual-service monitoring framework, with virtual-service-specific metrics and graphs.

HP SV and Performance Center support continuous testing: define the AUT topology and schedule the nightly test set; then each night, run the dev build, deploy the AUT topology, perform build verification testing, run the test set, and generate the report.
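In HP's tooling that loop is configured in Performance Center itself; as a generic sketch of the same nightly sequence, the steps can be chained with fail-fast semantics (the step bodies here are placeholders):

```python
def nightly_cycle(steps):
    """Run the continuous-testing steps in order, stopping on first failure."""
    report = []
    for name, step in steps:
        ok = step()
        report.append((name, "PASS" if ok else "FAIL"))
        if not ok:
            break  # later steps depend on earlier ones succeeding
    return report

# Placeholder steps mirroring the slide's sequence; real implementations
# would call the build system, deployment tooling, and test runner.
steps = [
    ("run nightly dev build",      lambda: True),
    ("deploy AUT topology",        lambda: True),
    ("build verification testing", lambda: True),
    ("run test set",               lambda: True),
]

for name, status in nightly_cycle(steps):
    print(f"{name}: {status}")
```

The fail-fast ordering matters: there is no point running the test set against an AUT topology that failed to deploy.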

Contractor Integration Test results. The purpose of Contractor Integration Testing (CIT) is to integrate and test all system components prior to official delivery to the government.
- Application documentation has been completed and is up to date.
- Smoke, patch, and regression tests completed successfully.
- All test results were delivered into government CM, and all defects are documented in the CM tool.
- The final system test report was submitted to the PMO.
- The installation and build guide, with all updates, was completed and delivered.
Results from the test event indicated that the software is mature enough for SAT.

Requirements validation and regression testing results. The purpose of requirements validation is to validate the requirements targeted for the release by executing a sample of the test cases for each capability; the purpose of the regression testing effort is to identify any issues introduced by baseline changes.
Release requirements testing:
- Completed 98.13% of testing of all testable requirements
- Executed 70.63% of testing against IE 6.0 and 27.50% against IE 7.0
- The remaining 1.86% could not be functionally tested
Regression testing:
- Completed 100% of planned regression testing
- Executed 74.05% of testing against IE 6.0 and 25.95% against IE 7.0

Apps are fielded. Service Virtualization replaced the Testing Center's previous manually built responder framework, eliminated the problems of shared resources, and significantly reduced testing time in the environment across different teams and multiple contractors. Performance testing can now be conducted against a readily configured virtual test environment that is available 24/7, without conflicting with other teams' test systems and data. This gives the company significant cost savings and more consistent test results. Apps that support the war fighter are now successfully launched and fielded at major US bases using the HP test suite, which includes HP Quality Center, Performance Center, Service Test, and Service Virtualization.

Why? Market trends. According to voke Research, 51% of organizations identified regular and frequent delays in testing due to application dependencies. Additionally, 67% of organizations reported having access to 50% or less of the systems required to do their testing:
- 96% have to wait to access the systems they need
- 58% need to schedule their access whenever it is available, sharing it with other groups
- 38% still incur delays of 2 to 4 days even when access is scheduled
Nearly all QA organizations are unable to provide 100% test coverage, in both functional and performance testing, due to dependencies on application interfaces or third-party systems and apps.

Service Virtualization: friction-free dev/test and ops. Virtualize. Remove constraints. Move forward. Deliver high quality, faster.
- Virtualize services straight from their design and data
- Record once, use anywhere, by dev and test
- Remove dependency on costly cloud services
- Mitigate production, data, and privacy concerns
- Integrated with ALM/QC: virtual service management and a virtual lab with virtual services
"We can test earlier without the need for end systems, we speed up our release cycles, and we reduce blocking issues, eliminating third-party dependencies." (European telecommunications company)

Service Virtualization 3.0: eliminate impediments for friction-free development and testing. Virtualize for velocity of app delivery.
- Faster app time-to-market: virtualize often-constrained enterprise business applications with new SAP protocol support
- Location-independent ease of use: new web-based virtual service administration crosses distributed-team boundaries for rapid implementation
- Reduced dev/test cycle time: rapid publishing, access, and provisioning of virtual services with new integration to HP ALM and HP QC
"We can test earlier without the need for end systems, we speed up our release cycles, and we reduce blocking issues, eliminating third-party dependencies." (European telecommunications company)

Want to learn more? Visit www.hp.com/go/sv, engage in our ALM community, attend a TechTalk webcast, tweet us @HPsoftwareALM, or contact Ferhan Kilical at Ferhan.kilical@hp.com.

Q&A Copyright 2012 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice.

Thank you