BIO PRESENTATION T13 10/19/2006 1:30:00 PM TESTING SOA SOFTWARE: THE HEADLESS DILEMMA John Michelsen itko, Inc. International Conference on Software Testing Analysis and Review October 16-20, 2006 Anaheim, CA USA
John Michelsen John Michelsen is Chief Scientist and a co-founder of itko, Inc., an automated software testing company. John has more than 15 years of high-level enterprise development experience, as a chief architect of development teams and as an executive in designing, developing, and managing large-scale, object-oriented solutions in traditional and network architectures. He is the chief architect of itko's LISA automated testing product and a leading industry advocate for software quality. Before forming itko, Michelsen was Director of Development at Trilogy Inc. and VP of Development at AGENCY.COM. He has served as Chief Technical Architect at companies like Raima, Sabre, and Xerox while working as a consultant. Through work with clients like Cendant Financial, Microsoft, American Airlines, Union Pacific, and Nielsen Market Research, John has deployed solutions using technologies from the mainframe to the handheld device. Over the years, John and his teams have enjoyed tremendous delivery success. At Sabre, Michelsen served as the Chief Technical Architect of the first real-time Internet reservations application. He served as lead architect for the www.enrononline.com trading marketplace, which processed over a billion dollars of trading activity daily. John consulted for Microsoft to help it establish itself in the enterprise software market. These are just a few of the client solutions and relationships still in operation today -- a testament to his technology vision and customer commitment.
Testing SOA: The Headless Dilemma John Michelsen STARwest 2006 October 19, 2006 © 2006 itko, Inc. All rights reserved.
Agenda SOA development and QA trends SOA quality best practices - the Three C's Headless Testing Fundamentals Tricks & Treats (Case Studies) Q & A
Who are we? Entered the market to deliver an SOA testing solution, LISA Current version 3.5 (2006) Key value propositions: Increase Test Coverage Lower Cost per Test Faster Discovery & Resolution Everyone should own quality. A History of Quality Leadership: 1999 itko Incorporates (Privately Held) 2001 LISA under development, (4) Patents Pending 2003 LISA 1.0 Released to Market 2004 LISA 2.0 2006 LISA 3.0
Platform Evolution Causes Testing Evolution Mainframe: 3270/CICS data stream testing Client Based: user-interface oriented testing SOA/Composite Apps: component, heterogeneous, distributed testing (timeline: 1980 through 2015)
The Promise of SOA Reduce integration cost Through loosely-coupled forms of integration and industry standards Increase asset reuse Build new business workflows from existing Services to form composite applications Increase business agility Better control business process definition and management to meet customer needs Reduce business risk Governance, compliance, and risk reduction through increased business visibility and agility
before SOA Standard enterprise apps Browser UIs Client UIs Server (database) You had a standard delivery platform. And a standard infrastructure.
SOA promises a composite workflow Division 1 Transaction service Transaction provider Partner Reseller Partner Your extended enterprise Business rules Reseller Order management BI tools SOA was supposed to simplify extending workflows. Core app Outsourced supplier Data warehouse Division 2 Customer buyer Customer buyer
But heterogeneous technology is complex Transaction service Division 1 Your Company Channel Partner Standards and components are still evolving Financials Mainframe Legacy Data Legacy App SOAP objects Outsourced firm Business Rules BI tools Legacy App Your App Workflow Content Database Messaging service Web interface Composite apps have multiple owners ESB.NET Ordering Service Customer company It's Continuously Changing Data warehouse File System Division2 Web Services RMI objects CRM Web App
Business View of SOA Management Composite App Deployment and Monitoring Corporate Governance Policies Rules Composite App Development, Validation & Support Overall SOA Infrastructure Validation Service Level Validation Issues & Resolution Platform Infrastructure component providers Application component providers Internal component providers It's a Migration, Not a Big Bang Business Logic is mostly in the middle tier Multiple teams need to collaborate to ensure Business, Technical and Regulatory compliance
Best Practices: Three C's of SOA Testing Complete Testing (Breadth) Test every heterogeneous layer of the architecture Test at the UI, verify in the system of record Reuse functional tests for performance testing Collaborative Testing (Scale) Test early, before UIs are created Not just dev: business analysts and QA should verify processes Continuous Testing (Depth) Regression-test existing functionality Add testing of new services to existing testing workflows
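The "reuse functional tests for performance testing" practice can be sketched in a few lines. This is a minimal illustration, not LISA's implementation; the `price_for` business rule and its values are hypothetical.

```python
import time

def price_for(sku: str, qty: int) -> float:
    # Hypothetical middle-tier business rule: flat unit price of 199.0.
    return qty * 199.0

def functional_test() -> None:
    # One functional check of the middle-tier pricing logic.
    assert price_for("HAW3205", qty=3) == 3 * 199.0

# Reuse: the same functional test, looped under a timer, doubles as a
# crude load test -- no separate performance script to maintain.
runs = 1000
start = time.perf_counter()
for _ in range(runs):
    functional_test()
elapsed = time.perf_counter() - start
print(f"{runs} functional runs in {elapsed:.4f}s")
```

The design point is that one validated test asset serves two of the Three C's: it provides functional breadth, and under repetition it provides performance depth.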
Complete: SOA Testing across every layer Presentation layer: Web UI, Swing, ASP.NET Process and services layer: BPM, CORBA, J2EE, .NET, WS Integration layer: ESB, RMI, MQ, SOAP/XML Data / Applications: Legacy App, Database, File System, Custom API
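The "test at the UI, verify in the system of record" idea can be sketched as a headless test that drives a service directly and then checks the data layer. This is an illustrative sketch only: the `place_order` service is hypothetical, and an in-memory SQLite database stands in for the real system of record.

```python
import sqlite3

# Stand-in "system of record": an in-memory SQLite database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, sku TEXT, qty INTEGER)")

def place_order(sku: str, qty: int) -> int:
    """Hypothetical middle-tier service: records an order, returns its id."""
    cur = db.execute("INSERT INTO orders (sku, qty) VALUES (?, ?)", (sku, qty))
    db.commit()
    return cur.lastrowid

# Headless test: invoke the service layer directly, then verify the result
# in the system of record -- no browser, no screen scraping.
order_id = place_order("HAW3205", 3)
row = db.execute("SELECT sku, qty FROM orders WHERE id = ?",
                 (order_id,)).fetchone()
assert row == ("HAW3205", 3), "order did not reach the system of record"
```

Because the assertion runs against the data layer rather than rendered pixels, the same test stays valid no matter which of the many front ends later consumes the service.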
Collaborative: SOA Requires Agile Business Analysts outline a business process and test not-yet-implemented cases Developers unit test and jump-start QA tests QA expands dev tests and creates functional and load tests Support rapidly debugs issues and communicates appropriately Production tests their implementation and reports issues to support Complete Test Coverage Lower Cost per Test Faster Discovery & Resolution
When Agile Isn't Collaborative Testing Development is trying to move to an iterative process but the rest of the team is not in the loop! [Chart: Requirements, Design, Develop, Test, Deploy, Monitor phases; Dev runs ahead toward the release deadline while QA and the business lag near 0% complete]
Collaborative Testing: the Agile Team When everyone owns quality, not just dev, continuous testing happens With LISA, the business gets reliable value from software at a lower cost. [Chart: with the whole team testing, all roles reach 100% complete together by the deadline]
Continuous: SOA is never done Project-based development had a test phase: Database, Create components, Unit test, Integration test, Create UI, Acceptance Test, Deploy App SOA development is constantly evolving: new services appear, services change or are retired, and new UIs and databases are added throughout the life of the system
Continuous: SOA is consumed at runtime Registry MDM Transactions APP 1 APP 2 Integration Layer (ESB) APP 3 APP 4 Database Internal Service Legacy App Partner Service
Continuous: Unintended Consequences Registry MDM Transactions APP 1???? APP 2 Integration Layer (ESB)? APP 3? APP 4 Database Internal Service Legacy App Partner Service????
Continuous Testing is A Requirement Registry MDM Transactions APP 1 APP 2 Integration Layer (ESB) APP 3 APP 4 Database Internal Service Legacy App Partner Service
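One way to trap the unintended consequences pictured above is a contract check that runs continuously against each shared service. The sketch below is an assumption-laden illustration, not a LISA feature: the `ORDER_CONTRACT` fields and the renamed-field scenario are hypothetical.

```python
# A consumer-facing contract for an ordering service: field name -> type.
ORDER_CONTRACT = {"order_id": int, "sku": str, "status": str}

def check_contract(response: dict, contract: dict) -> list:
    """Return a list of violations; an empty list means the contract holds."""
    problems = []
    for field, ftype in contract.items():
        if field not in response:
            problems.append(f"missing field: {field}")
        elif not isinstance(response[field], ftype):
            problems.append(f"wrong type for {field}")
    return problems

# A change to the shared service silently renamed 'status' -> 'state';
# every downstream consumer of 'status' would break at runtime.
before = {"order_id": 42, "sku": "HAW3205", "status": "CONFIRMED"}
after = {"order_id": 42, "sku": "HAW3205", "state": "CONFIRMED"}

assert check_contract(before, ORDER_CONTRACT) == []
assert check_contract(after, ORDER_CONTRACT) == ["missing field: status"]
```

Run on a schedule against every registered service, checks like this turn the question marks of the previous slide into concrete, attributable failures before consumers feel them.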
So how do we defeat the Headless dilemma?
Why UI Testing will never defeat Headless Apps Client testing records the screen coordinates of a piece of data on the user interface. Test run 1: <function Inventory item> <text string= HAW3205(3) > <screen location= x135 y489 > <style button> <function buy Now > <text string= Buy Now > <screen location= x658 y540 > Test run 2: when the dynamic UI returns more resulting rows, the screen coordinates of objects change, causing the test to lose context and usefulness. <text string= HAW3205(3) > <screen location= x135 y489 > <style button> <function buy Now > <GO TO NEXT STEP> <text string= Buy Now > <screen location NOT x658 y540 > <FAIL: ITEM NOT IN LOCATION!> Dynamic test cases, by contrast, sample contextual properties of the data communicated to the browser and can validate the source of the data directly. If dynamic data changes the screen layout, the dynamic test can maintain the context of those properties and still validate the logic. On Test run 2, more rows of dynamic results have been returned, but the test can still determine that the Buy Now button is in the appropriate spot.
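The contrast on this slide can be made concrete with two assertion styles over the same pair of screens. This is a simplified sketch; the widget dictionaries and coordinates are invented to mirror the slide's example, and no real UI toolkit is involved.

```python
# Two renderings of the same screen: run 2 returns one more inventory
# row, which shifts the Buy Now button's coordinates.
run1 = [{"text": "HAW3205(3)", "x": 135, "y": 489},
        {"text": "Buy Now", "x": 658, "y": 540, "role": "button"}]
run2 = [{"text": "HAW3205(3)", "x": 135, "y": 489},
        {"text": "HAW3206(1)", "x": 135, "y": 510},
        {"text": "Buy Now", "x": 658, "y": 561, "role": "button"}]

def coordinate_assert(screen: list) -> bool:
    """Record/playback style: pass only if Buy Now sits at the recorded spot."""
    return any(w["text"] == "Buy Now" and (w["x"], w["y"]) == (658, 540)
               for w in screen)

def contextual_assert(screen: list) -> bool:
    """Dynamic style: locate Buy Now by role and label, wherever layout puts it."""
    return any(w.get("role") == "button" and w["text"] == "Buy Now"
               for w in screen)

assert coordinate_assert(run1) and not coordinate_assert(run2)  # brittle
assert contextual_assert(run1) and contextual_assert(run2)      # robust
```

The coordinate assertion fails as soon as the dynamic data reshapes the page; the contextual assertion keeps validating the logic because it binds to what the element is, not where it landed.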
Headless Testing Fundamentals Complete Testing Test every heterogeneous layer of the architecture Test at the UI, verify in the system of record Reuse functional tests for performance testing Collaborative Testing Test early, before UIs are created Not just dev: business analysts and QA should verify processes Continuous Testing Trap for Unintended Consequences Add testing of new services to existing testing workflows
Complete SOA Quality Platform All Technologies Standard SOA Components Web apps, SOAP, Databases, J2EE, ASP.NET, JMS/MQ, RMI, Java, Files, etc. Custom Extensions LISA Extensibility Framework LISA Test Harness API Quality Assurance Processes Create and Execute Tests Workflow, Events, State Management, Nodes, Assertions, Filters, Reporting, Integration, Metrics Requirements Design & Modeling Developer Testing Construction Deployment Production Monitoring Development Application Lifecycle Management (ALM) Process Collaboration tools, SCM, Requirements Management, Issue Tracking, etc.
Complete: Why Middle-Tier Matters Database/mainframe = 1 change Middle-tier level = 100x UI level = 1000x A few changes at the database level multiply into issues downstream For every middle-tier change, there are hundreds of possible front-end impacts at the UI layer Limitless front-end options Any client, multiple customers Mix of technologies: XML/RPC, Swing, AWT, JS, DOM, now AJAX Services consume business logic
Collaboration Across Teams and Lifecycles unit tests, functional, regression, load/performance, monitoring Team 1 (internal) Team 2 (other dept.) Team 3 (partner) Overall SOA Quality
Continuous: How can you test the workflow? Transaction service Division 1 Your Company Channel Partner Financials Mainframe Legacy Data Legacy App SOAP objects Outsourced firm Messaging service Business Rules BI tools Legacy App Your App Workflow Content Database Web interface ESB.NET Ordering Service Customer company Division2 Data warehouse File System Web Services RMI objects CRM Web App
Continuous: Gain Transparency Registry MDM Transactions APP 1 APP 2 Integration Layer (ESB) APP 3 APP 4 Database Internal Service Legacy App Partner Service
Real Tricks: Case Studies (Industry / Pain Points / Benefits with LISA) Leader in Financial Services: Pain - lack of a testing tool for the middle tier (EJB, Tuxedo based services); tested business logic of the web frontend by screen-scraping another web application; different tools for unit, functional, integration, and load testing. Benefit - a seamless, holistic platform to test all the tiers of a composite application; a point-and-click, codeless authoring environment to define business validation; a single platform for complete testing. Fortune 100 Software Company: Pain - lack of a single test automation harness to manage over 20 projects; need for a single collaborative testing tool for a geographically distributed team (US, China). Benefit - extends test harnesses to support custom apps in a heterogeneous architecture; shares testing assets to promote reuse. Leader in e-Commerce Software: Pain - time to discovery & resolution very high for customer support; no tool to validate success of migration; needed accountability for support issues. Benefit - made the support process very responsive and reduced its cost by 50%; assists in migration by performing filesystem-based tests along with core functional tests.
The Three Dimensions of Quality Maturity Complete (Breadth of testing): from a single component to all components Collaborative (Scope of testing): from Dev/QA silos to everyone tests, every phase Continuous (Depth of testing): from phased tests to constant testing
What are your quality goals? Complete: What technologies are you currently testing? Are you directly testing them? To what depth? What other technologies need coverage? Are we testing integration? Collaborative: Who tests? At what project phase? Are testing processes shared? Are there resource bottlenecks? Continuous: Do we monitor performance and workflow-level tests? How automated are reporting and resolution? How do we simulate what-if scenarios? [Chart: plot current vs. target maturity along Complete (all components), Collaborative (everyone tests, every phase), and Continuous (constant testing)]
Summary Must test heterogeneous, composite SOA applications Test every layer of the complete architecture (UI, Middle Tier, and Back End), not just the client Establish a collaborative platform for development, functional, load, and production testing Continuous testing of interdependent, evolving SOA systems before and after deployment Extensibility to support and instrument the most complex environments Complete composite SOA application testing solution
About itko LISA LISA is Complete - Test heterogeneous SOA LISA can test every tier of an application, regardless of location LISA can test every service technology with one tool, one test LISA's framework approach creates bridges to legacy services - that means you can get there from here LISA is Collaborative - Everyone owns quality LISA shines at testing all components before the UI even exists, as well as integrating with the team's ALM lifecycle and process Unit, regression, system, load, and monitoring tests in one tool Leverages diverse skill sets in Dev, QA/QE, and BA roles LISA is Continuous - Enables Agile SOA Tests constantly evolving, interdependent systems during and after deployment Traps for the what-if, unintended, system-wide consequences of making changes or corrections to components.
Thank you for your interest! Need some headroom? Questions? info@itko.com