Performance Testing. Agenda: What is performance testing? Why is performance testing necessary? Performance Testing Methodology. EPM Performance Testing.

What is Performance Testing? The primary goal of performance testing is to help ensure that the technology and processes that comprise the system/application will support day-to-day and peak business functions. Performance testing is NOT quality or functional testing. We don't care about the math!

Why Performance Test? The risk and consequences of poor performance keep increasing: web applications are often business critical, stakeholder expectations continue to rise, and poor performance has a negative effect on the number of visitors, revenue, customer satisfaction, and brand. Performance testing answers questions such as: How will the system perform under normal, realistic production conditions? Under extreme production conditions (e.g., holiday traffic, monthly close)? What is the maximum number of users that can be supported? How do we set performance expectations for end users? How is performance affected by the concurrent execution of various business processes or different user roles?

Why Performance Test? (continued) Web applications are becoming more complex. What performance bottlenecks can we identify and correct prior to system rollout? How will the system hardware and infrastructure perform under load? How scalable is the system? What is the best-case performance of the system (baseline testing)? What performance issues arise after prolonged continuous use (duration testing)? At what point does the system become unacceptable to end users (threshold testing)?

Performance Test Tool Architecture (NeoLoad architecture diagram).

Performance Testing Methodology

INIT 1.0 Performance Test Initiation
- INIT 1.1 Assignment of Resources
- INIT 1.2 Performance Test Charter Development
  - INIT 1.2.1 Set Goals & Objectives
  - INIT 1.2.2 Define Deliverables
  - INIT 1.2.3 Define Roles & Responsibilities
  - INIT 1.2.4 Set Scope & Timelines
- INIT 1.3 Project Kickoff Meeting

PLAN 2.0 Performance Test Planning
- PLAN 2.1 Gather System Performance Requirements
  - PLAN 2.1.1 Acquire System Training
  - PLAN 2.1.2 Develop Automation Process Flows
  - PLAN 2.1.3 Map Test Data Requirements
  - PLAN 2.1.4 Set Automation Priority & Schedule
- PLAN 2.2 Develop Performance Test Strategy
  - PLAN 2.2.1 Document Automation Script Coverage
  - PLAN 2.2.2 Finalize Automation & Test Schedules
  - PLAN 2.2.3 Define Defect Management Process
  - PLAN 2.2.4 Define Metrics and Reporting Structure

PREP 3.0 Performance Test Preparation
- PREP 3.1 Develop Performance Test Environment
  - PREP 3.1.1 Install and Configure Test Machines
  - PREP 3.1.2 Install and Configure System under Test
  - PREP 3.1.3 Install & Configure Automated Test Tools
  - PREP 3.1.4 Install & Configure Test Monitors
- PREP 3.2 Develop Automated Performance Test Scripts & Scenarios
  - PREP 3.2.1 Record Automated Support Scripts
  - PREP 3.2.2 Record Performance Test Scripts
  - PREP 3.2.3 Debug and Parameterize Scripts
  - PREP 3.2.4 Create Automated Test Scenarios

EXEC 4.0 Performance Test Execution (iterative performance test cycles)
- EXEC 4.1 Communicate Execution Schedule & Roles
- EXEC 4.2 Execute Performance Test Scenario
- EXEC 4.3 Issue Tracking & Resolution
- EXEC 4.4 Test Analysis & Reporting
- EXEC 4.5 Script Maintenance
- EXEC 4.6 System Tuning & Optimization
- EXEC 4.7 Test Wrap-up & Documentation

Methodology - Initiation. What kinds of resources are needed? A performance manager and architect; subject matter experts and developers; system, network, and database administrators. Development of the testing charter: set the goals and objectives, define deliverables, define roles and responsibilities, and set scope and timelines.

Common Resources - Major Roles & Responsibilities
Functional and Performance Test Leads: develop and implement system test strategies; manage test resources, timeframes, and deliverables; coordinate delivery of all test results; facilitate communication with all testing and QA stakeholders.
Automated Functional Test Specialists: execute the enterprise functional test strategy; develop and maintain automated functional test scripts; work with business SMEs to identify common transactions and metrics; log and track functional defects; work with the development team to resolve functional defects.
Automated Performance Test Specialists: execute enterprise performance test strategies; develop and maintain automated performance test scripts; work with technical SMEs to identify common transactions and metrics; log and track performance defects; work with the development team to resolve performance defects.
Test Methodology Lead: ensure the implementation of enterprise best-practice QA standards across all system development efforts; provide samples and templates of all QA-related documents; provide executive dashboard views of QA progress across all system development efforts; act as change control coordinator and manage the release strategy for development efforts.

Common Resources - Supporting Roles & Responsibilities
Manual Functional Testers: manual execution of functional tests; functional test case development and maintenance; log and track functional defects; work with the development team to resolve functional defects.
Subject Matter Experts (business and technical): provide expertise on business and technical requirements, system usage patterns, common transactions, and service level agreements; provide walkthroughs of the applications under test; verify and sign off on all test scenarios prior to execution.
Technical Analysts (developers, network support, server admins, DBAs, etc.): work with the centralized QA team to analyze and troubleshoot functional and performance defects; provide support during major performance tests.
Test Management Specialist: maintain the test management database (containing all test results, scripts, data, and analysis) across all enterprise testing efforts; maintain test case and test script standards; track and report on defect status and resolution progress across all projects.

Methodology - Types of Performance Tests
Smoke: an initial round of testing that uncovers minor system tuning issues and finalizes the configuration of the test environment. This cycle allows for a clean baseline test, followed by a more efficient round of testing scenarios.
Baseline: determines the base-level performance of the application running a single user per process on an isolated test environment. These transaction times become the standard against which all later tests are compared in order to determine system performance under load.
Load: determines the average amount of time to execute a business transaction under average load conditions. This is your customers' average experience of a business transaction (run a report, execute a search, etc.).
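As an illustration of the baseline idea above, here is a minimal sketch in plain Python (not from the presentation, which used NeoLoad); the report URL and the transaction itself are placeholders, and the single-user timings are simply summarized for later comparison against load-test results.

```python
import statistics
import time
import urllib.request

# Hypothetical business transaction: fetching a report URL. Replace with a real
# transaction against the system under test.
REPORT_URL = "http://test-env.example.com/reports/trial-balance"  # placeholder

def run_transaction(url=REPORT_URL):
    """Execute one business transaction and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=60) as response:
        response.read()  # consume the body so the full transfer is timed
    return time.perf_counter() - start

def baseline(iterations=20):
    """Single-user baseline: repeat the transaction and summarize the timings."""
    timings = sorted(run_transaction() for _ in range(iterations))
    return {
        "min": timings[0],
        "median": statistics.median(timings),
        "p90": timings[max(0, int(0.9 * len(timings)) - 1)],
        "max": timings[-1],
    }

if __name__ == "__main__":
    print(baseline())
```

The summary statistics captured here (median, 90th percentile) are the kind of per-transaction standard the baseline test is meant to establish.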

Methodology - Planning. Gather system performance requirements: acquire system training, develop automation process flows, define and map test data requirements, and define the schedule and priority. Develop the testing strategy: define the defect management process, define metrics and reporting requirements, define the business processes to be scripted, define the testing types and scenarios to be executed, and finalize script and test execution schedules.

Methodology - Types of Performance Tests (continued)
Stress: determines the average amount of time to execute a business transaction under peak activity. This test should mimic peak production load levels as well as peak activity times.
Threshold: determines at what load level response time becomes unacceptable and pinpoints system component breaking points. Load levels are continually ramped until these thresholds are met, and transaction timings are continuously measured as the load increases.
Duration: determines the ability of the system to perform under longer periods of normal load conditions. Here we are checking for common system issues that may not be evident in shorter, spike-type testing such as threshold testing (e.g., memory leaks).
Failover: for system nodes that employ redundancy and load balancing, failover testing analyzes the theoretical failover procedure, then tests and measures the overall failover process and its effects on the end user.
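A hedged sketch of how a threshold-style ramp might be driven, assuming a run_transaction() helper like the one in the baseline sketch above; the step size, pacing, and acceptable response time are illustrative values, not figures from the presentation.

```python
import statistics
import threading
import time

def ramp_users(run_transaction, max_response_time=5.0,
               step_users=5, step_duration=60, max_users=200):
    """Threshold test sketch: add virtual users in steps until the average
    response time for a step exceeds the acceptable limit."""
    results = []
    for active_users in range(step_users, max_users + 1, step_users):
        timings, lock = [], threading.Lock()
        stop_at = time.time() + step_duration

        def user_loop():
            # Each virtual user repeats the transaction until the step ends.
            while time.time() < stop_at:
                elapsed = run_transaction()
                with lock:
                    timings.append(elapsed)
                time.sleep(1.0)  # simple pacing between iterations

        threads = [threading.Thread(target=user_loop) for _ in range(active_users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        avg = statistics.mean(timings) if timings else float("inf")
        results.append((active_users, avg))
        if avg > max_response_time:
            break  # breaking point reached for this load level
    return results
```

The returned (users, average response time) pairs are exactly the transaction timings per load level that threshold testing is meant to produce.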

Methodology - Preparation: Prepare the Performance Testing Environment. Install and prepare the performance testing environment: controllers, load generators, and monitors. Is the system/application to be tested ready? Is development complete? Has the code been frozen? Is functional and user acceptance testing complete? The system needs to be free of major functional defects before you can performance test.
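As a sketch only, the controller/generator/monitor layout and the readiness questions above could be captured in a simple environment inventory before installation begins; every host name and flag below is a placeholder, not part of the presentation.

```python
# Hypothetical performance test environment inventory (all host names are placeholders).
TEST_ENVIRONMENT = {
    "controller": {"host": "perf-ctrl01", "role": "schedules scenarios, collects results"},
    "load_generators": [
        {"host": "perf-gen01", "max_virtual_users": 500},
        {"host": "perf-gen02", "max_virtual_users": 500},
    ],
    "monitors": [
        {"host": "app-tier01", "metrics": ["cpu", "memory", "gc"]},
        {"host": "db-tier01", "metrics": ["cpu", "memory", "io", "locks"]},
    ],
    "system_under_test": {
        # Readiness checks called out in the slide above.
        "development_complete": True,
        "code_frozen": True,
        "functional_and_uat_complete": True,
    },
}

def ready_to_test(env=TEST_ENVIRONMENT):
    """Return True only if every readiness check on the system under test passes."""
    return all(env["system_under_test"].values())
```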

Methodology - Preparation: Develop Automated Scripts and Scenarios. Record and modify automated business processes: debug the script (session IDs, cookies), parameterize it (users, datasets), and randomize user paths. Create and set up testing scenarios: What is the goal of the scenario (average, peak, threshold, duration, failover)? How many users, and what concurrency (users active in the system)? What is the script breakdown (how many users running each script)? What is the ramp-up? What is the user behavior (think times and pacing, rendezvous points)?
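A minimal sketch of a parameterized virtual-user path with per-user session handling, randomized data, and think times, using only the Python standard library; the URLs, credentials, and search terms are invented for illustration and are not from the presentation.

```python
import http.cookiejar
import random
import time
import urllib.parse
import urllib.request

# Hypothetical test data pool; in practice this is mapped from the test data requirements.
USERS = [("user01", "secret01"), ("user02", "secret02")]
SEARCH_TERMS = ["cash", "payables", "revenue", "intercompany"]
BASE_URL = "http://test-env.example.com"  # placeholder system under test

def think(low=2.0, high=8.0):
    """Randomized think time between user actions."""
    time.sleep(random.uniform(low, high))

def virtual_user():
    """One parameterized user path: log in, run randomized searches, log off."""
    # Per-user cookie jar so each virtual user keeps its own session id/cookies.
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(http.cookiejar.CookieJar()))
    username, password = random.choice(USERS)
    creds = urllib.parse.urlencode({"user": username, "password": password}).encode()
    opener.open(BASE_URL + "/login", data=creds, timeout=60).read()
    think()
    for term in random.sample(SEARCH_TERMS, k=2):
        opener.open(BASE_URL + "/search?" + urllib.parse.urlencode({"q": term}),
                    timeout=60).read()
        think()
    opener.open(BASE_URL + "/logout", timeout=60).read()
```

In a real scenario, many copies of such a script would be ramped up concurrently according to the scenario's user mix and pacing rules.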

Methodology - Preparation: Risk Assessment. The first step in building a successful performance test is a risk assessment that identifies the key areas of focus: Where do the greatest risks lie? Where should effort be focused? What skill sets and resources are required? How should risks be prioritized? What mitigation tactics will be employed? Example (Release 3.1 Performance Risk Assessment, November 18th, 2005) - technical risk by category: Batch: Data Movement 39.7%, Technology-Specific 23.1%, Batch: Data Volumes 15.4%, On-Line Performance 12.8%, Infrastructure 9.0%. Risk items rated included master batch schedule optimization, data input volumes, individual batch optimization, month-end close optimization, allocations & consolidations, PeopleSoft ERP performance, batch impact on the on-line experience, Informatica performance, ERP journal edit & post, portal performance, Teradata performance, PeopleSoft nVision reporting, web tier performance, infrastructure scalability, app tier performance, SuperGlue, Kalido performance, and Hyperion disk capacity. Coverage requirements were rated low/medium/high for on-line, batch, and scalability testing across Siebel, PeopleSoft ERP, PeopleSoft EPM, PeopleSoft Portal, PeopleSoft nVision Reports, Informatica, Teradata, ClearTrust (SSO), Hyperion Essbase, Hyperion Planning, Hyperion Analyzer, Hyperion Reports, and Hyperion Performance Suite, along with recommended resource requirements.

Methodology - Preparation: Risk Assessment - Technology-Specific Risk Analysis
PeopleSoft EPM Allocations and Consolidations (risk level 4): the process is batch intensive with some on-line activity; PeopleSoft-specific performance impacts must be measured under production conditions. Mitigation (effort 4): benchmark allocations and consolidations under production conditions, including batch impact analysis; will require automated scripting support. Primary resource: Technical Architect; supporting resources: Allocations SME (15%), EPM Technical Architect (20%).
Informatica (risk level 3): Informatica processing needs tuning/optimization (memory, cache performance, etc.); throughput varies from 250 to 10,000 rows/sec; staging table space requirements and optimal configuration must also be included in test coverage. Mitigation (effort 3): detailed performance monitoring and analysis of Informatica processing under production conditions. Primary resource: Technical Architect; supporting resources: Informatica Technical Architect (25%), Informatica DBA (10%).
Teradata (risk level 3): Teradata performance must be optimized under production conditions (expected data volumes, background batch processing, production data movement processes, etc.). Mitigation (effort 3): specific performance tuning cycles for Teradata under production conditions. Primary resource: Technical Architect; supporting resource: Teradata Technical Architect (25%).
PeopleSoft nVision Reporting (risk level 2): ad-hoc nVision queries and process scheduling can be difficult to optimize and must be included in performance test coverage. Mitigation (effort 2): performance test coverage of major nVision queries and process scheduling. Primary resource: Technical Architect; supporting resources: PeopleSoft Admin (10%), nVision Architect (15%).
Kalido (risk level 1): past issues with the Kalido data repository (batch process push to IBO) require continued coverage throughout the 2.2 performance testing cycles. Mitigation (effort 1): detailed performance monitoring and analysis of Kalido processing under production conditions. Primary resource: Technical Architect; supporting resource: Kalido Technical Architect (25%).
Total risk score: 13. Total mitigation score: 13.
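The risk score and mitigation score in this analysis are simply the sums of the per-item ratings (4 + 3 + 3 + 2 + 1 = 13 for each); a small sketch of that arithmetic:

```python
# Per-item (risk level, mitigation effort) ratings from the risk analysis above.
RATINGS = {
    "PeopleSoft EPM Allocations and Consolidations": (4, 4),
    "Informatica": (3, 3),
    "Teradata": (3, 3),
    "PeopleSoft nVision Reporting": (2, 2),
    "Kalido": (1, 1),
}

risk_score = sum(risk for risk, _ in RATINGS.values())           # 4+3+3+2+1 = 13
mitigation_score = sum(effort for _, effort in RATINGS.values())  # 4+3+3+2+1 = 13
```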

Methodology - Execution. Clearly communicate the test execution schedule, and assign and align all needed resources. Execute the test scenario with active monitoring (OS, database, CPU, memory, errors) and track errors in the application logs. Identify and log any errors or defects. Analyze and distribute the results, modify scripts as needed, and tune and optimize the system/application. Project wrap-up: delivery of results, analysis, and recommendations.
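A sketch of the kind of active OS-level monitoring mentioned above, assuming the third-party psutil package is available on the monitored host; the sampling interval and CSV output are illustrative choices, not the presentation's tooling.

```python
import csv
import time

import psutil  # third-party package; assumed available on the monitored host

def monitor(duration_seconds=300, interval_seconds=5, out_path="monitor.csv"):
    """Sample CPU and memory on the local host for the length of a test run."""
    with open(out_path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["timestamp", "cpu_percent", "memory_percent"])
        end = time.time() + duration_seconds
        while time.time() < end:
            writer.writerow([
                time.strftime("%Y-%m-%d %H:%M:%S"),
                psutil.cpu_percent(interval=interval_seconds),  # blocks for the interval
                psutil.virtual_memory().percent,
            ])

if __name__ == "__main__":
    monitor()
```

Running one such sampler per tier alongside the load generators gives the OS-side data needed to correlate response-time degradation with CPU or memory pressure.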

Performance Testing Best Practices. Plan for multiple types of performance test cycles (smoke, load, threshold, duration, failover, etc.). Budget time to fix and optimize the system after uncovering performance issues. Develop an isolated performance test environment that matches production as closely as possible. Proactively involve resources from the business, the development team, and support groups. Create open communication and a sense of teamwork with all parties involved. Formalize the final sign-off on performance testing with both business and technical sponsors.

EPM Performance Testing. What do we want to know? How long does it take to submit data? How long will a report take in a given location? What happens when many locations are submitting financials? How long does the consolidation process take?

Identify Performance Weaknesses

Measure Performance by Location. Chart: summarized key FDM transaction response times (in seconds, on a 0-160 scale) by location - Melbourne, Australia; Nanikon, Switzerland; Changzhou, China; Giessen, Germany; Columbus, Ohio; Tampa, Florida - broken down by transaction step (Login - Upstream, Import, Validate Data, Export, Check, Logoff).

Thank You! Questions? Rob Donahue rob.donahue@roltasolutions.com