What Is Specific in Load Testing?
Testing of multi-user applications under realistic and stress loads is really the only way to ensure appropriate performance and reliability in production. Load testing is emerging as an engineering discipline of its own, based on classic functional testing on one side and on system performance analysis and capacity planning on the other. The terminology in this field is still vague and the borders are fuzzy.

From a practical point of view, we can probably break down all kinds of testing into two big classes: with a multi-user load (load, performance, stress, volume, etc.) or without it (functional, regression, etc.). The different terms inside each of these classes specify why we are testing and what result we are looking for, rather than how we do it. There is no precise definition for each term; the exact meaning can differ from source to source. For example, performance testing could mean that we are more interested in response time; load testing could mean that we want to see the system behavior under a specified load; and stress testing could mean that we want to find the system break point. We still do the same things: apply a multi-user load and get some measurements. The difference is only in the details: what precise load we apply and which measurements are more important to us.

As all these terms reflect goals of testing, the two classes do not match the terms completely: we can do performance testing for one user, measuring response times with a stopwatch, or test the functionality of the system while having a 100-user load in the background. Moreover, performance is, in some sense, a part of functionality. Here we still refer to testing under a multi-user workload as load or performance testing and contrast it with classical one-user functional testing. Both classes are considered testing and have a lot in common. Still, load testing has significant specifics and requires some special approaches and skills.
Quite often, applying the best practices and metrics of functional testing to load testing results in disappointment. Many things that look trivial to an experienced performance engineer can escape the attention of a person less experienced in this field, which quite often results in unrealistic expectations, suboptimal test planning and design, and misleading results. Here we try to outline some issues to take into consideration for performance testing and point out typical pitfalls from a practical point of view. The list was chosen to contrast load and functional testing; the things they have in common are not discussed here. Although most of the recommendations are still valid for functional testing (as mentioned, even the border between them is somewhat fuzzy), they are much more important for performance testing. The selection is based on the extensive experience of Hyperion's Performance Engineering group, multiple discussions with experts in load testing, and the best papers and books about performance engineering.

The list is illustrated by the example of a Hyperion Analyzer project. Hyperion Analyzer is a powerful interactive query and reporting tool that allows analyzing sophisticated multidimensional and relational data in an easy-to-use graphical interface. There are four main components relevant to the project:
- The central part, the Analysis Server, is a standard JSP-based J2EE application. It runs inside an application server (Tomcat, WebLogic, or WebSphere).
- A relational repository that keeps all information (SQL Server, Oracle, or DB2).
- A data source (Hyperion Essbase, Hyperion Financial Management, or relational data).
- The Java client, a very powerful applet that does a lot of processing on the client side.

There are also other components (thin HTML client, Windows client, administrator tools, API). The project consisted of performance testing of Hyperion Analyzer against the multidimensional database Hyperion Essbase for a large financial company. WebSphere (including vertical and horizontal clustering) was used as the application server, and Microsoft SQL Server was used as the repository.

Workload Implementation

If you work with a new system (i.e. you have never run a load test against it before), the first question (as soon as you get some kind of testing requirements) is how you will create load. Are you going to generate it manually, use an available load-testing tool, or create a test harness? Manual testing can sometimes work if you want to simulate a small number of users, but even if well organized, it will probably introduce more variation into each test and make the tests less reproducible. Workload implementation using a tool (software or hardware) is quite straightforward when the system has a pure HTML interface, but even an applet on the client side can turn it into a serious research task, not to mention having to deal with proprietary protocols. Creating a test harness requires more knowledge about the system (for example, an API) and some programming. Each choice requires different skills, resources, and investments. Therefore, when starting a new load-testing project, the first thing to do is to decide how the workload will be implemented and check that this way really works.
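To make the test-harness option concrete, here is a minimal sketch of a home-grown multi-user harness. Everything in it is hypothetical: `fake_request` is a stub standing in for the system under test (a real harness would issue HTTP or API calls), and the `REPORT_OK` marker is an invented success token. It also previews the verification theme discussed next: each response body is checked for expected content, not just for the absence of exceptions.

```python
import threading
import time

def run_user(user_id, send_request, iterations, results, lock):
    """Simulate one virtual user: send requests, record timing and success."""
    for _ in range(iterations):
        start = time.perf_counter()
        body = send_request(user_id)
        elapsed = time.perf_counter() - start
        # Verify the content, not just "no exception": an error page can
        # come back looking like a perfectly normal response.
        ok = "REPORT_OK" in body
        with lock:
            results.append((user_id, elapsed, ok))

def run_load_test(send_request, users=5, iterations=3):
    """Run `users` concurrent virtual users and collect all samples."""
    results, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=run_user,
                         args=(u, send_request, iterations, results, lock))
        for u in range(users)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    errors = sum(1 for _, _, ok in results if not ok)
    return results, errors

# Stub standing in for the system under test; user 3 gets error responses.
def fake_request(user_id):
    return "REPORT_OK" if user_id != 3 else "Out of memory"

results, errors = run_load_test(fake_request, users=5, iterations=3)
print(len(results), errors)   # -> 15 3
```

Even this toy version shows why a harness needs programming effort up front: concurrency, timing, and error accounting all have to be built in before the first meaningful measurement.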
Once you have decided how to create the workload, you need to find a way to verify that the workload is really being applied.

Hyperion Analyzer was completely redesigned in version 6. Version 5 was a client-server application; version 6 is a standard J2EE three-tier application. The Analyzer Java client is a very powerful applet communicating with the Analyzer server over HTTP. The server responses came in the bodies of HTTP messages as serialized Java objects that included a session ID. The first problem was recording the communication: the applet starts in a separate window, and LoadRunner, recording the communication between Internet Explorer and the server, created almost empty scripts. Using multi-protocol recording solved that problem. Efforts to record and play back the HTTP communication still failed - it didn't work without session-ID parameterization. The existing API was too "fat" - the effort to build a test harness failed due to contention on the client side even with a limited number of users. Finally,
Mercury provided us with a utility to convert the script from C to Java, and after that, using Analyzer API calls, we de-serialized the object and extracted the session ID.

Workload verification

Unfortunately, a lack of error messages during a load test does not mean that the system works correctly. A very important part of load testing is workload verification: you should be sure that the applied workload is doing what it is supposed to do and that all errors are caught and logged. This can be done directly (analyzing server responses) or, when that is impossible, indirectly (for example, analyzing the application log for the existence of particular entries). Many tools provide some way to verify workload and check errors, but a complete understanding of what exactly is happening is necessary. For example, Mercury Interactive's LoadRunner by default reports only HTTP errors for Web scripts (like "500 Internal Server Error"). If you rely on the default diagnostics, you could still believe that everything is going well when you are getting out-of-memory errors instead of the requested reports. To catch such errors, you should add special commands to your script to check the content of the HTML pages returned by the server and enable such checks in the run-time options.

Since we got serialized Java objects from the Analyzer server, we had no easy way to check that each request worked correctly. De-serializing and analyzing each object would require a lot of work and put significant overhead on the client (LoadRunner). Finally, we ended up using a utility to analyze the Essbase log. It calculates the number of queries processed by Essbase; if this number matches what it should be, we can suppose that the workload works.

Unspecified requirements

Usually when people are talking about performance (load) testing, they do not make distinctions among tuning, diagnostics, or capacity planning.
Pure performance testing, just getting performance metrics, is unfortunately possible only in rare cases like benchmarking, when the system and all optimal settings are well known. Therefore, some tuning activity is necessary at the beginning of testing to be sure that the system is properly tuned and the results are meaningful. In most cases, if a performance (or reliability, etc.) problem is found, it should be diagnosed further, up to the point where it is clear how to handle it. Generally speaking, performance testing, tuning, diagnostics, and capacity planning are quite different processes, and leaving any of them out of the test plan (if they are really assumed) makes the plan unrealistic from the start.

The Analyzer project was formulated as a "quick performance check", but we have kept returning to it from time to time for almost two years now. The need to tune application server parameters, what-if analysis (especially concerning WebSphere clusters), and tracing down and fixing the problems found took much more time than the testing itself.

Exploring the System

At the beginning of a new project, it is good to run some tests to figure out how the system behaves before creating formal plans. If no performance tests have been run, there is no way to predict how many users the system can support or how each scenario will affect overall performance. Do we have any functional problems - can we do all
requested scenarios manually? Will we run into any problems with several users? Do we have enough computer resources to support the requested scenarios?

The customer did some performance testing in parallel (it was a quite big and sophisticated deployment). They called and told us that Analyzer hangs with database X. When we came onsite and ran just one user while monitoring all involved servers, we found that running a single report took two minutes and completely occupied two processors (of four) on the Essbase (database) server. Starting even 10 users completely clogged the system and created the impression that Analyzer hangs, but really the database was overloaded. The problem was bad database design, and any further load testing made no sense until the design was changed.

Time

Each performance test takes some time, often significantly more than a functional test. Generally, we are interested in the steady mode during load testing, i.e. when all requested users are in the system. That means we need time for all users to log in and then to work for a meaningful period, to be sure that we are in the steady mode. Moreover, some kinds of testing (reliability, for example) can require a significant amount of time: from several hours to several days. Therefore, the number of tests that can be run per day is limited. This is especially important during tuning or diagnostics, when the number of iterations is unknown.

The customer specified very complicated user scenarios; they wanted to include almost every one of their reports in the scenario. Even after significant simplification, a run took 2.5 hours. As soon as we started to tune WebSphere parameters, that became a real disaster, so we ended up with a simplified version to run during tuning.

Take a systematic approach to changes

The tuning (and often diagnostic) process consists of making changes in the system and evaluating their impact on performance (or on problems). It is very important to take a systematic approach to these changes.
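One lightweight way to keep such changes trackable is to log every tuning run together with a snapshot of the settings and the measured result. The sketch below is purely illustrative; the parameter names (`ws_threads`, `jvm_heap_mb`) and numbers are invented, not actual WebSphere settings from the project.

```python
# Hypothetical experiment log for one-change-at-a-time tuning runs.
experiments = []

def record_run(change, settings, avg_response_s):
    """Log a tuning run: what changed, the full settings, and the result."""
    experiments.append({
        "change": change,
        "settings": dict(settings),  # snapshot, so later edits don't alias it
        "avg_response_s": avg_response_s,
    })

settings = {"ws_threads": 50, "jvm_heap_mb": 512}
record_run("baseline", settings, 4.2)

settings["ws_threads"] = 100          # one change at a time
record_run("ws_threads 50->100", settings, 3.1)

settings["jvm_heap_mb"] = 1024        # next single change
record_run("jvm_heap_mb 512->1024", settings, 3.0)

best = min(experiments, key=lambda e: e["avg_response_s"])
print(best["change"])   # -> jvm_heap_mb 512->1024
```

Because every run stores a full settings snapshot, any configuration can be rolled back exactly, and the contribution of each individual change can be read straight from the log.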
It could be, for example, the traditional approach of one change at a time (often referred to as one factor at a time, OFAT) or the use of design of experiments (DOE) theory. One change at a time here does not mean changing only one variable; it can mean changing several related variables to check a particular hypothesis. The relationship between changes in system parameters and changes in product behavior is usually quite complex. Any assumption based on common sense can be wrong; a system's reaction can be quite paradoxical under heavy load. So changing several things at once without a systematic approach will not give you an understanding of how each change affects results, and it can mess up the testing process and lead to incorrect conclusions. Of course, all changes and their impacts should be logged to allow rollback and further analysis.

We needed to move the tests to another set of equipment (they were taking much more time than planned, and we needed to return the machines we were using). We also got advice from IBM about WebSphere tuning and a slightly modified repository from the customer. Due to urgency, we implemented everything on the new equipment at once and got much worse results. We spent a lot of
time backing out configuration settings and analyzing hardware performance until we figured out that the problem was due to the increased number of users in the repository. Making these three changes one at a time, although it would have taken a little longer, would finally have saved us significantly more time.

Data

The size and structure of data usually affect load test results dramatically. Using a small sample set of data for performance tests is an easy way to get misleading results, and there is no way to predict how data size affects performance before real testing. The closer the test data is to production data, the better (although testing with larger data sets makes a lot of sense too, to ensure that the system will still work when more data has been accumulated).

Running multiple users that all hit the same set of data is another easy way to get misleading results. That data, for example, can be completely cached, and you get much better results than in production; or it can cause concurrency issues, and you get much worse results than in production. Scripts and test harnesses usually should be parameterized so that each user uses a proper set of data ("proper" can differ depending on the test requirements). Another easy trap with data is adding new data during the tests without special care: each run then becomes a different test with a different amount of data in the system. The proper way to run such tests is to restore the system to its original state after each test (or to run additional tests proving that it does not matter in the particular case).

Further analysis of the problem with the increased number of users in the repository, using the SQL Server profiler, pinpointed a SQL statement making an outer join of three tables, where the number of records in each table was directly proportional to the number of users. With 200 users in the repository it worked fine; as soon as the number of users was increased to 800, this SQL statement became the main bottleneck.
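The parameterization advice above can be sketched as dealing the test data out so that no two virtual users share a record. The report names here are made up; in a real script the same round-robin idea would feed a parameter file or data table.

```python
# Hypothetical pool of test data: one record per report.
reports = [f"report_{i:03d}" for i in range(12)]

def data_for_user(user_id, users, data):
    """Deal the data round-robin so no two users share a record."""
    return [item for i, item in enumerate(data) if i % users == user_id]

users = 4
slices = [data_for_user(u, users, reports) for u in range(users)]

# Every record is used exactly once across all users: no accidental
# cache hits from shared data, no artificial concurrency on one record.
all_assigned = sorted(r for s in slices for r in s)
print(all_assigned == sorted(reports), [len(s) for s in slices])
# -> True [3, 3, 3, 3]
```

Whether disjoint data is actually "proper" depends on the test requirements; if production users do share hot records, the test should deliberately share them too.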
Process

Three specific features of load testing affect the testing process and often require closer work with development to fix problems than in functional testing. First, quite often a reliability or performance problem blocks further performance testing until the problem is fixed or a workaround is found. Second, the full setup as is (which is often very sophisticated) usually must be used to reproduce the problem, and keeping the full setup for a long time can be expensive or even impossible. Third, debugging performance problems is a quite sophisticated diagnostic process, usually requiring close collaboration between a performance engineer (running tests and analyzing the results) and a developer (profiling and altering code). Many tools (like debuggers) work fine in a one-user environment but do not work in a multi-user environment (for example, due to huge performance overheads). These three features make it difficult to use the asynchronous process usually followed in functional testing (testers look for bugs and log them into a defect-tracking system, and then the defects are prioritized and fixed by development independently). What is required is the synchronized work of performance engineering and development to fix the problems and complete performance testing.
Returning to the problem with the three-table outer join: we weren't able to test the full customer setup until the problem was fixed. A workaround was to remove most of the users so we could test the performance of other functionality. Although the problem was fully diagnosed, it took some time to communicate it; it was difficult to explain until we got it through to the development manager and the developer responsible for this code.

Combinatorial explosion

Even in functional testing we have an unlimited number of test cases, and the art of testing is to choose a limited set of test cases that checks product functionality in the best way within the given resource limitations. It is much worse in load testing: each user can follow a different scenario (a sequence of functional steps), and even how the steps of one user relate to the steps of another user during the test can affect results drastically. So generally it is the same art, but you have significantly more choices and significantly fewer resources to check them (see the note about time above). Load testing cannot be comprehensive, so you should choose the subset of tests, limited by your resources, that provides the maximal amount of information (the criteria can differ: typical scenarios, risk minimization, etc.).

We ended up with a few realistic scenarios (running existing reports and creating ad hoc reports) that covered a small part of the functionality but still allowed us to pinpoint performance problems and understand the performance limits. In parallel, we investigated separate performance problems reported by the customer.

Result analysis

Test results of load testing are usually difficult to interpret as passed/failed. Even if we do not need to tune or diagnose anything, we usually should consider not only transaction response times for all the different transactions (usually using aggregating metrics like average response time or the 90th percentile), but also other metrics like resource utilization.
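The aggregating metrics mentioned above are easy to compute, and comparing them shows why both are worth reporting. The response times below are invented sample data; the percentile uses the simple nearest-rank definition (tools may use slightly different interpolation).

```python
def average(samples):
    """Arithmetic mean of the response-time samples."""
    return sum(samples) / len(samples)

def percentile(samples, pct):
    """Nearest-rank percentile: the value at or below which pct% of samples fall."""
    ordered = sorted(samples)
    # nearest rank = ceil(pct/100 * n), done with integer arithmetic
    rank = max(1, -(-pct * len(ordered) // 100))
    return ordered[rank - 1]

# Hypothetical response times in seconds for one transaction; one request
# hit a problem and took far longer than the rest.
times = [1.2, 1.4, 1.1, 9.8, 1.3, 1.2, 1.5, 1.3, 1.4, 1.2]
print(round(average(times), 2), percentile(times, 90))   # -> 2.14 1.5
```

The single 9.8-second outlier drags the average up to 2.14 s, while the 90th percentile (1.5 s) still reflects what most users experienced; seeing both together is what flags the outlier worth investigating.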
Result analysis of load testing for enterprise-level systems can be quite difficult; it should be based on knowledge of the system and the requirements and involve all possible sources of information: measured metrics, results of monitoring during the test, all available logs, and profiling results (if available). For example, heavy load on the load generator machines can completely skew the results, and the only way to know that is to monitor those machines. There is always some variation in the results of multi-user tests due to minor differences in the test environment. If the difference is large, it is worth checking why and adjusting the tests accordingly - for example, restarting the program (or even rebooting the system) before each run to eliminate caching effects. If the system is overloaded, the difference can be very significant.

The three-table outer join problem was actually found during result analysis. We noticed that the pattern of load had changed: earlier, the main load was on the application server with 10-15% CPU utilization on the database server; then it became the other way around, with heavy load on the database server and moderate load on the application server. So attention moved to the database server, and the problem was soon pinpointed.

As mentioned in the discussion of workload verification, result analysis for Analyzer included analyzing the Essbase log to confirm that each query was really processed. Other data
included results from LoadRunner (with statistical information) and monitoring results from all tiers: the application server, the database server (repository), the HTTP server, and the Essbase server (data source), as well as the client machines running LoadRunner (we performed additional operations on these client machines, parsing and processing server responses, so we needed to keep an eye on them).

The above is not a set of instructions to follow literally, just some things to keep in mind while conducting performance testing. It is possible to cut corners in many particular cases, but it would probably be good to check your plan against the suggestions above and explain to yourself why you are doing things differently.
Business white paper. Load factor: performance testing for Web applications
Business white paper Load factor: performance testing for Web applications Table of contents 3 A look at load testing 3 In the pursuit of predictability 4 Put your apps through the paces 5 Traits of an
Performance and Load Testing For ArcGIS based systems Ian Sims and John Meza OVERVIEW What is Performance and Load Testing What is the objective Acceptance Testing Ongoing Development Areyoutheclient Want
Mohammed Khan SUMMARY
Mohammed Khan E-mail: [email protected] Phone: 347-878-1170 SUMMARY Over 5 years of diversified experience as a. Experience includes requirement analysis, manual testing and automation and quality
Tuning WebSphere Application Server ND 7.0. Royal Cyber Inc.
Tuning WebSphere Application Server ND 7.0 Royal Cyber Inc. JVM related problems Application server stops responding Server crash Hung process Out of memory condition Performance degradation Check if the
OpenLoad - Rapid Performance Optimization Tools & Techniques for CF Developers
OpenDemand Systems, Inc. OpenLoad - Rapid Performance Optimization Tools & Techniques for CF Developers Speed Application Development & Improve Performance November 11, 2003 True or False? Exposing common
Informatica Master Data Management Multi Domain Hub API: Performance and Scalability Diagnostics Checklist
Informatica Master Data Management Multi Domain Hub API: Performance and Scalability Diagnostics Checklist 2012 Informatica Corporation. No part of this document may be reproduced or transmitted in any
SolovatSoft. Load and Performance Test Plan Sample. Title: [include project s release name] Version: Date: SolovatSoft Page 1 of 13
SolovatSoft Load and Performance Test Plan Sample Title: [include project s release name] Version: Date: SolovatSoft Page 1 of 13 Approval signatures Project Manager Development QA Product Development
Dell One Identity Manager Scalability and Performance
Dell One Identity Manager Scalability and Performance Scale up and out to ensure simple, effective governance for users. Abstract For years, organizations have had to be able to support user communities
Development Best Practices
Development Best Practices 0 Toad Toad for Oracle v.9.6 Configurations for Oracle Standard Basic Toad Features + Team Coding + PL/SQL Profiler + PL/SQL Debugging + Knowledge Xpert PL/SQL and DBA Toad for
SQL Server 2012 Gives You More Advanced Features (Out-Of-The-Box)
SQL Server 2012 Gives You More Advanced Features (Out-Of-The-Box) SQL Server White Paper Published: January 2012 Applies to: SQL Server 2012 Summary: This paper explains the different ways in which databases
Whitepaper: performance of SqlBulkCopy
We SOLVE COMPLEX PROBLEMS of DATA MODELING and DEVELOP TOOLS and solutions to let business perform best through data analysis Whitepaper: performance of SqlBulkCopy This whitepaper provides an analysis
Open Source and Commercial Performance Testing Tools
Open Source and Commercial Performance Testing Tools Palla Vinod Kumar Accenture Delivery Center for Technology in India Accenture, its logo, and High Performance Delivered are trademarks of Accenture.
Ensuring Web Service Quality for Service-Oriented Architectures. An Oracle White Paper June 2008
Ensuring Web Service Quality for Service-Oriented Architectures An Oracle White Paper June 2008 Ensuring Web Service Quality for Service-Oriented Architectures WEB SERVICES OFFER NEW OPPORTUNITIES AND
Load Testing Analysis Services Gerhard Brückl
Load Testing Analysis Services Gerhard Brückl About Me Gerhard Brückl Working with Microsoft BI since 2006 Mainly focused on Analytics and Reporting Analysis Services / Reporting Services Power BI / O365
Performance Prediction, Sizing and Capacity Planning for Distributed E-Commerce Applications
Performance Prediction, Sizing and Capacity Planning for Distributed E-Commerce Applications by Samuel D. Kounev ([email protected]) Information Technology Transfer Office Abstract Modern e-commerce
Features of The Grinder 3
Table of contents 1 Capabilities of The Grinder...2 2 Open Source... 2 3 Standards... 2 4 The Grinder Architecture... 3 5 Console...3 6 Statistics, Reports, Charts...4 7 Script... 4 8 The Grinder Plug-ins...
An Oracle White Paper March 2013. Load Testing Best Practices for Oracle E- Business Suite using Oracle Application Testing Suite
An Oracle White Paper March 2013 Load Testing Best Practices for Oracle E- Business Suite using Oracle Application Testing Suite Executive Overview... 1 Introduction... 1 Oracle Load Testing Setup... 2
Performance Testing for BMC Remedy IT Service Management Suite
Test and Performance Platform Stress Testing Load Testing Capacity Test Soak Testing Scalability Testing Performance Testing Benchmarking Reliability Testing Performance Tuning Performance Optimization
http://support.oracle.com/
Oracle Primavera Contract Management 14.0 Sizing Guide October 2012 Legal Notices Oracle Primavera Oracle Primavera Contract Management 14.0 Sizing Guide Copyright 1997, 2012, Oracle and/or its affiliates.
Automate performance testing to predict system behavior and improve application performance. White paper
Automate performance testing to predict system behavior and improve application performance White paper Table of contents Abstract.........................................................................3
5Get rid of hackers and viruses for
Reprint from TechWorld /2007 TEChWoRLd ISSuE 2007 ThEBIG: 5 FIREWaLLS TEChWoRLd ISSuE 2007 ThEBIG: 5 FIREWaLLS TEChWoRLd ISSuE 2007 ThEBIG: 5 FIREWaLLS # # # Load balancing is basically a simple task where
Contents Introduction... 5 Deployment Considerations... 9 Deployment Architectures... 11
Oracle Primavera Contract Management 14.1 Sizing Guide July 2014 Contents Introduction... 5 Contract Management Database Server... 5 Requirements of the Contract Management Web and Application Servers...
Oracle Enterprise Manager 13c Cloud Control
Oracle Enterprise Manager 13c Cloud Control ORACLE DIAGNOSTICS PACK FOR ORACLE DATABASE lace holder for now] Oracle Enterprise Manager is Oracle s integrated enterprise IT management product line, and
APPLICATION PERFORMANCE TESTING IN A VIRTUAL ENVIRONMENT
APPLICATION PERFORMANCE TESTING IN A VIRTUAL ENVIRONMENT A REPORT FROM SAP CO-INNOVATION LAB : Joerg Nalik Shunra Software: Dave Berg HP Software: Mark Tomlinson 1.0 Table of Contents 1 Introduction Raising
An Oracle White Paper July 2011. Oracle Primavera Contract Management, Business Intelligence Publisher Edition-Sizing Guide
Oracle Primavera Contract Management, Business Intelligence Publisher Edition-Sizing Guide An Oracle White Paper July 2011 1 Disclaimer The following is intended to outline our general product direction.
HOW TO EVALUATE AND SELECT TOOL A HIGH-END LOAD TESTING. Marquis Harding Reality Test P R E S E N T A T I O N. Presentation. Bio
Presentation P R E S E N T A T I O N Bio E6 Thursday, March 8, 2001 11:30 AM HOW TO EVALUATE AND SELECT A HIGH-END LOAD TESTING TOOL Marquis Harding Reality Test International Conference On Software Test
SOFTWARE PERFORMANCE TESTING SERVICE
SOFTWARE PERFORMANCE TESTING SERVICE Service Definition GTS s performance testing services allows customers to reduce the risk of poor application performance. This is done by performance testing applications
PERFORMANCE TESTING. New Batches Info. We are ready to serve Latest Testing Trends, Are you ready to learn.?? START DATE : TIMINGS : DURATION :
PERFORMANCE TESTING We are ready to serve Latest Testing Trends, Are you ready to learn.?? New Batches Info START DATE : TIMINGS : DURATION : TYPE OF BATCH : FEE : FACULTY NAME : LAB TIMINGS : Performance
Application Discovery Manager User s Guide vcenter Application Discovery Manager 6.2.1
Application Discovery Manager User s Guide vcenter Application Discovery Manager 6.2.1 This document supports the version of each product listed and supports all subsequent versions until the document
The ROI of Test Automation
The ROI of Test Automation by Michael Kelly www.michaeldkelly.com Introduction With the exception of my first project team out of college, in every project team since, I ve had to explain either what automated
White paper: Unlocking the potential of load testing to maximise ROI and reduce risk.
White paper: Unlocking the potential of load testing to maximise ROI and reduce risk. Executive Summary Load testing can be used in a range of business scenarios to deliver numerous benefits. At its core,
APPLICATION MANAGEMENT SUITE FOR SIEBEL APPLICATIONS
APPLICATION MANAGEMENT SUITE FOR SIEBEL APPLICATIONS USER EXPERIENCE MANAGEMENT SERVICE LEVEL OBJECTIVE REAL USER MONITORING SYNTHETIC USER MONITORING SERVICE TEST KEY PERFORMANCE INDICATOR PERFORMANCE
JReport Server Deployment Scenarios
JReport Server Deployment Scenarios Contents Introduction... 3 JReport Architecture... 4 JReport Server Integrated with a Web Application... 5 Scenario 1: Single Java EE Server with a Single Instance of
Application and Web Load Testing. Datasheet. Plan Create Load Analyse Respond
Application and Web Load Testing Datasheet Plan Create Load Analyse Respond Product Overview JAR:load is an innovative web load testing solution delivered from the Cloud* for optimising the performance
Rapid Bottleneck Identification A Better Way to do Load Testing. An Oracle White Paper June 2009
Rapid Bottleneck Identification A Better Way to do Load Testing An Oracle White Paper June 2009 Rapid Bottleneck Identification A Better Way to do Load Testing. RBI combines a comprehensive understanding
Performance Testing. Slow data transfer rate may be inherent in hardware but can also result from software-related problems, such as:
Performance Testing Definition: Performance Testing Performance testing is the process of determining the speed or effectiveness of a computer, network, software program or device. This process can involve
White Paper. The Ten Features Your Web Application Monitoring Software Must Have. Executive Summary
White Paper The Ten Features Your Web Application Monitoring Software Must Have Executive Summary It s hard to find an important business application that doesn t have a web-based version available and
Transforming LoadRunner Data into Information and Action
2010-02-01 Transforming LoadRunner Data into Information and Action Introduction Today s online web applications need to deliver high efficiency and stability while supporting thousands of users simultaneously.
Oracle Hyperion Financial Management Virtualization Whitepaper
Oracle Hyperion Financial Management Virtualization Whitepaper Oracle Hyperion Financial Management Virtualization Whitepaper TABLE OF CONTENTS Overview... 3 Benefits... 4 HFM Virtualization testing...
Application Testing Suite Oracle Load Testing Introduction
Application Testing Suite Oracle Load Testing Introduction ATS Load Testing Workshop Bangalore, India September 24 / 25 2012 Yutaka Takatsu ATS Group Product Manager Oracle Enterprise Manager - ATS 1 Agenda
An Oracle White Paper February 2010. Rapid Bottleneck Identification - A Better Way to do Load Testing
An Oracle White Paper February 2010 Rapid Bottleneck Identification - A Better Way to do Load Testing Introduction You re ready to launch a critical Web application. Ensuring good application performance
Real Application Testing. Fred Louis Oracle Enterprise Architect
Real Application Testing Fred Louis Oracle Enterprise Architect The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated
Throughput Capacity Planning and Application Saturation
Throughput Capacity Planning and Application Saturation Alfred J. Barchi [email protected] http://www.ajbinc.net/ Introduction Applications have a tendency to be used more heavily by users over time, as the
Guideline for stresstest Page 1 of 6. Stress test
Guideline for stresstest Page 1 of 6 Stress test Objective: Show unacceptable problems with high parallel load. Crash, wrong processing, slow processing. Test Procedure: Run test cases with maximum number
Load Testing and Monitoring Web Applications in a Windows Environment
OpenDemand Systems, Inc. Load Testing and Monitoring Web Applications in a Windows Environment Introduction An often overlooked step in the development and deployment of Web applications on the Windows
