Performance Test Results Report for the Sled player
The Open University
Created: 17th April 2007
Author: Simon Hutchinson
Cross References

None

Revision History

Version | Date       | Author           | Reason
        | April 2007 | Simon Hutchinson | Initial Version

File: Sled Performance Testing Results.pdf
Table of Contents

Summary of results
Scope
    Purpose
    Testing Application
    Test Description
Performance Acceptance Criteria
    Introduction
    Performance Criteria
    Objectives
    Engagement Complete Criteria
System Architecture
Baseline Test
    Introduction
    Baseline Results
Benchmarking
    Introduction
    Benchmark Results (20 users)
Other scheduled test results
    Scheduled Tests
    User Experience Tests
    User Experience Test Results
        50 User load
        100 User load
        150 User load (1.5X the acceptance load)
        200 User load (2X the acceptance load)
    Stability Tests
        Stress test
        Stress Test Results
Conclusions and recommendations
    Consolidated Results
    Conclusions
Appendix
Summary of results

All criteria were met. The Performance Testing effort was completed on 14th April 2007. The Performance Tester found that, on the hardware and software configuration tested, a 100 concurrent user load produced an acceptable user experience within the guideline of a 7 second response time 95% of the time.
Scope

This document includes a summary of performance testing results and conclusions/recommendations for the Sled application. This document does not address functional testing, nor does it address detailed application tuning.

Purpose

The purpose of this document is to report on Sled's actual performance as compared to the acceptance criteria enumerated in the Performance Testing Strategy section below. Specifically, this document details the:

- Performance Acceptance Criteria
- Measurements gathered for the application
- Summary of tests performed
- Summary of measurements collected
- Results and conclusions
- Appendices for supporting data

Testing Application

The tool used for the performance testing was Open Systems Testing Architecture (OpenSTA). This tool was selected for a number of reasons, among which the most important were:

1. Freely available under the GNU GPL (General Public License).
2. The ability to record a web session.
3. APIs (Script Control Language) for fine-grained control over test scripts.
4. A full suite of test reports.
5. A community portal and mailing list.

Test Description

The test utilized the Developing Multimedia UoL, which was developed at, and is in use at, Liverpool Hope University. The UoL is at Learning Design Level B. A user session was recorded which completed each activity in weeks one and two of the UoL. A constant wait time of 30 seconds was then applied between each request. To ensure that responses were being returned accurately during the test, the script was amended to parse the response text for an expected string and to log a success or failure message based on its presence (a minimal sketch of this check is given at the end of this section).

For each test (see the individual test descriptions below) a number of virtual users were assigned at varying schedules. Each virtual user is a representation of a single user's browser session, complete with cookies for session handling. The virtual users run concurrently but are started at different times. Each virtual user completes a number of iterations of the scripted browser session.

The system was set up with a single run of the UoL and 250 users were assigned its Learner role. The JBoss server was restarted between tests.
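The response-text check and fixed think time described above can be illustrated with the following Python sketch. This is not the OpenSTA SCL script used in the tests; the URL, expected strings and transaction names are placeholders chosen for illustration only.

```python
import time
import urllib.request

# Illustrative only: placeholder URLs and expected strings, not the actual Sled deployment.
TRANSACTIONS = [
    ("LOGIN", "http://sled.example.ac.uk/sled/login", "Welcome"),
    ("LOGIN_COURSE", "http://sled.example.ac.uk/sled/course", "Developing Multimedia"),
]
THINK_TIME_SECONDS = 30  # constant wait applied between requests in the recorded session


def run_session(log):
    """Replay one scripted browser session, validating each response body."""
    for name, url, expected in TRANSACTIONS:
        start = time.time()
        with urllib.request.urlopen(url) as response:
            body = response.read().decode("utf-8", errors="replace")
        elapsed = time.time() - start
        # Log a success or failure message based on the presence of the expected string.
        status = "SUCCESS" if expected in body else "FAILURE"
        log.append((name, elapsed, status))
        time.sleep(THINK_TIME_SECONDS)  # fixed 30-second think time between requests


if __name__ == "__main__":
    results = []
    run_session(results)
    for name, elapsed, status in results:
        print(f"{name}: {elapsed:.2f}s {status}")
```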
Performance Acceptance Criteria

Introduction

Performance efforts always have two sets of criteria associated with them. The first are performance criteria (requirements and goals), and the second are engagement completion criteria. In the sections below, both types of criteria are explained in general and in specific detail for the Sled performance testing effort. The performance effort will be deemed complete when either all of the performance criteria are met, or any one of the engagement completion criteria is met.

Performance Criteria

Performance criteria are the specific target performance requirements and goals of the system under test. The preferred result is that the application meets all of these goals and requirements as it stands, or is tuned until they are met. If this is not possible, at least one of the engagement criteria from the next section must be met for overall performance acceptance.

Objectives

The objectives of the Performance Testing Effort are:

- To validate the scalability and operability of the technical architecture on a shared platform (up to 100 concurrent users).
- To validate system performance, specifically that all user actions that require a page or screen to be loaded or refreshed will be fully displayed in 7 seconds 95% of the time when accessed over a 10 Mb/s LAN while there is a 100 user load on the system.
- To validate that the system does not exhibit any critical failures under stress (unrealistic load).
- To identify and ensure that performance issues uncovered outside of the stated performance criteria are documented.

Engagement Complete Criteria

In cases where performance requirements or goals cannot be achieved due to situations outside of the control of the Performance Testing Team, the performance effort will be considered complete when any of the following conditions are met:

- All bottlenecks preventing the application from achieving the performance criteria are determined to be outside the Performance Tester's control.
- The Performance Tester and stakeholders agree that the application performs acceptably, although some performance requirements or goals have not been achieved.
System Architecture

A standard system architecture as detailed below has been used for all tests.

Test Client (running OpenSTA to simulate Virtual Users)

- Operating System: Windows XP Professional Service Pack 2 (build 2600)
- Processor: 3.00 GHz Intel Pentium 4, 8 KB primary memory cache, 512 KB secondary memory cache
- Main Circuit Board: Intel Corporation D865GLC AAC, Serial Number BTLC, Bus Clock 200 MHz, BIOS Intel Corp. BF86510A.86A.0069.P
- Memory: Megabytes Installed Memory
- Communications: Intel(R) PRO/100 VE Network Connection
- Relevant Software & Configuration: OpenSTA

Web server / App server

- Operating System: Windows Server 2003 Standard Edition Service Pack 1 (build 3790)
- Processor: 3.00 GHz Intel Pentium 4, 16 KB primary memory cache, 1024 KB secondary memory cache
- Main Circuit Board: Intel Corporation D865GLC AAC, Serial Number BTLC, Bus Clock 200 MHz, BIOS Intel Corp. BF86510A.86A.0071.P
- Memory: Megabytes Installed Memory
- Communications: Intel(R) PRO/100 VE Network Connection
- Java version: JRE 1.5.0_11-b03
- Application Server: JBoss GA (JVM options: -Xms128m -Xmx512m)
- Web server: Tomcat (version shipped with JBoss)
Baseline Test

Introduction

Baseline results represent each user activity being performed by a single user over multiple iterations. These baselines were used primarily to validate that the scripts had been developed correctly. All baselines were executed a minimum of 30 times, and all reported times are statistical calculations (averages) over those 30 iterations. The user wait time (the time between user interactions with the system) was exactly 30 seconds to ensure that baseline runs were identical. The statistics reported in the tables that follow (average, standard deviation and 95th percentile) can be computed as in the sketch below.
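As an illustration of how the reported statistics are derived from raw transaction timings, the following Python sketch computes the average, standard deviation and 95th percentile (nearest-rank) for a set of sample times. The sample values are invented for illustration and are not taken from the test data.

```python
import math
import statistics


def summarise(times):
    """Return (average, standard deviation, 95th percentile) for response times in seconds."""
    ordered = sorted(times)
    average = statistics.mean(ordered)
    std_dev = statistics.stdev(ordered)  # sample standard deviation
    # Nearest-rank 95th percentile: smallest value covering 95% of the samples.
    index = max(0, math.ceil(0.95 * len(ordered)) - 1)
    percentile_95 = ordered[index]
    return average, std_dev, percentile_95


if __name__ == "__main__":
    # Invented sample of 30 LOGIN timings, one per baseline iteration.
    login_times = [0.09, 0.10, 0.11, 0.10, 0.12, 0.09, 0.10, 0.11, 0.10, 0.13,
                   0.10, 0.09, 0.11, 0.10, 0.12, 0.10, 0.11, 0.09, 0.10, 0.11,
                   0.10, 0.12, 0.09, 0.10, 0.11, 0.10, 0.13, 0.10, 0.09, 0.11]
    avg, sd, p95 = summarise(login_times)
    print(f"LOGIN: avg={avg:.2f}s std_dev={sd:.2f}s 95th={p95:.2f}s")
```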
Baseline Results

Table 1: Baseline Results. Average, standard deviation and 95th percentile times (seconds) for each reported transaction: LOGIN, LOGIN_COURSE, ACTIVITY_MESSAGE_FACILITY, ACTIVITY_MESSAGE, SET_PROPERTY_SEND_MESSAGE, COMPLETE_ACTIVITY_VIEW_SAMPLE_MM_PROJ, ACTIVITY_UPLOAD_ACTIVITY, QTI_POST_SELECT_TITLE, ACTIVITY_COMPLETE_AND_UPLOAD_DIR_ACT.

Figure 1: Baseline chart, 95th percentile time per transaction.
Benchmarking

Introduction

A benchmark, or light load, scenario is generally a small community of users compared to the target load. This community of users must be large enough to represent a reasonable sample of the entire user community. Executing these tests ensured that the testing environment behaved as expected under light load before more demanding testing began. Additionally, the results of these tests are used as a benchmark to compare with all future test results. Performance results obtained under the benchmark load should meet or exceed all indicated performance requirements; otherwise tuning must begin with the benchmark load. Assuming no performance problems are noticed during this scenario, the results obtained can be used as best-case results. These results indicate how the system performs when it is not under noticeable stress, but is still performing the required functions, thus allowing conclusions to be drawn about the performance of the system during higher load tests.

Sled will be benchmarked in the environments described above. This benchmark is intended to provide a basis of comparison for future testing. Tuning may occur during the benchmarking effort if critical bottlenecks are detected. Sled can then be re-benchmarked each time an iteration of either tuning or development has been completed on a module. This ensures that there is always a known valid point of comparison for all scheduled tests.

The benchmark load will be 20 users, entering the system over a 30 minute period and performing the tasks outlined in the Test Description for a further hour, i.e. a total test time of 90 minutes. A sketch of this ramp-up schedule appears after the note below.

Note: A subset of the transactions completed within the UoL has been chosen for reporting purposes and indicates a good spread of the functionality. The appendices contain data for all transactions.
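The staggered start of virtual users over the 30-minute ramp-up can be illustrated as follows. This is a simplified sketch rather than the OpenSTA scheduler configuration; the even spacing of start times is an assumption, while the user count and durations follow the figures given above.

```python
def start_offsets(total_users: int, ramp_up_minutes: float) -> list[float]:
    """Evenly space virtual-user start times (in minutes) across the ramp-up period."""
    interval = ramp_up_minutes / total_users
    return [round(i * interval, 2) for i in range(total_users)]


if __name__ == "__main__":
    # Benchmark scenario: 20 users entering over 30 minutes, running for a further hour.
    offsets = start_offsets(total_users=20, ramp_up_minutes=30)
    test_duration_minutes = 90
    for user_id, offset in enumerate(offsets, start=1):
        print(f"Virtual user {user_id:02d} starts at {offset:5.2f} min, "
              f"runs until {test_duration_minutes} min")
```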
Benchmark Results (20 users)

Table 2: Benchmark Results (20 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 2: Benchmark Chart (20 users), 95th percentile time (seconds) per transaction.
Other scheduled test results

Scheduled Tests

The Execute Scheduled Tests aspect includes those activities that are mandatory to validate the performance of the system. They are:

- Execute User Experience Tests
- Execute Stability Tests

User Experience Tests

User Experience Tests constitute what are considered to be expected real-world loads, from best case to worst case. Applying less than the expected worst-case load is useful in identifying major failings in a system, but does so in a way that does not highlight many of the more minor failings, allowing an easier analysis of results. When the load is equivalent to the expected real-world worst-case load, the actual performance of the system can be measured and associated problems can be clearly identified. These tests were designed to validate that the performance goals and requirements have been met. The results reported here represent the actual performance of the system upon conclusion of the Performance Testing effort. The system was tested under loads of 50, 100, 150 and 200 virtual users.

User Experience Test Results

Virtual users were gradually released into the system over a 30 minute period. Once the ramp-up period was completed, each scenario iterated several times for a total of an hour of relatively consistent load. User think times (the time between user interactions with the system) were 30 seconds. The page load times were measured in the same manner as they were under the Benchmark scenario to ensure consistency and validity between tests. Average times and 95th percentile times have been reported, as well as standard deviations. A sketch of how measured times are checked against the 7 second, 95% acceptance criterion is given after the note below.

Note: A subset of the transactions completed within the UoL has been chosen for reporting purposes and indicates a good spread of the functionality. The appendices contain data for all transactions.
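A simple way to check a set of measured page-load times against the acceptance criterion (95% of responses fully displayed within 7 seconds) is sketched below. The sample data is invented for illustration and does not come from the recorded results.

```python
def meets_criterion(times_seconds, threshold=7.0, required_fraction=0.95):
    """Return True if at least the required fraction of response times is within the threshold."""
    within = sum(1 for t in times_seconds if t <= threshold)
    return within / len(times_seconds) >= required_fraction


if __name__ == "__main__":
    # Invented sample of page-load times (seconds) under a 100-user load.
    sample = [0.4, 0.5, 0.8, 1.0, 0.9, 0.4, 6.5, 0.7, 0.6, 0.5]
    print("Acceptance criterion met:", meets_criterion(sample))
```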
50 User load

Table 3: Experience Results (50 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 3: Experience Chart (50 users), 95th percentile time (seconds) per transaction.
100 User load

Table 4: Experience Results (100 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 4: Experience Chart (100 users), 95th percentile time (seconds) per transaction. Legible values include LOGIN_COURSE 0.5, COMPLETE_ACTIVITY_VIEW_SAMPLE_MM_PROJ 0.8, ACTIVITY_MESSAGE_FACILITY 0.9, QTI_POST_SELECT_TITLE 0.9, ACTIVITY_UPLOAD_ACTIVITY 1.0, ACTIVITY_COMPLETE_AND_UPLOAD_DIR_ACT 0.4.
150 User load (1.5X the acceptance load)

Table 5: Experience Results (150 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 5: Experience Chart (150 users), 95th percentile time (seconds) per transaction. Legible values include LOGIN_COURSE 2.4, ACTIVITY_MESSAGE_FACILITY 2.6, ACTIVITY_MESSAGE 2.8, SET_PROPERTY_SEND_MESSAGE 3.0, ACTIVITY_COMPLETE_AND_UPLOAD_DIR_ACT 3.1, COMPLETE_ACTIVITY_VIEW_SAMPLE_MM_PROJ 3.6.
200 User load (2X the acceptance load)

Table 6: Experience Results (200 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 6: Experience Chart (200 users), 95th percentile time (seconds) per transaction. Legible values include LOGIN_COURSE 44.1, ACTIVITY_MESSAGE 48.3, ACTIVITY_MESSAGE_FACILITY 51.6.
Stability Tests

Stability scenarios test a system at and beyond the worst demand it is expected to face. The majority of critical deficiencies in the system will have already been identified during the execution of load tests, so this phase deals more with assessing the impact on performance and functionality under a heavy or unreasonable load. Stability scenarios will also identify many other system bottlenecks not previously noticed, which may in fact be partially responsible for problems already identified.

Heavy load scenarios are generally designed to be far more than a system can handle. They are used not to identify whether a system fails, but where it fails first, how badly, and why. By answering the "why" question, it can be determined whether a system is as stable as it needs to be.

Stress test

Stress tests use real-world distributions and user communities, but under extreme conditions. It is common to execute stress tests at 150% of the expected peak user load, sustained over 12 hours, with normal ramp-up and ramp-down time (users entering over a 30 minute period).
Stress Test Results

Table 7: Stress Results (150 users). Average, standard deviation and 95th percentile times (seconds) for the same set of reported transactions as Table 1.

Figure 7: Stress Chart (150 users), 95th percentile time (seconds) per transaction. Legible values include ACTIVITY_MESSAGE 19.5, ACTIVITY_COMPLETE_AND_UPLOAD_DIR_ACT 21.5, QTI_POST_SELECT_TITLE 24.0.
Figure 8: Stress Chart, average response time (ms) by hour into the test (150 users).
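Figure 8 groups the raw response times into hourly buckets and reports the average for each hour of the 12 hour stress run. A minimal sketch of that aggregation, with invented sample data, is shown below.

```python
from collections import defaultdict


def average_by_hour(samples):
    """samples: list of (seconds_into_test, response_time_ms) tuples.
    Returns {hour_index: average_response_time_ms}."""
    buckets = defaultdict(list)
    for elapsed, response_ms in samples:
        buckets[int(elapsed // 3600)].append(response_ms)
    return {hour: sum(values) / len(values) for hour, values in sorted(buckets.items())}


if __name__ == "__main__":
    # Invented samples: (seconds into the test, response time in ms).
    samples = [(120, 450), (1900, 500), (4000, 620), (7300, 800), (11000, 750)]
    for hour, avg_ms in average_by_hour(samples).items():
        print(f"Hour {hour}: {avg_ms:.0f} ms")
```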
Conclusions and recommendations

Consolidated Results

Summary Comparison (the derivation of the % Times > Goal row is sketched after the conclusions):

Statistic / Concurrent Users  | 1  | 50 | 100 | 150 | 200
Times Recorded                |    |    |     |     |
Times > Goal                  |    |    |     |     |
% Times > Goal                | 0% | 0% | 0%  | 0%  | 51.40%
Typical Average Time          |    |    |     |     |
Typical 95th Percentile Time  |    |    |     |     |

Conclusions

The goal of a response time under 5 seconds for a target load of 100 users was met 100% of the time. The goal was also met 100% of the time at 1.5X the target load, i.e. 150 users. Even at twice the target load, close to 50% of the response times were within the specified goal duration. The results for 1, 50, 100 and 150 users show performance degrading within tight limits as heavier user loads are added. The 200 user load appears to have reached the limits of the test hardware in terms of response time; however, no exceptions were produced during the test. Based on the goals set in the performance acceptance criteria, the system is ready for production use.
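The % Times > Goal row of the summary comparison can be derived from the raw timings as in the sketch below. The per-load samples are invented placeholders, not the recorded data, and the 5 second goal follows the figure quoted in the conclusions.

```python
def percent_over_goal(times_seconds, goal_seconds):
    """Percentage of recorded response times that exceeded the goal."""
    over = sum(1 for t in times_seconds if t > goal_seconds)
    return 100.0 * over / len(times_seconds)


if __name__ == "__main__":
    goal = 5.0  # seconds; goal threshold quoted in the conclusions for this comparison
    # Invented samples keyed by concurrent-user load.
    recorded = {
        100: [0.4, 0.8, 1.0, 0.9, 0.5],
        200: [44.1, 3.2, 51.6, 2.1, 48.3, 1.9],
    }
    for load, times in recorded.items():
        print(f"{load} users: {percent_over_goal(times, goal):.2f}% of times > goal")
```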
Appendix

Due to the amount of data generated by these tests, the data is not reproduced here but can be downloaded in full at: